(2 years, 7 months ago)
Commons Chamber
This text is a record of ministerial contributions to a debate held during the Online Safety Act 2023's passage through Parliament.
In 1993, the House of Lords decision in Pepper v Hart established that statements made by Government Ministers may be taken as indicative of legislative intent when interpreting the law.
This extract highlights statements made by Government Ministers along with contextual remarks by other Members. The full debate can be read here.
This information is provided by Parallel Parliament and does not form part of the official record.
I beg to move, That the Bill be now read a Second time.
Given the time and the number of people indicating that they wish to speak, and given that we will have my speech, the shadow Minister’s speech and the two winding-up speeches, there might be 10 minutes left for people to speak. I will therefore take only a couple of interventions and speak very fast in the way I can, being northern.
Almost every aspect of our lives is now conducted via the internet, from work and shopping to keeping up with our friends, family and worldwide real-time news. Via our smartphones and tablets, we spend ever more of our lives online rather than in the real world.
In the past 20 years or so, it is fair to say that the internet has overwhelmingly been a force for good, for prosperity and for progress, but Members on both sides of the House will agree that, as technology advances at warp speed, so do the new dangers that this progress presents to children and young people.
My right hon. Friend will know that, last Wednesday, the man who murdered our great friend Sir David Amess was sentenced to a whole-life term. David felt very strongly that we need legislation to protect MPs, particularly female MPs, from vile misogynistic abuse. In his memory, will she assure me that her Bill will honour the spirit of that request?
Sir David was a friend to all of us, and he was very much at the forefront of my mind during the redrafting of this Bill over the last few months. I give my right hon. Friend my absolute assurance on that.
A number of constituents have contacted me over the last few months about eating disorders, particularly anorexia and bulimia, and about bullying in schools. Will the Secretary of State assure me and this House that those concerns will be addressed by this Bill so that my constituents are protected?
They will. Inciting people to take their own life or encouraging eating disorders in anorexia chatrooms—all these issues are covered by the Bill.
I am grateful to my right hon. Friend, and I thank her for her written communications regarding Angela Stevens, the mother of Brett, who tragically took his own life having been coerced by some of these vile online sites. The Law Commission considered harmful online communications as part of the Bill’s preparation, and one of its recommendations is to introduce a new offence of encouraging or assisting self-harm. I strongly urge my right hon. Friend to adopt that recommendation. Can she say more on that?
Yes. Exactly those issues will be listed in secondary legislation, under “legal but harmful”. I will talk about that further in my speech, but “legal but harmful” focuses on some of the worst harmful behaviours. We are talking not about an arbitrary list, but about inciting people to take their own life and encouraging them into suicide chatrooms—behaviour that is not illegal but is indeed harmful.
I am going to whizz through my speech now in order to allow people who have stayed and want to speak to do so.
As the Minister for mental health for two years, too often, I heard stories such as the one just highlighted by my hon. Friend the Member for Stoke-on-Trent North (Jonathan Gullis). We have all sat down with constituents and listened as the worst stories any parent could recount were retold: stories of how 14-year-old girls take their own lives after being directed via harmful algorithms into a suicide chatroom; and of how a child has been bombarded with pro-anorexia content, or posts encouraging self-harm or cyber-bullying.
School bullying used to stop at the school gate. Today, it accompanies a child home, on their mobile phone, and is lurking in the bedroom waiting when they switch on their computer. It is the last thing a bullied child reads at night before they sleep and the first thing they see when they wake in the morning. A bullied child is no longer bullied in the playground on school days; they are bullied 24 hours a day, seven days a week. Childhood innocence is being stolen at the click of a button. One extremely worrying figure from 2020 showed that 80% of 12 to 15-year-olds had at least one potentially harmful online experience in the previous year.
We also see this every time a footballer steps on to the pitch, only to be subjected to horrific racism online, including banana and monkey emojis. As any female MP in this House will tell you, a woman on social media—I say this from experience—faces a daily barrage of toxic abuse. It is not criticism—criticism is fair game—but horrific harassment and serious threats of violence. Trolls post that they hope we get raped or killed, urge us to put a rope around our neck, or say they want to watch us burn alive in a car—my own particular experience.
All this behaviour is either illegal or, almost without exception, explicitly banned in a platform’s terms and conditions. Commercially, it has to be. If a platform stated openly that it allowed such content on its sites, which advertisers, its financial lifeblood, would knowingly endorse and advertise on it? Which advertisers would do that? Who would openly use or allow their children to use sites that state that they allow illegal and harmful activity? None, I would suggest, and platforms know that. Yet we have almost come to accept this kind of toxic behaviour and abuse as part and parcel of online life. We have factored online abuse and harm into our daily way of life, but it should not and does not have to be this way.
This Government promised in their manifesto to pass legislation to tackle these problems and to make the UK the
“safest place in the world to be online”
especially for children. We promised legislation that would hold social media platforms to the promises they have made to their own users—their own stated terms and conditions—promises that too often are broken with no repercussions. We promised legislation that would bring some fundamental accountability to the online world. That legislation is here in the form of the groundbreaking Online Safety Bill. We are leading the way, and free democracies across the globe are watching carefully to see how we progress this legislation.
The Bill has our children’s future, their unhindered development and their wellbeing at its heart, while at the same time providing enhanced protections for freedom of speech. At this point, I wish to pay tribute to my predecessors, who have each trodden the difficult path of balancing freedom of speech and addressing widespread harms, including my immediate predecessor and, in particular, my hon. Friend the Member for Gosport (Dame Caroline Dinenage), who worked so hard, prior to my arrival in the Department for Digital, Culture, Media and Sport, with stakeholders and platforms, digging in to identify the scope of the problem.
Let me summarise the scope of the Bill. We have reserved our strongest measures in this legislation for children. For the first time, platforms will be required under law to protect children and young people from all sorts of harm, from the most abhorrent child abuse to cyber-bullying and pornography. Tech companies will be expected to use every possible tool to do so, including introducing age-assurance technologies, and they will face severe consequences if they fail in the most fundamental of requirements to protect children. The bottom line is that, by our passing this legislation, our youngest members of society will be far safer when logging on. I am so glad to see James Okulaja and Alex Holmes from The Diana Award here today, watching from the Gallery as we debate this groundbreaking legislation. We have worked closely with them as we have developed the legislation, as they have dedicated a huge amount of their time to protecting children from online harms. This Bill is for them and those children.
The second part of the Bill makes sure that platforms design their services to prevent them from being abused by criminals. When illegal content does slip through the net, such as child sex abuse and terrorist content, they will need to have effective systems and processes in place to quickly identify it and remove it from their sites. We will not allow the web to be a hiding place or a safe space for criminals. The third part seeks to force the largest social media platforms to enforce their own bans on racism, misogyny, antisemitism, pile-ons and all sorts of other unacceptable behaviour that they claim not to allow but that ruins lives in practice. In other words, we are simply asking the largest platforms to do what they say they will do, as we do in all good consumer protection measures in any other industry. If platforms fail in any of those basic responsibilities, Ofcom will be empowered to pursue a range of actions against them, depending on the situation, and, if necessary, to bring down the full weight of the law upon them.
I will take just two more interventions and that will be it, otherwise people will not have a chance to speak.
I am very grateful to my right hon. Friend for giving way. The internet giants that run the kind of awful practices that she has described have for too long been unaccountable, uncaring and unconscionable in the way they have fuelled every kind of spite and fed every kind of bigotry. Will she go further in this Bill and ensure that, rather like any other publisher, if those companies are prepared to allow anonymous posts, they are held accountable for those posts and subject to the legal constraints that a broadcaster or newspaper would face?
These online giants will be held accountable to their own terms and conditions. They will no longer be able to allow illegal content to be published, and we will also be listing in secondary legislation the categories of content that are legal but harmful. We will be holding those tech giants to account.
I thank the Secretary of State for giving way. She talked about how this Bill is going to protect children much more, and it is a welcome step forward. However, does she accept that there are major gaps in this Bill? For instance, gaming is not covered. It is not clear whether things such as virtual reality and the metaverse are going to be covered. [Interruption.] It is not clear, and all the experts will tell us that. The codes of practice in the Bill are only recommended guidance; they are not mandatory and binding on companies. That will encourage a race to the bottom.
The duties are mandatory; it is the Online Safety Bill and the metaverse is included in the Bill. Not only is it included, but, moving forward, the provisions in the Bill will allow us to move swiftly with the metaverse and other things. We did not even know that TikTok existed when this Bill started its journey. These provisions will allow us to move quickly to respond.
I am grateful to my right hon. Friend for giving way. One of the most important national assets that needs protecting in this Bill and elsewhere is our reputation for serious journalism. Will she therefore confirm that, as she has said outside this House, she intends to table amendments during the passage of the Bill that will ensure that platforms and search engines that have strategic market status protect access to journalism and content from recognised news publishers, ensuring that it is not moderated, restricted or removed without notice or right of appeal, and that those news websites will be outside the scope of the Bill?
Will my right hon. Friend give way?
No, I have to continue.
Not only will the Bill protect journalistic content, democratic content and democratic free speech, but if one of the tech companies wanted to take down journalistic content, the Bill includes a right of appeal for journalists, which currently does not exist. We are doing further work on that to ensure that content remains online while the appeal takes place. The appeal process has to be robust and consistent across the board for all the appeals that take place. We have already done more work on that issue in this version of the Bill and we are looking to do more as we move forward.
As I have said, we will not allow the web to be a hiding place or safe space for criminals and when illegal content does slip through the net—such as child sex abuse and terrorist content— online platforms will need to have in place effective systems and processes to quickly identify that illegal content and remove it from their sites.
The third measure will force the largest social media platforms to enforce their own bans on racism, misogyny, antisemitism, pile-ons and all the other unacceptable behaviours. In other words, we are asking the largest platforms to do what they say they will do, just as happens with all good consumer-protection measures in any other industry. Should platforms fail in any of their basic responsibilities, Ofcom will be empowered to pursue a range of actions against them, depending on the situation, and, if necessary, to bring down upon them the full weight of the law. Such action includes searching platforms’ premises and confiscating their equipment; imposing huge fines of up to 10% of their global turnover; pursuing criminal sanctions against senior managers who fail to co-operate; and, if necessary, blocking their sites in the UK.
We know that tech companies can act very quickly when they want to. Last year, when an investigation revealed that Pornhub allowed child sexual exploitation and abuse imagery to be uploaded to its platform, Mastercard and Visa blocked the use of their cards on the site. Lo and behold, threatened with the prospect of losing a huge chunk of its profit, Pornhub suddenly removed nearly 10 million child sexual exploitation videos from its site overnight. These companies have the tools but, unfortunately, as they have shown time and again, they need to be forced to use them. That is exactly what the Bill will do.
Before I move on, let me point out something very important: this is not the same Bill as the one published in draft form last year. I know that Members throughout the House are as passionate as I am about getting this legislation right, and I had lots of constructive feedback on the draft version of the Bill. I have listened carefully to all that Members have had to say throughout the Bill’s process, including by taking into account the detailed feedback from the Joint Committee, the Digital, Culture, Media and Sport Committee and the Petitions Committee. They have spent many hours considering every part of the Bill, and I am extremely grateful for their dedication and thorough recommendations on how the legislation could be improved.
As a result of that feedback process, over the past three months or so I have strengthened the legislation in a number of important ways. There were calls for cyber-flashing to be included; cyber-flashing is now in the Bill. There were calls to ensure that the legislation covered all commercial pornography sites; in fact, we have expanded the Bill’s scope to include every kind of provider of pornography. There were concerns about anonymity, so we have strengthened the Bill so that it now requires the biggest tech platforms to offer verification and empowerment tools for adult users, allowing people to block anonymous trolls from the beginning.
I know that countless MPs are deeply concerned about how online fraud—particularly scam ads—has proliferated over the past few years. Under the new version of the Bill, the largest and highest-risk companies—those that stand to make the most profit—must tackle scam ads that appear on their services.
We have expanded the list of priority offences named on the face of the legislation to include not just terrorism and child abuse imagery but revenge porn, fraud, hate crime, encouraging and assisting suicide, and organised immigration crime, among other offences.
If anyone doubted our appetite to go after Silicon Valley executives who do not co-operate with Ofcom, they will see that we have strengthened the Bill so that the criminal sanctions for senior managers will now come into effect as soon as possible after Royal Assent— I am talking weeks, not years. We have expanded the things for which those senior managers will be criminally liable to cover falsifying data, destroying data and obstructing Ofcom’s access to their premises.
In addition to the regulatory framework in the Bill that I have described, we are creating three new criminal offences. While the regulatory framework is focused on holding companies to account, the criminal offences will be focused on individuals and the way people use and abuse online communications. Recommended by the Law Commission, the offences will address coercive and controlling behaviour by domestic abusers; threats to rape, kill or inflict other physical violence; and the sharing of dangerous disinformation deliberately to inflict harm.
This is a new, stronger Online Safety Bill. It is the most important piece of legislation that I have ever worked on and it has been a huge team effort to get here. I am confident that we have produced something that will protect children and the most vulnerable members of society while being flexible and adaptable enough to meet the challenges of the future.
Let me make something clear in relation to freedom of speech. Anyone who has actually read the Bill will recognise that its defining focus is the tackling of serious harm, not the curtailing of free speech or the prevention of adults from being upset or offended by something they have seen online. In fact, along with countless others throughout the House, I am seriously concerned about the power that big tech has amassed over the past two decades and the huge influence that Silicon Valley now wields over public debate.
We in this place are not the arbiters of free speech. We have left it to unelected tech executives on the west coast to police themselves. They decide who is and who is not allowed on the internet. They decide whose voice should be heard and whose should be silenced—whose content is allowed up and what should be taken down. Too often, their decisions are arbitrary and inconsistent. We are left, then, with a situation in which the president of the United States can be banned by Twitter while the Taliban is not; in which talkRADIO can be banned by YouTube for 12 hours; in which an Oxford academic, Carl Heneghan, can be banned by Twitter; or in which an article in The Mail on Sunday can be plastered with a “fake news” label—all because they dared to challenge the west coast consensus or to express opinions that Silicon Valley does not like.
It is, then, vital that the Bill contains strong protections for free speech and for journalistic content. For the first time, under this legislation all users will have an official right to appeal if they feel their content has been unfairly removed. Platforms will have to explain themselves properly if they remove content and will have special new duties to protect journalistic content and democratically important content. They will have to keep those new duties in mind whenever they set their terms and conditions or moderate any content on their sites. I emphasise that the protections are new. The new criminal offences update section 1 of the Malicious Communications Act 1988 and section 127 of the Communications Act 2003, which were so broad that they interfered with free speech while failing to address seriously harmful consequences.
Without the Bill, social media companies would be free to continue to arbitrarily silence or cancel those with whom they do not agree, without any need for explanation or justification. That situation should be intolerable for anyone who values free speech. For those who quite obviously have not read the Bill and say that it concedes power to big tech companies, I have this to say: those big tech companies have all the power in the world that they could possibly want, right now. How much more power could we possibly concede?
That brings me to my final point. We now face two clear options. We could choose not to act and leave big tech to continue to regulate itself and mark its own homework, as it has been doing for years with predictable results. We have already seen that too often, without the right incentives, tech companies will not do what is needed to protect their users. Too often, their claims about taking steps to fix things are not backed up by genuine actions.
I can give countless examples from the past two months alone of tech not taking online harm and abuse seriously, wilfully promoting harmful algorithms or putting profit before people. A recent BBC investigation showed that women’s intimate pictures were being shared across the platform Telegram to harass, shame and blackmail women. The BBC reported 100 images to Telegram as pornography, but 96 were still accessible a month later. Tech did not act.
Twitter took six days to suspend the account of rapper Wiley after his disgusting two-day antisemitic rant. Just last week, the Centre for Countering Digital Hate said that it had reported 253 accounts to Instagram as part of an investigation into misogynistic abuse on the platform, but almost 90% remained active a month later. Again, tech did not act.
Remember: we have been debating these issues for years. They were the subject of one of my first meetings in this place in 2005. During that time, things have got worse, not better. If we choose the path of inaction, it will be on us to explain to our constituents why we did nothing to protect their children from preventable risks, such as grooming, pornography, suicide content or cyber-bullying. To those who say protecting children is the responsibility of parents, not the job of the state, I would quote the 19th-century philosopher John Stuart Mill, one of the staunchest defenders of individual freedom. He wrote in “On Liberty” that the role of the state was to fulfil the responsibility of the parent in order to protect a child where a parent could not. If we choose not to act, in the years to come we will no doubt ask ourselves why we did not act to impose fundamental online protections.
However, we have another option. We can pass this Bill and take huge steps towards tackling some of the most serious forms of online harm: child abuse, terrorism, harassment, death threats, and content that is harming children across the UK today. We could do what John Stuart Mill wrote was the core duty of Government. The right to self-determination is not unlimited. An action that results in doing harm to another is not only wrong, but wrong enough that the state can intervene to prevent that harm from occurring. We do that in every other part of our life. We erect streetlamps to make our cities and towns safer. We put speed limits on our roads and make seatbelts compulsory. We make small but necessary changes to protect people from grievous harm. Now it is time to bring in some fundamental protections online.
We have the legislation ready right now in the form of the Online Safety Bill. All we have to do is pass it. I am proud to commend the Bill to the House.
In the interest of time, I will just pose a number of questions, which I hope the Minister might address in summing up. The first is about the scope of the Bill. The Joint Committee of which I was a member recommended that the age-appropriate design code, which is very effectively used by the Information Commissioner, be used as a benchmark in the Bill, so that any services accessed or likely to be accessed by children are regulated for safety. I do not understand why the Government rejected that suggestion, and I would be pleased to hear from the Minister why they did so.
Secondly, the Bill delegates lots of detail to statutory instruments, codes of practice from the regulator, or later decisions by the Secretary of State. Parliament must see that detail before the Bill becomes an Act. Will the Minister commit to those delegated decisions being published before the Bill becomes an Act? Could he explain why the codes of practice are not being set as mandatory? I do not understand why codes of practice, much of the detail of which the regulator is being asked to set, will not be made mandatory for businesses. How can minimum standards for age or identity verification be imposed if those codes of practice are not made mandatory? Perhaps the Minister could explain.
Many users across the country will want to ensure that their complaints are dealt with effectively. We recommended an ombudsman service that dealt with complaints that were exhausted through a complaints system at the regulated companies, but the Government rejected it. Please could the Minister explain why?
I was pleased that the Government accepted the concept of the ability for a super-complaint to be brought on behalf of groups of users, but the decision as to who will be able to bring a super-complaint has been deferred, subject to a decision by the Secretary of State. Why, and when will that decision be taken? If the Minister could allude to who they might be, I am sure that would be welcome.
Lastly, there are a number of exemptions and more work to be done, which leaves significant holes in the legislation. There is much more work to be done on clauses 5, 6 and 50—on democratic importance, on journalism and the definition of journalism, on the exemptions for news publishers, and on disinformation, which is mentioned only once in the entire Bill. I and many others recognise that these are not easy issues, but they should be considered fully before legislation is proposed that has gaping holes for people who want to get around it, and for those who wish to test the parameters of this law in the courts, probably for many years. All of us, on a cross-party basis in this House, support the Government’s endeavours to make it safe for children and others to be online. We want the legislation to be implemented as quickly as possible and to be as effective as possible, but there are significant concerns that it will be jammed up in the judicial system, where this House is unacceptably giving judges the job of fleshing out what many of the important exemptions will mean in practice.
The idea that the Secretary of State has the power to intervene with the independent regulator and tell it what it should or should not do obviously undermines the idea of an independent regulator. While Ministers might give assurances to this House that the power will not be abused, I believe that other countries, whether China, Russia, Turkey or anywhere else, will say, “Look at Great Britain. It thinks this is an appropriate thing to do. We’re going to follow the golden precedent set by the UK in legislating on these issues and give our Ministers the ability to decide what online content should be taken down.” That seems a dangerous precedent.
indicated dissent.
The Minister is shaking his head, but I can tell him that the legislation does do that, because we looked at this and took evidence on it. The Secretary of State would be able to tell the regulator that content should be “legal but harmful” and therefore should be removed as part of its systems design online. We also heard that the ability to do that at speed is very restricted and therefore the power is ineffective in the first place. Therefore, the Government should evidently change their position on that. I do not understand why, in the face of evidence from pretty much every stakeholder, the Government agree that that is an appropriate use of power or why Parliament would vote that through.
I look forward to the Minister giving his answers to those questions, in the hope that, as the Bill proceeds through the House, it can be tidied up and made tighter and more effective, to protect children and adults online in this country.
The piece of legislation before the House this evening is truly groundbreaking, because no other jurisdiction anywhere in the world has attempted to legislate as comprehensively as we are beginning to legislate here. For too long, big tech companies have exposed children to risk and harm, as evidenced by the tragic suicide of Molly Russell, who was exposed to appalling content on Instagram, which encouraged her, tragically, to take her own life. For too long, large social media firms have allowed illegal content to go unchecked online.
I have spoken before about dangerous suicide-related content online. The Minister mentions larger platforms. Will the Government go away and bring back two amendments based on points made by the Samaritans? One would bring smaller platforms within the scope of sanctions, and the second would make the protective aspects of the Bill cover people who are over 18, not just those who are under 18. If the Government do that, I am sure that it will be cause for celebration and that Members on both sides of the House will give their support.
It is very important to emphasise that, regardless of size, all platforms in the scope of the Bill are covered if there are risks to children.
A number of Members, including the right hon. Member for Barking (Dame Margaret Hodge) and my hon. Friend the Member for Brigg and Goole (Andrew Percy), have raised the issue of small platforms that are potentially harmful. I will give some thought to how the question of small but high-risk platforms can be covered. However, all platforms, regardless of size, are in scope with regard to content that is illegal and to content that is harmful to children.
For too long, social media firms have also arbitrarily censored content just because they do not like it. With the passage of this Bill, all those things will be no more, because it creates parliamentary sovereignty over how the internet operates, and I am glad that the principles in the Bill command widespread cross-party support.
The pre-legislative scrutiny that we have gone through has been incredibly intensive. I thank and pay tribute to the DCMS Committee and the Joint Committee for their work. We have adopted 66 of the Joint Committee’s recommendations. The Bill has been a long time in preparation. We have been thoughtful, and the Government have listened and responded. That is why the Bill is in good condition.
I must make some progress, because I am almost out of time and there are lots of things to reply to.
I particularly thank previous Ministers, who have done so much fantastic work on the Bill. With us this evening are my hon. Friend the Member for Gosport (Dame Caroline Dinenage) and my right hon. Friends the Members for Maldon (Mr Whittingdale) and for Basingstoke (Mrs Miller), but not with us this evening are my right hon. and learned Friend the Member for Kenilworth and Southam (Jeremy Wright), who I think is in America, and my right hon. Friends the Members for Hertsmere (Oliver Dowden) and for Staffordshire Moorlands (Karen Bradley), all of whom showed fantastic leadership in getting the Bill to where it is today. It is a Bill that will stop illegal content circulating online, protect children from harm and make social media firms be consistent in the way they handle legal but harmful content, instead of being arbitrary and inconsistent, as they are at the moment.
I have so many points to reply to that I have to make some progress.
The Bill also enshrines, for the first time, free speech—something that we all feel very strongly about—but it goes beyond that. As well as enshrining free speech in clause 19, it gives special protection, in clauses 15 and 16, for content of journalistic and democratic importance. As my right hon. Friend the Secretary of State indicated in opening the debate, we intend to table a Government amendment—a point that my right hon. Friends the Members for Maldon and for Ashford (Damian Green) asked me to confirm—to make sure that journalistic content cannot be removed until a proper right of appeal has taken place. I am pleased to confirm that now.
We have made many changes to the Bill. Online fraudulent advertisers are now banned. Senior manager liability will commence immediately. Online porn of all kinds, including commercial porn, is now in scope. The Law Commission communication offences are in the Bill. The offence of cyber-flashing is in the Bill. The priority offences are on the face of the Bill, in schedule 7. Control over anonymity and user choice, which was proposed by my hon. Friend the Member for Stroud (Siobhan Baillie) in her ten-minute rule Bill, is in the Bill. All those changes have been made because this Government have listened.
Let me turn to some of the points made from the Opposition Front Bench. I am grateful for the in-principle support that the Opposition have given. I have enjoyed working with the shadow Minister and the shadow Secretary of State, and I look forward to continuing to do so during the many weeks in Committee ahead of us, but there were one or two points made in the opening speech that were not quite right. This Bill does deal with systems and processes, not simply with content. There are risk assessment duties. There are safety duties. There are duties to prevent harm. All those speak to systems and processes, not simply content. I am grateful to the Chairman of the Joint Committee, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), for confirming that in his excellent speech.
If anyone in this House wants confirmation of where we are on protecting children, the Children’s Commissioner wrote a joint article with the Secretary of State in the Telegraph—I think it was this morning—confirming her support for the measures in the Bill.
When it comes to disinformation, I would make three quick points. First, we have a counter-disinformation unit, which is battling Russian disinformation night and day. Secondly, any disinformation that is illegal, that poses harm to children or that comes under the definition of “legal but harmful” in the Bill will be covered. Thirdly, if that is not enough, the Minister for Security and Borders, who is sitting next to me, intends to bring forward legislation at the earliest opportunity to counter hostile state threats more generally. This matter will be addressed in the Bill that he will prepare and bring forward.
I have only four minutes left and there are so many points to reply to. If I do not cover them all, I am very happy to speak to Members individually, because so many important points were made. The right hon. Member for Barking asked who was going to pay for all the Ofcom enforcement. The taxpayer will pay for the first two years while we get ready—£88 million over two years—but after that Ofcom will levy fees on these social media firms, so they will pay for regulating their activities. I have already replied to the point she rightly raised about smaller but very harmful platforms.
My hon. Friend the Member for Meriden (Saqib Bhatti) has been campaigning tirelessly on the question of combating racism. This Bill will deliver what he is asking for.
The hon. Member for Batley and Spen (Kim Leadbeater) and my hon. Friend the Member for Watford (Dean Russell) asked about Zach’s law. Let me take this opportunity to confirm explicitly that clause 150—the harmful communication clause, for where a communication is intended to cause psychological distress—will cover epilepsy trolling. What happened to Zach will be prevented by this Bill. In addition, the Ministry of Justice and the Law Commission are looking at whether we can also have a standalone provision, but let me assure them that clause 150 will protect Zach.
My right hon. Friend the Member for Maldon asked a number of questions about definitions. Companies can move between category 1 and category 2, and different parts of a large conglomerate can be regulated differently depending on their activities. Let me make one point very clear—the hon. Member for Bristol North West (Darren Jones) also raised this point. When it comes to the provisions on “legal but harmful”, neither the Government nor Parliament are saying that those things have to be taken down. We are not censoring in that sense. We are not compelling social media firms to remove content. All we are saying is that they must do a risk assessment, have transparent terms and conditions, and apply those terms and conditions consistently. We are not compelling, we are not censoring; we are just asking for transparency and accountability, which is sorely missing at the moment. No longer will those in Silicon Valley be able to behave in an arbitrary, censorious way, as they do at the moment—something that Members of this House have suffered from, but from which they will no longer suffer once this Bill passes.
The hon. Member for Bristol North West, who I see is not here, asked a number of questions, one of which was about—[Interruption.] He is here; I do apologise. He has moved—I see he has popped up at the back of the Chamber. He asked about codes of practice not being mandatory. That is because the safety duties are mandatory. The codes of practice simply illustrate ways in which those duties can be met. Social media firms can meet them in other ways, but if they fail to meet those duties, Ofcom will enforce. There is no loophole here.
When it comes to the ombudsman, we are creating an internal right of appeal for the first time, so that people can appeal to the social media firms themselves. There will have to be a proper right of appeal, and if there is not, they will be enforced against. We do not think it appropriate for Ofcom to consider every individual complaint, because it will simply be overwhelmed, by probably tens of thousands of complaints, but Ofcom will be able to enforce where there are systemic failures. We feel that is the right approach.
I say to the hon. Member for Plymouth, Sutton and Devonport (Luke Pollard) that my right hon. Friend the Minister for Security and Borders will meet him about the terrible Keyham shooting.
The hon. Member for Washington and Sunderland West (Mrs Hodgson) raised a question about online fraud in the context of search. That is addressed by clause 35, but we do intend to make drafting improvements to the Bill, and I am happy to work with her on those drafting improvements.
I have been speaking as quickly as I can, which is quite fast, but I think time has got away from me. This Bill is groundbreaking. It will protect our citizens, it will protect our children—[Hon. Members: “Sit down!”]—and I commend it to the House.
Question put and agreed to.
Bill accordingly read a Second time.
The Minister just made it. I have rarely seen a Minister come so close to talking out his own Bill.
Online Safety Bill (Programme)
Motion made, and Question put forthwith (Standing Order No. 83A(7)),
That the following provisions shall apply to the Online Safety Bill:
Committal
(1) The Bill shall be committed to a Public Bill Committee.
Proceedings in Public Bill Committee
(2) Proceedings in the Public Bill Committee shall (so far as not previously concluded) be brought to a conclusion on Thursday 30 June 2022.
(3) The Public Bill Committee shall have leave to sit twice on the first day on which it meets.
Consideration and Third Reading
(4) Proceedings on Consideration shall (so far as not previously concluded) be brought to a conclusion one hour before the moment of interruption on the day on which those proceedings are commenced.
(5) Proceedings on Third Reading shall (so far as not previously concluded) be brought to a conclusion at the moment of interruption on that day.
(6) Standing Order No. 83B (Programming committees) shall not apply to proceedings on Consideration and Third Reading.
Other proceedings
(7) Any other proceedings on the Bill may be programmed.—(Michael Tomlinson.)
Question agreed to.
Online Safety Bill (Money)
Queen’s recommendation signified.
Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),
That, for the purposes of any Act resulting from the Online Safety Bill, it is expedient to authorise the payment out of money provided by Parliament of:
(1) any expenditure incurred under or by virtue of the Act by the Secretary of State, and
(2) any increase attributable to the Act in the sums payable under any other Act out of money so provided.—(Michael Tomlinson.)
Question agreed to.
Online Safety Bill (Ways and Means)
Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),
That, for the purposes of any Act resulting from the Online Safety Bill, it is expedient to authorise:
(1) the charging of fees under the Act, and
(2) the payment of sums into the Consolidated Fund.—(Michael Tomlinson.)
Question agreed to.
Deferred Divisions
Motion made, and Question put forthwith (Standing Order No. 41A(3)),
That at this day’s sitting, Standing Order No. 41A (Deferred divisions) shall not apply to the Motion in the name of Secretary Nadine Dorries relating to Online Safety Bill: Carry-over.—(Michael Tomlinson.)
Question agreed to.
(2 years, 6 months ago)
Public Bill Committees
Q
Kevin Bakhurst: This is a really important point, which Richard just tried to make. The Bill gives us a great range of tools to try to prevent harm as far as possible; I just think we need to get expectations right here. Unfortunately, this Bill will not result in no harm of any type, just because of the nature of the internet and the task that we face. We are ambitious about driving constant improvement and stopping and addressing the main harms, but it is not going to stop every harm. We will absolutely focus on the ones that have a significant impact, but unfortunately that is the nature of the web.
Q
“psychological harm amounting to serious distress”?
Therefore, sending somebody a flashing image with the intention of inducing an epileptic fit would likely be caught under this new harmful communications offence in clause 150, even before any separate future offence is introduced.
Richard Wronka: I think we can certainly understand the argument. I think it is important that the Bill is as clear as possible. Ultimately, it is for the courts to decide whether that offence would pick up these kinds of issues that we are talking about around flashing imagery.
Q
You mentioned that you met recently with European regulators. Briefly, because we are short of time, were there any particular messages, lessons or insights you picked up in those meetings that might be of interest to the Committee?
Kevin Bakhurst: Yes, there were a number, and liaising with European regulators and other global regulators in this space is a really important strand of our work. It is often said that this regime is a first globally. I think that is true. This is the most comprehensive regime, and it is therefore potentially quite challenging for the regulator. That is widely recognised.
The second thing I would say is that there was absolute recognition of how advanced we are in terms of the recruitment of teams, which I touched on before, because we have had the funding available to do it. Many countries around Europe have recruited between zero and 10 people and are imminently going to take on some of these responsibilities under the Digital Services Act, so I think they are quite jealous.
The last thing is that we see continued collaboration with other regulators around the world as a really important strand, and we welcome the information-sharing powers that are in the Bill. There are some parallels, and we want to take similar approaches on areas such as transparency, where we can collaborate and work together. I think it is important—
Order. I am afraid we have come to the end of the allotted time for questions. On behalf of the Committee, I thank our witnesses for their evidence.
Examination of Witnesses
Dame Rachel de Souza, Lynn Perry MBE and Andy Burrows gave evidence.
Q
Dame Rachel de Souza: It is a massive concern to parents. Parents talk to me all the time about their worries: “Do we know enough?” They have that anxiety, especially as their children turn nine or 10; they are thinking, “I don’t even know what this world out there is.” I think that our conversations with 16 to 21-year-olds were really reassuring, and we have produced a pamphlet for parents. It has had a massive number of downloads, because parents absolutely want to be educated in this subject.
What did young people tell us? They told us, “Use the age controls; talk to us about how much time we are spending online; keep communication open; and talk to us.” Talk to children when they’re young, particularly boys, who are likely to be shown pornography for the first time, even if there are parental controls, around the age of nine or 10. So have age-appropriate conversations. There was some very good advice about online experiences, such as, “Don’t worry; you’re not an expert but you can talk to us.” I mean, I did not grow up with the internet, but I managed parenting relatively well—my son is 27 now. I think this is a constant concern for parents.
I do think that the tech companies could be doing so much more to assist parents in digital media literacy, and in supporting them in how to keep their child safe. We are doing it as the Office of the Children’s Commissioner. I know that we are all trying to do it, but we want to see everyone step up on this, particularly the tech companies, to support parents on this issue.
Q
Could you outline for the Committee the areas where you think the Bill, as currently drafted, contains the most important provisions to protect children?
Dame Rachel de Souza: I was really glad to see, in the rewrite of the Online Safety Bill, a specific reference to the role of age assurance to prevent children from accessing harmful content. That has come across strongly from children and young people, so I was very pleased to see that. It is not a silver bullet, but for too long children have been using entirely inappropriate services. The No. 1 recommendation from the 16 to 21-year-olds, when asked what they wish their parents had known and what we should do, was age assurance, if you are trying to protect a younger sibling or are looking at children, so I was pleased to see that. Companies cannot hope to protect children if they do not know who the children are on their platforms, so I was extremely pleased to see that.
Q
Dame Rachel de Souza: Absolutely. I have called together the tech companies. I have met the porn companies, and they reassured me that as long as they were all brought into the scope of this Bill, they would be quite happy as this is obviously a good thing. I brought the tech companies together to challenge them on their use of age assurance. With their artificial intelligence and technology, they know the age of children online, so they need to get those children offline. This Bill is a really good step in that direction; it will hold them to account and ensure they get children offline. That was a critically important one for me.
I was also pleased to see the holding to account of companies, which is very important. On full coverage of pornography, I was pleased to see the offence of cyber-flashing in the Bill. Again, it is particularly about age assurance.
What I would say is that nudge is not working, is it? We need this in the Bill now, and we need to get it there. In my bit of work with those 2,000 young people, we asked what they had seen in the last month, and 40% of them had not had bad images taken down. Those aspects of the Bill are key.
Andy Burrows: This is a landmark Bill, so we thank you and the Government for introducing it. We should not lose sight of the fact that, although this Bill is doing many things, first and foremost it will become a crucial part of the child protection system for decades to come, so it is a hugely important and welcome intervention in that respect.
What is so important about this Bill is that it adopts a systemic approach. It places clear duties on platforms to go through the process of identifying the reasonably foreseeable harms and requiring that reasonable steps be taken to mitigate them. That is hugely important from the point of view of ensuring that this legislation is future-proofed. I know that many companies have argued for a prescriptive checklist, and then it is job done—a simple compliance job—but a systemic approach is hugely important because it is the basis upon which companies have very clear obligations. Our engagement is very much about saying, “How can we make sure this Bill is the best it can possibly be?” But that is on the bedrock of that systemic approach, which is fundamental if we are to see a culture shift in these companies and an emphasis on safety by design—designing out problems that do not have to happen.
I have engaged with companies where child safety considerations are just not there. One company told me that grooming data is a bad headline today and tomorrow’s chip shop wrapper. A systemic approach is the key to ensuring that we start to address that balance.
Q
I would like to turn to a one or two points that came up in questioning, and then I would like to probe a couple of points that did not. Dame Rachel mentioned advocacy and ensuring that the voice of particular groups—in this context, particularly that of children—is heard. In that context, I would like to have a look at clause 140, which relates to super-complaints. Subsection (4) says that the Secretary of State can, by regulations, nominate which organisations are able to bring super-complaints. These are complaints whereby you go to Ofcom and say that there is a particular company that is failing in its systemic duties.
Subsection (4) makes it clear that the entities nominated to be an authorised super-complainant would include
“a body representing the interests of users of regulated services”,
which would obviously include children. If an organisation such as the Office of the Children’s Commissioner or the NSPCC—I am obviously not prejudicing the future process—were designated as a super-complainant that was able to bring super-complaints to Ofcom, would that address your point about the need for proper advocacy for children?
Dame Rachel de Souza: Absolutely. I stumbled over that a bit when Maria asked me the question, but we absolutely need people who work with children, who know children and are trusted by children, and who can do that nationally in order to be the super-complainants. That is exactly how I would envisage it working.
Andy Burrows: The super-complaint mechanism is part of the well-established arrangements that we see in other sectors, so we are very pleased to see that that is included in the Bill. I think there is scope to go further and look at how the Bill could mirror the arrangements that we see in other sectors—I mentioned the energy, postal and water sectors earlier as examples—so that the statutory user advocacy arrangements for inherently vulnerable children, including children at risk of sexual abuse, mirror the arrangements that we see in those other sectors. That is hugely important as a point of principle, but it is really helpful and appropriate for ensuring that the legislation can unlock the positive regulatory outcomes that we all want to see, so I think it contributes towards really effective regulatory design.
Q
Dame Rachel de Souza: Yes, and I was so pleased to see that. The regulator needs to have teeth for it to have any effect—I think that is what we are saying. I want named senior managers to be held accountable for breaches of their safety duties to children, and I think that senior leaders should be liable to criminal sanctions when they do not uphold their duty of care to children.
Q
I will put my last two questions together. Are you concerned about the possibility that encryption in messaging services might impede the automatic scanning for child exploitation and abuse images that takes place, and would you agree that we cannot see encryption happen at the expense of child safety? Secondly, in the context of the Molly Russell reference earlier, are you concerned about the way that algorithms can promote and essentially force-feed children very harmful content? Those are two enormous questions, and you have only two minutes to answer them, so I apologise.
Dame Rachel de Souza: I am going to say yes and yes.
Andy Burrows: I will say yes and yes as well. The point about end-to-end encryption is hugely important. Let us be clear: we are not against end-to-end encryption. Where we have concerns is about the risk profile that end-to-end encryption introduces, and that risk profile, when we are talking about it being introduced into social networking services and bundled with other sector functionality, is very high and needs to be mitigated.
About 70% of child abuse reports could be lost if Meta goes ahead. That is 28 million reports in the past six months, so it is very important that the Bill can require companies to demonstrate that, if they are running such services, they can acquit themselves in terms of their risk assessment processes. We really welcome the simplified child sexual exploitation warning notices in the Bill, which will give Ofcom the power to intervene when companies have not demonstrated that they can introduce end-to-end encryption in a safe and effective way.
One area in which we would like to see the Bill—
Order. I am afraid that brings us to the end of the time allotted for the Committee to ask questions of this panel. On behalf of the Committee, I thank our witnesses for their evidence, and I am really sorry that we could not get Lynn Perry online. Could we move on to the last panel? Thank you very much.
Examination of Witnesses
Ben Bradley and Katy Minshall gave evidence.
Q
Katy Minshall: As I say, we share your policy objective of giving users more choice. For example, at present we are testing a tool where Twitter automatically blocks abusive accounts on your behalf. We make the distinction based on an account’s behaviour and not on whether it has verified itself in some way.
Q
I do not think that the concept would necessarily operate as you suggested at the beginning. You suggested that people might end up not seeing content posted by the Prime Minister or another public figure. The concept is that, assuming a public figure would choose to verify themselves, content that they posted would be visible to everybody because they had self-verified. The content in the other direction may or may not be, depending on whether the Prime Minister or the Leader of the Opposition chose to see all content or just verified content, but their content—if they verified themselves—would be universally visible, regardless of whatever choice anyone else exercised.
Katy Minshall: Yes, sorry if I was unclear. I totally accept that point, but it would mean that some people would be able to reply to Boris Johnson and others would not. I know we are short on time, but it is worth pointing out that in a YouGov poll in April, nearly 80% of people said that they would not choose to provide ID documents to access certain websites. The requirements that you describe are based on the assumption that lots of people will choose to do it, when in reality that might not be the case.
A public figure might think, “Actually, I really appreciate that I get retweets, likes and people replying to my tweets,” but if only a small number of users have taken the opportunity to verify themselves, that is potentially a disincentive even to use this system in the first place—and all the while we were creating such a system, we could have been investing in or trying to develop new solutions, such as safety mode, which I described and which tries to prevent abusive users from interacting with you.
Q
Ben, you talked about the age verification measures that TikTok currently takes. For people who do not come via an age-protected app store, it is basically self-declared. All somebody has to do is type in a date of birth. My nine-year-old children could just type in a date of birth that was four years earlier than their real date of birth, and off they would go on TikTok. Do you accept that that is wholly inadequate as a mechanism for policing the age limit of 13?
Ben Bradley: That is not the end of our age assurance system; it is just the very start. Those are the first two things that we have to prevent sign-up, but we are also proactive in surfacing and removing under-age accounts. As I said, we publish every quarter how many suspected under-13s get removed.
Q
Ben Bradley: It is based on a range of signals that they have available to them. As I said, we publish a number every quarter. In the last quarter, we removed 14 million users across the globe who were suspected to be under the age of 13. That is evidence of how seriously we take the issue. We publish that information because we think it is important to be transparent about our efforts in this space, so that we can be judged accordingly.
Q
Earlier, we debated content of democratic importance and the protections that that and free speech have in the Bill. Do you agree that a requirement to have some level of consistency in the way that that is treated is important, particularly given that there are some glaring inconsistencies in the way in which social media firms treat content at the moment? For example, Donald Trump has been banned, while flagrant disinformation by the Russian regime, lying about what they are doing in Ukraine, is allowed to propagate—including the tweets that I drew to your attention a few weeks ago, Katy.
Katy Minshall: I agree that freedom of expression should be top of mind as companies develop safety and policy solutions. Public interest should always be considered when developing policies. From the perspective of the Bill, I would focus on freedom of expression for everyone, and not limit it to content that could be related to political discussions or journalistic content. As Ben said, there are already wider freedom of expression duties in the Bill.
Q
Katy Minshall: Sorry, but I do not know the Bill in those terms, so you would have to tell me the definition.
Order. I am afraid that that brings us to the end of the time allotted for the Committee to ask questions in this morning’s sitting. On behalf of the Committee, I thank our witnesses for their evidence. We will meet again at 2 pm in this room to hear further oral evidence.
(2 years, 6 months ago)
Public Bill Committees
I am sorry, but I must move on. Minister, I am afraid you only have five minutes.
Q
Richard Earley: What information are you referring to?
Data, in particular on how your algorithms promote particular kinds of content.
Richard Earley: We already do things like that through the direct opportunity that anyone has to see why a single post has been chosen for them in their feed. You can click on the three dots next to any post and see that. For researcher access and support, as I mentioned, we have contributed to the publishing of more than 400 reports over the last year, and we want to do more of that. In fact, the Bill requires Ofcom to conduct a report on how to unlock those sorts of barriers, which we think should be done as soon as possible. Yes, in general we support that sort of research.
I would like to say one thing, though. I have worked at Facebook—now Meta—for almost five years, and nobody at Facebook has any obligation, or any moral incentive, to do anything other than provide people with the best, most positive experience on our platform, because we know that if we do not give people a positive experience, through algorithms or anything else, they will leave our platform and will not use it. They tell us that and they do it, and the advertisers who pay for our services do not want to see harmful content on our platforms either. All of our incentives are aligned with yours: to ensure that our users have a safe and positive experience on our platforms.
Q
Richard Earley: I am afraid to say that that is not correct. We have multiple algorithms on our services. Many of them, in fact, do the opposite of what you have just described: they identify posts that might be violent, misleading or harmful and reduce the prevalence of them within our feed products, our recommendation services and other parts of the service.
We optimise the algorithm that shows people things for something called meaningful social interaction. That is not just pure engagement; in fact, since a large change we made to our algorithms in 2018, its focus is on the kinds of activities online that research shows are correlated with positive wellbeing outcomes. Joining a group in your local area, or deciding to go to an event started by one of your friends: that is what our algorithms are designed to promote. When we made that switch in 2018, we saw a decrease of more than 50 million hours of Facebook use every day as a result. That is not the action of a company focused purely on maximising engagement; it is the action of a company focused on giving our users a positive experience on our platform.
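To make the distinction the witness draws concrete, here is a minimal sketch of what ranking for "meaningful social interaction" rather than raw engagement can look like. The signal names and weights below are illustrative assumptions, not Meta's actual model; the sketch shows only the general shape of weighting interactions between people above passive consumption.

```python
# Illustrative sketch only: the signals and weights are assumptions,
# not Meta's real ranking model. It shows the general idea of scoring
# "meaningful" interactions (comments, shares among friends) above
# passive consumption.

from dataclasses import dataclass

@dataclass
class PostSignals:
    predicted_like: float      # probability the viewer will like the post
    predicted_comment: float   # probability of a substantive comment
    predicted_share: float     # probability of a share to friends
    from_close_friend: bool    # poster is in the viewer's close network

# Hypothetical weights: person-to-person interactions score far higher
# than passive signals, mirroring the 2018 shift described above.
MSI_WEIGHTS = {"like": 1.0, "comment": 15.0, "share": 5.0}

def msi_score(p: PostSignals) -> float:
    score = (MSI_WEIGHTS["like"] * p.predicted_like
             + MSI_WEIGHTS["comment"] * p.predicted_comment
             + MSI_WEIGHTS["share"] * p.predicted_share)
    if p.from_close_friend:
        score *= 2.0  # boost content from people rather than pages
    return score

def rank_feed(posts: list[PostSignals]) -> list[PostSignals]:
    # Highest "meaningful social interaction" score first.
    return sorted(posts, key=msi_score, reverse=True)
```

The design point is simply that the objective function changes: a post predicted to spark a comment from a friend can outrank a post predicted to be watched passively for longer, which is consistent with total time spent falling after such a switch.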
Q
Richard Earley: No, because as I just said, when we sent the algorithm this instruction to focus on social interaction it actually decreased the amount of time people spent on our platform.
Q
Richard Earley: As I said, it is about ensuring that people who spend time on our platform come away feeling that they have had a positive experience.
Q
Richard Earley: I think that a really valuable part of the Bill that we are here to discuss is the fact that Ofcom will be required, and we in our risk assessments will be required, to consider the impact on the experience of our users of multiple different algorithms, of which we have hundreds. We build those algorithms to ensure that we reduce the prevalence of harmful content and give people the power to connect with those around them and build community. That is what we look forward to demonstrating to Ofcom when this legislation is in place.
Q
Katie O'Donovan: I welcome the opportunity to address the Committee; it is so important that this Bill has parliamentary scrutiny. It is a Bill that DCMS has spent a lot of time on, getting the systems and the frameworks right, but it will lead to a fundamentally different internet for UK users versus the rest of the world, and it is one of the most complicated Bills we are seeing anywhere in the world. I realise that it is very important to scrutinise what we as platforms are doing, but it is really important to also look at the substance of the Bill. If we have time, I would welcome the chance to give a little feedback on that substance too.
Becky Foreman: I would add that the Committee has spent a lot of time talking to Meta, which is obviously a big focus for the Bill, but it is important to remember that numerous other networks and services, very different from Meta, will potentially also be caught by it.
While the Bill is proportionate in its measures, it is not designed to impose undue burdens on companies that are not high risk. I have one more question for Richard. I think Katie was saying that she wanted to make a statement?
We are out of time. I am sorry about this; I regard it as woefully unsatisfactory. We have three witnesses here, a lot of questions that need answering, and not enough time to do it. However, we have a raft of witnesses coming in for the rest of the day, so I am going to have to draw a line under this now. I am very grateful to you for taking the trouble to come; the Committee is indebted to you. You must have the opportunity to make your case, so would you be kind enough to put any comments that you wish to make in writing, so that the Committee can have them? Feel free to go as broad as you would like, because I feel very strongly that you have been short-changed this afternoon. Thank you very much indeed.
Richard Earley: We will certainly do that and look forward to providing comments in writing.
Examination of Witnesses
Professor Clare McGlynn, Jessica Eagelton and Janaya Walker gave evidence.
Q
“only entitled to conclude that it is not possible for children to access a service…if there are systems or processes in place…that achieve the result that children are not normally able to access the service”.
Ofcom will then interpret in codes of practice what that means practically. Professor McGlynn, do you think that standard set out there—
“the result that children are not normally able to access the service or that part of it”
—is sufficiently high to address the concerns we have been discussing in the last few minutes?
Professor Clare McGlynn: At the moment, the wording on age assurance in part 5, which covers the pornography providers, is slightly different from the wording in the other safety duties. That is one technicality that could be amended. As for whether the provision you just talked about is sufficient, in truth it comes down to exactly what is required, and we do not yet know what the age verification or age assurance requirements will actually be.
I do not know what that will mean in practice for something like Twitter: what will it have to do to change? In principle, that terminology is possibly sufficient, but it depends on what it comes to mean in those codes of practice, and we do not yet know, because all the Bill gives us is “age assurance” and “age verification”.
Q
Professor Clare McGlynn: My understanding as well is that those terms are, at the moment, being interpreted slightly differently in terms of the requirements that people will be under. I am just making a point about it probably being easier to harmonise those terms.
Q
Professor Clare McGlynn: I read your piece in The Times this morning, which was a robust defence of the legislation, in that it said that it is no threat to freedom of speech, but I hope you read my quote tweet, in which I emphasised that there is a strong case to be made for regulation to free the speech of many others, including women and girls and other marginalised people. The current lack of regulation restricts women’s freedom of speech: we fear going online because of the abuse we might encounter. Regulation frees speech, and your Bill does not unduly limit freedom of speech.
Q
Professor Clare McGlynn: There are many ways in which speech is regulated. The social media companies already make choices about what speech is online and offline. There are strengths in the Bill, such as the ability to challenge when material is taken offline, because that can impact on women and girls as well. They might want to put forward a story about their experiences of abuse, for example. If that gets taken down, they will want to raise a complaint and have it swiftly dealt with, not just left in an inbox.
There are lots of ways in which speech is regulated, and the idea of having a binary choice between free speech and no free speech is inappropriate. Free speech is always regulated, and it is about how we choose to regulate it. I would keep making the point that the speech of women and girls and other marginalised people is minimised at the moment, so we need regulation to free it. The House of Lords and various other reports about free speech and regulation, for example, around extreme pornography, talk about regulation as being human-rights-enhancing. That is the approach we need to take.
Thank you very much indeed. Once again, I am afraid I have to draw the session to a close, and once again we have probably not covered all the ground we would have liked. Professor McGlynn, Ms Walker, Ms Eagelton, thank you very much indeed. As always, if you have further thoughts or comments, please put them in writing and let us know. We are indebted to you.
Examination of Witnesses
Lulu Freemont, Ian Stevenson and Adam Hildreth gave evidence.
Q
Adam Hildreth: I had covid at the time, yes.
Covid struck. I would like to ask Adam and Ian in particular about the opportunities provided by emerging and new technology to deliver the Bill’s objectives. I would like you both to give examples of where you think new tech can help deliver these safety duties. I ask you to comment particularly on what it might do on, first, age assurance—which we debated in our last session—and secondly, scanning for child sexual abuse images in an end-to-end encrypted environment. Adam, do you want to go first?
Adam Hildreth: Well, if Ian goes first, the second question would be great for him to answer, because we worked on it together.
Fair enough. Ian?
Ian Stevenson: Yes, absolutely. The key thing to recognise is that there is a huge and growing cohort of companies, around the world but especially in the UK, that are working on technologies precisely to try to support those kinds of safety measures. Some of those have been supported directly by the UK Government, through the safety tech challenge fund, to explore what can be done around end-to-end encrypted messaging. I cannot speak for all the participants, but I know that many of them are members of the safety tech industry association.
Between us, we have demonstrated a number of different approaches. My own company, Cyacomb, demonstrated technology that could block known child abuse within encrypted messaging environments without compromising the privacy of users’ messages and communications. Other companies in the UK, including DragonflAI and Yoti, demonstrated solutions based on detecting nudity and looking at the ages of the people in those images, which are again hugely valuable in this space. Until we know exactly what the regulation is going to demand, we cannot say exactly what the right technology to solve it is.
However, I think that the fact that that challenge alone produced five different solutions looking at the problem from different angles shows just how vibrant the innovation ecosystem can be. My background in technology is long and mixed, and I have seen a number of sectors emerge, including cyber-security and fintech, where, once the foundations for change have been created, the capacity of innovators to come up with answers to difficult questions is enormous.
There are a couple of potential barriers to that. The strength of the regulation is that it is future-proof. However, until we start answering the questions, “What do we need to do, and when? What will platforms need to do, and when will they need to do it?”, we do not create in the commercial market the innovation drivers, or the drivers for investment, for the technical solutions that will deliver this. It is really important to be as specific as we can about what needs to be done and when.
The other potential barrier is regulation itself. We have already had a comment about how there should be a prohibition on general monitoring, and we have seen what has happened in the EU recently over concerns about safety technologies that look at traffic on services. We need to be really clear that, while safety technologies must protect privacy, there needs to be a mechanism by which companies can understand when they may deploy safety technologies. At the moment, when we talk to potential customers for safety technologies, some are unclear whether deploying them would be proportionate under, for example, data protection law. Even within the safety tech challenge fund work on end-to-end encrypted messaging, it was unclear whether some of the technologies, however brilliant they were at preventing child abuse in encrypted environments, would be deployable under current data protection and privacy of electronic communications regulations.
There are questions there. We need to make sure that when the Online Safety Bill comes through, it makes clear what is required and how it fits together with other regulations to enable that. Innovators can do almost anything if you give them time and space. They need the certainty of knowing what is required, and an environment where solutions can be deployed and delivered.
Q
Adam Hildreth: I agree with Ian that the level of innovation is amazing. When we start talking about age verification and end-to-end encryption, for me—I am going to use that same phrase, risk assessment, again—it absolutely depends on the type of service, who is using the service and who is exploiting it, as to which safety technologies should be employed. It is dangerous to say, “We are demanding this type of technology, or this specific technology, to be deployed in this type of instance,” because that removes the responsibility from the people who are creating the service.
Q
Adam Hildreth: Absolutely. Sorry, I was saying that I agree with how it has been worded. We know what is available, but technology and solutions change all the time; we can do things in really innovative ways. However, the risk assessment has to weigh freedom of speech against the types of abuse users are at risk of. Is it children who are at risk, and if so, what are they at risk from? That changes the space massively compared with, say, some adult gaming communities, where what is harmful is very different from what harms other audiences. That should dictate what system and technology is deployed, and once we understand what best of breed looks like for each type of company, we will know what good is.
Q
Adam Hildreth: The technology is there. It exists and it is absolutely deployable in the environments that need it. I am sure Ian would agree; we have seen it and done a lot of testing on it. The technology exists in the environments that need it.
Q
Adam Hildreth: There are ways that can work. Again, it brings in freedom of expression, global businesses and some other areas, so it is more about regulation and consumer concerns about the security of data, rather than whether technological solutions are available.
Ms Freemont, Mr Hildreth and Mr Stevenson, thank you all very much indeed. We have run out of time. As ever, if you have any further observations that you wish to make, please put them in writing and let the Committee have them; we shall welcome them. Thank you for your time this afternoon. We are very grateful to you.
Examination of Witnesses
Jared Sine, Nima Elmi and Dr Rachel O’Connell gave evidence.
Right. For once, we seem to have run out of questions. Minister, do you wish to contribute?
Everything I was going to ask has already been asked by my colleagues, so I will not duplicate that.
Q
Jared Sine: I would just make one brief comment. I think it has been mentioned by everyone here. Everyone has a role to play. Clearly, the Government have a role in proposing and pushing forward the legislation. The platforms that have the content have an obligation and a responsibility to try to make sure that their users are safe. One of the things that Dr O’Connell mentioned is age verification and trying to make sure that we keep young kids off platforms where they should not be.
I think there is a big role to play for the big tech platforms—the Apples and Googles—who distribute our apps. Over the years, we have said again and again to both of those companies, “We have age-gated our apps at 18, yet you will allow a user you know is 15, 14, 16—whatever it is—to download that app. That person has entered that information and yet you still allow that app to be downloaded.” We have begged and pleaded with them to stop and they will not stop. I am not sure that that can be included in the Bill, but if it could be, it would be powerful.
If Apple and Google could not distribute any of our apps—Hinge, Match, Tinder—to anyone under the age of 18, that solves it right there. It is the same methodology that has been used at clubs with bouncers—you have a bouncer at the door who makes sure you are 21 before you go in and have a drink. It should be the same thing with these technology platforms. If they are going to distribute and have these app stores, the store should then have rules that show age-gated apps—“This is for 17-plus or 18-plus”—and should also enforce that. It is very unfortunate that our calls on this front have gone unanswered. If the Bill could be modified to include that, it would really help to address the issue.
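Mechanically, the "bouncer at the door" model described here is a simple gate applied at distribution time, using the age the app store already holds for the account. A minimal sketch follows; the field names and exception type are hypothetical, not any store's real API.

```python
# Minimal sketch of app-store-level age gating, as the witness proposes.
# Field names and the exception type are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class AppListing:
    name: str
    min_age: int  # developer-declared age gate, e.g. 18 for dating apps

@dataclass
class StoreAccount:
    user_id: str
    declared_age: int  # the age the store already holds for this user

class AgeGateError(Exception):
    pass

def authorise_download(account: StoreAccount, app: AppListing) -> None:
    # The store already knows the user's age; the proposal is simply
    # that it refuses to distribute age-gated apps below that age.
    if account.declared_age < app.min_age:
        raise AgeGateError(
            f"{app.name} is rated {app.min_age}+; account holder is "
            f"{account.declared_age}."
        )

# Usage: authorise_download(StoreAccount("u1", 15), AppListing("Tinder", 18))
# would raise AgeGateError rather than allowing the install.
```

The witnesses' point is that the store, unlike the individual app, already holds the user's declared age, so the check can be enforced once, at install time, for every age-gated app.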
Dr Rachel O'Connell: Absolutely. I 100% support that. There is a tendency for people to say, “It is very complex; we need a huge amount of further consultation.” I started my PhD in 1996, and this stuff has been going on for all that time. In 2008, there was the huge push by the Attorneys General, which I mentioned already, that brought all of the industry together; the Internet Safety Strategy Green Paper was 2017; and we are now in 2022. We know what the risks are. We understand the systems and processes that facilitate them, and we understand what needs to be done to mitigate those risks and harms. Let us keep on the track that we are on.
Regarding industry’s concerns, a lot of them will be ironed out when companies are required to conduct risk assessments and impact assessments. They might ask, what are the age bands of your users? What are the risks associated with the product features that you are making available? What are the behaviour modification techniques that you are using, like endless scroll and loot boxes that get kids completely addicted? Are those appropriate for those ages? Then you surface the decision making within the business that results in harms and also the mitigations.
I urge you to keep going on this; do not be deterred. Keep the timeframe within which it comes into law fairly tight, because there are children out there who are suffering. As for the harassment, I have experienced it myself, and it is horrible.
Those would be my final words.
Q
Rhiannon-Faye McDonald: It is incredibly important that we have this education piece. Like Susie said, we cannot rely on technology or any single part of this to solve child sexual abuse, and we cannot rely on the police to arrest their way out of the problem. Education really is the key. That is education in all areas—educating the child in an appropriate way and educating parents. We hold parenting workshops. Parents are terrified; they do not know what to do, what platforms are doing what, or what to do when things go wrong. They do not even know how to talk to children about the issue; it is embarrassing for them and they cannot bring it up. Educating parents is a huge thing. Companies have a big responsibility there. They should have key strategies in place on how they are going to improve education.
Q
I would like to pick up on a point that has arisen in the discussion so far—the point that Susie raised about the risks posed by Meta introducing end-to-end encryption, particularly on the Facebook Messenger service. You have referenced the fact that huge numbers of child sexual exploitation images are identified by scanning those communications, leading to the arrests of thousands of paedophiles each year. You also referenced the fact that when this was temporarily turned off in Europe owing to the privacy laws there—briefly, thankfully—there was a huge loss of information. We will come on to the Bill in a minute, but as technology stands now, if Meta did proceed with end-to-end encryption, would that scanning ability be lost?
Susie Hargreaves: Yes. It would not affect the Internet Watch Foundation, but it would affect the National Centre for Missing and Exploited Children. Facebook, as a US company, has a responsibility to do mandatory reporting to NCMEC, which will be brought in with the Bill in this country. Those millions of images would be lost, as of today, if they brought end-to-end encryption in now.
Q
Susie Hargreaves: Because they are scanning Facebook—sorry, I am just trying to unpack the way it works. It will affect us, actually. When we provide our hash list to Facebook, it uses it to scan Messenger, but the matches that are found are reported into NCMEC, not to us. For those of you who do not know about hashing, a hash list is a list of digital fingerprints of unique images of child sexual abuse; we currently hold about 1.3 million of them. So yes, it does affect us: Facebook would still take our hash list to use on other platforms, but it would not use it on Messenger, and we do not know how many matches it gets against our list, because those go into NCMEC.
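As a rough illustration of the hash matching described here: the platform holds a list of digital fingerprints of known child sexual abuse images and compares each upload against it, reporting matches to the relevant body rather than to the uploader. The sketch below is deliberately simplified and uses an exact cryptographic hash to stay self-contained; real deployments use perceptual hashes (PhotoDNA-style fingerprints) that still match after resizing or re-encoding, which a plain SHA-256 does not.

```python
# Simplified sketch of hash-list matching. Real systems use perceptual
# hashing so that resized or re-encoded copies still match; SHA-256 is
# used here only to keep the example self-contained.

import hashlib

def fingerprint(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

class HashListScanner:
    def __init__(self, known_hashes: set[str]):
        # e.g. an IWF-style list of ~1.3 million known-image fingerprints
        self.known_hashes = known_hashes

    def is_known_abuse_image(self, image_bytes: bytes) -> bool:
        return fingerprint(image_bytes) in self.known_hashes

# A match would be reported to the relevant body (NCMEC in the US model)
# rather than surfaced to the uploader.
scanner = HashListScanner(known_hashes={fingerprint(b"<known image bytes>")})
assert scanner.is_known_abuse_image(b"<known image bytes>")
```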
Q
Susie Hargreaves: Yes, sorry—I was unclear about that. Yes, it would on Messenger.
Q
Susie Hargreaves: As I said before, it is essential that we do not demonise end-to-end encryption. It is really important. There are lots of reasons why, from a security and privacy point of view, people want to be able to use end-to-end encryption.
In terms of whether the technology is there, we all know that there are things on the horizon. As Ian said in the previous session, the technology is there and is about to be tried out. I cannot give any update at this meeting, but in terms of what we would do if end-to-end encryption is introduced and there is no ability to scan, we could look at on-device scanning, which I believe you mentioned before, Minister.
Yes.
Susie Hargreaves: That is an option. That could be a backstop position. I think that, at the moment, we should stand our ground on this and say, “No, we need to ensure that we have some form of scanning in place if end-to-end encryption is introduced.”
Q
Susie Hargreaves: I agree 100%.
Thank you very much indeed, Ms McDonald and Ms Hargreaves. We are most grateful to you; thank you for your help.
Examination of Witnesses
Ellen Judson and Kyle Taylor gave evidence.
Q
Ellen Judson: At the moment, no. The rights that are discussed in the Bill at the minute are quite limited: primarily, it is about freedom of expression and privacy, and the way that protections around privacy have been drafted is less strong than for those around freedom of expression. Picking up on the question about setting precedents, if we have a Bill that is likely to lead to more content moderation and things like age verification and user identity verification, and if we do not have strong protections for privacy and anonymity online, we are absolutely setting a bad precedent. We would want to see much more integration with existing human rights legislation in the Bill.
Kyle Taylor: All I would add is that if you look at the exception for content of democratic importance, and the idea of “active political issue”, right now, conversion therapy for trans people—that has been described by UN experts as torture—is an active political issue. Currently, the human rights of trans people are effectively set aside because we are actively debating their lives. That is another example of how minority and marginalised people can be negatively impacted by this Bill if it is not more human rights-centred.
Q
Ellen Judson: I accept that that is what the Bill currently says. Our point was thinking about how it will be implemented in practice. If platforms are expected to prove to a regulator that they are taking certain steps to protect content of democratic importance—in the explanatory notes, that is content related to Government policy and political parties—and they are expected to prove that they are taking a special consideration of journalistic content, the most straightforward way for them to do that will be in relation to journalists and politicians. Given that it is such a broad category and definition, that seems to be the most likely effect of the regime.
Kyle Taylor: It is potentially—
Q
Is it not true that a member of the public or anyone debating a legitimate political topic would also benefit from these measures? It is likely that MPs would automatically benefit—near automatically—but a member of the public might equally benefit if the topic they are talking about is of democratic or journalistic importance.
Ellen Judson: Our concern is that defining what counts as legitimate political debate is itself an act of privileging certain speakers. As you said, an MP is very likely to benefit automatically.
Well, it is likely; I would not say it is guaranteed.
Ellen Judson: A member of the public may be discussing something—for example, an active political debate that is not about the United Kingdom, which I believe would be out of scope of that protection. They would be engaged in political discussion and exercising freedom of expression, and if they were not doing so in a way that met the threshold for action based on harm, their speech should also come under those protections.
Kyle Taylor: I would add that the way in which you have described it would be so broad as to be effectively meaningless in the context of the Bill, and that instead we should be looking for universal free expression protections in that part of the Bill, and removing this provision. What, in a liberal democracy, is not speech of democratic importance? Really, that is everything. When does something reach the threshold of being an active political debate? Is it when enough people speak about it, or enough politicians bring it up? It is so subjective and so broad as effectively to mean that everything could qualify. Again, this is not taking a harms-based approach to online safety, because the question is not “Who is saying it?” or “In what context?”; the question is, “Does this have the propensity to cause harm at scale?”
Q
Kyle Taylor: Can I respond to that?
Yes, sure.
Kyle Taylor: My point is that if there is a provision in the Bill about freedom of expression, it should be robust enough that this protection does not have to be in the Bill. To me, this is saying, “Actually, our free expression bit isn’t strong enough, so we’re going to reiterate it here in a very specific context, using very select language”. That may mean that platforms decide not to act for fear of reprisal, as opposed to pursuing online safety. I suggest strengthening the freedom of expression section so that it hits all the points that the Government intend to hit, and removing those qualifiers that create loopholes and uncertainty for a regime that, if it is systems-based, does not have loopholes.
Q
Ellen Judson: We absolutely recognise that. There is discussion in terms of meeting certain standards of responsible journalism in relation to those protections. Our concern is very much that the people and actors who would most benefit from the journalistic protections specifically would be people who do not meet those standards and cannot prove that they meet those standards, because the standards are very broad. If you intend your content to be journalistic, you are in scope, and that could apply to extremists as much as to people meeting standards of responsible journalism.
Q
Kyle Taylor: Remove the exemption.
Q
Kyle Taylor: Well, I am struggling to understand how we can look at the Bill and say, “If this entity says it, it is somehow less harmful than if this entity says it.” That is a two-tiered system and that will not lead to online safety, especially when those entities that are being given privilege are the most likely and largest sources and amplifiers of harmful content online. We sit on the frontlines of this every day, looking at social media, and we can point to countless examples from around the world that will show that, with these exemptions, exceptions and exclusions, you will actually empower those actors, because you explicitly say that they are special. You explicitly say that if they cause harm, it is somehow not as bad as if a normal user with six followers on Twitter causes harm. That is the inconsistency and incoherency in the Bill.
We are talking here about the press, not about politicians—
Kyle Taylor: Yes, but the press and media entities spread a lot of disinformation—
Q
Kyle Taylor: Except that that is inconsistent in the Bill, because you are saying that for broadcast, they must have a licence, but for print press, they do not have to subscribe to an independent standards authority or code. Even within the media, there is this inconsistency within the Bill.
That is a point that applies regardless of the Bill. The fact is that UK broadcast is regulated whereas UK newspapers are not regulated, and that has been the case for half a century. You can debate whether that is right or wrong, but—
Kyle Taylor: We are accepting that newspapers are not regulated then.
Q
Kyle Taylor: I am not suggesting that the freedom of the press is not sacrosanct. Actually, I am expressing the opposite, which is that I believe that it is so sacrosanct that it should be essential to the freedom-of-expression portion of the Bill, and that the press should be set to a standard that meets international human rights and journalistic standards. I want to be really clear that I absolutely believe in freedom of the press, and it is really important that we don’t leave here suggesting that we don’t think that the press should be free—
Q
Ellen Judson: To the media exemption—
To clause 50, “Recognised news publisher”.
Ellen Judson: One of the changes that the Government have indicated that they are minded to make—please correct me if I misunderstood—is to introduce a right to appeal.
Correct.
Ellen Judson: I would very much urge that content not be required to stay online while an appeal is taking place, on the grounds that content left up might then be found to be incredibly harmful, and by the time you have got through an appeals process, it will already have done the damage it was going to do. So if there is to be a right to appeal, and I would urge that there be no right to appeal beyond what is already in the Bill, it would be important not to include a restriction that platforms must carry the content while the appeal process is ongoing.
Kyle Taylor: You could require an independent standards code as a benchmark at least.
Order. I am afraid that brings us to the end of the time allotted for the Committee to ask questions. It also brings us to the end of the day’s sitting. On behalf of the Committee, I thank the witnesses for your evidence. As you ran out of time and the opportunity to frame answers, if you want to put them in writing and offer them to the Minister, I am sure they will be most welcome. The Committee will meet again on Thursday at 11.30 am in this room to hear further evidence on the Bill.
Ordered, That further consideration be now adjourned. —(Steve Double.)
(2 years, 6 months ago)
Public Bill Committees
Q
William Moy: Essentially, the tests are such that almost anyone could pass them. Without opening the Bill, you have to have a standards code, which you can make up for yourself, a registered office in the UK and so on. It is not very difficult for a deliberate disinformation actor to pass the set of tests in clause 50 as they currently stand.
Q
William Moy: This would need a discussion. I have not come here with a draft amendment—frankly, that is the Government’s job. There are two areas of policy thinking over the last 10 years that provide the right seeds and the right material to go into. One is the line of thinking that has been done about public benefit journalism, which has been taken up in the House of Lords Communications and Digital Committee inquiry and the Cairncross review, and is now reflected in recent Charity Commission decisions. Part of Full Fact’s charitable remit is as a publisher of public interest journalism, which is a relatively new innovation, reflecting the Cairncross review. If you take that line of thinking, there might be some useful criteria in there that could be reflected in this clause.
I hate to mention the L-word in this context, but the other line of thinking is the criteria developed in the context of the Leveson inquiry for what makes a sensible level of self-regulation for a media organisation. Although I recognise that that is a past thing, there are still useful criteria in that line of thinking, which would be worth thinking about in this context. As I said, I would be happy to sit down, as a publisher of journalism, with your officials and industry representatives to work out a viable way of achieving your political objectives as effectively as possible.
William Perrin: Such a definition, of course, must satisfy those who are in the industry, so I would say that these definitions need to be firmly industry-led, not simply by the big beasts—for whom we are grateful, every day, for their incredibly incisive journalism—but by this whole spectrum of new types of news providers that are emerging. I have mentioned my experience many years ago of explaining what a blog was to DCMS.
The news industry is changing massively. I should declare an interest: I was involved in some of the work on public-benefit journalism in another capacity. We have national broadcasters, national newspapers, local papers, local broadcasters, local bloggers and local Twitter feeds, all of which form a new and exciting news media ecosystem, and this code needs to work for all of them. I suppose that you would need a very deep-dive exercise with those practitioners to ensure that they fit within this code, so that you achieve your policy objective.
Q
We heard some commentary earlier, I think from Mr Moy, about the need to address misinformation, particularly in a serious situation such as the recent pandemic. I think you were saying that there was a meeting, in March or April 2020, between the then Secretary of State and social media firms to discuss the issue and what steps they might take to deal with it. You said that it was a private meeting and that it should perhaps have happened more transparently.
Do you accept that the powers conferred in clause 146, as drafted, do, in fact, address that issue? They give the Secretary of State powers, in emergency situations—a public health situation or a national security situation, as set out in clause 146(1)—to address precisely that issue of misinformation in an emergency context. Under that clause, it would happen in a way that was statutory, open and transparent. In that context, is it not a very welcome clause?
William Moy: I am sorry to disappoint you, Minister, but no, I do not accept that. The clause basically attaches to Ofcom’s fairly weak media literacy duties, which, as we have already discussed, need to be modernised and made harms-based and safety-based.
However, more to the point, the point that I was trying to make is that we have normalised a level of censorship that was unimaginable in previous generations. A significant part of the pandemic response was, essentially, some of the main information platforms in all of our day-to-day lives taking down content in vast numbers and restricting what we can all see and share. We have started to treat that as a normal part of our lives, and, as someone who believes that the best way to inform debate in an open society is freedom of expression, which I know you believe, too, Minister, I am deeply concerned that we have normalised that. In fact, you referred to it in your Times article.
I think that the Bill needs to step in and prevent that kind of overreach, as well as the triggering of unneeded reactions. In the pandemic, the political pressure was all on taking down harmful health content; there was no countervailing pressure to ensure that the systems did not overreach. We therefore found ridiculous examples, such as police posts warning of fraud around covid being taken down by the internet companies’ automated systems because those systems were set to, essentially, not worry about overreach.
That is why we are saying that we need, in the Bill, a modern, open-society approach to misinformation. That starts with it recognising misinformation in the first place. That is vital, of course. It should then go on to create a modern, harms-based media literacy framework, and to prefer content-neutral and free-speech-based interventions over content-restricting interventions. That was not what was happening during the pandemic, and it is not what will happen by default. It takes Parliament to step in and get away from this habitual, content-restriction reaction and push us into an open-society-based response to misinformation.
William Perrin: Can I just add that it does not say “emergency”? It does not say that at all. It says “reasonable grounds” that “present a threat”—not a big threat—under “special circumstances”. We do not know what any of that means, frankly. With this clause, I get the intent—that it is important for national security, at times, to send messages—but this has not been done in the history of public communication before. If we go back through 50 or 60 years, even 70 years, of Government communication, the Government have bought adverts and put messages transparently in place. Apart from D-notices, the Government have never sought to interfere in the operations of media companies in quite the way that is set out here.
If this clause is to stand, it certainly needs a much higher threshold before the Secretary of State can act—such as who they are receiving advice from. Are they receiving advice from directors of public health, from the National Police Chiefs’ Council or from the national security threat assessment machinery? I should declare an interest; I worked in there a long time ago. It needs a higher threshold and greater clarity, but you could dispense with this by writing to Ofcom and saying, “Ofcom, you should have regard to these ‘special circumstances’. Why don’t you take actions that you might see fit to address them?”
Many circumstances, such as health or safety, are national security issues anyway if they reach a high enough level for intervention, so just boil it all down to national security and be done with it.
Professor Lorna Woods: If I may add something about the treatment of misinformation more generally: I suspect that if misinformation, or some subset of it such as health misinformation, is included in the regime, it will be under the heading of “harmful to adults”. I am picking up on Mr Moy’s point that the appropriate interventions will be more about friction, and about looking at how disinformation is incentivised and spread at an earlier stage, rather than about reactive takedown.
Unfortunately, the measures that the Bill currently envisages for “harmful but legal” seem to focus more on the end point of the distribution chain. We are talking about taking down content and restricting access. Clause 13(4) gives the list of measures that a company could employ in relation to priority content harmful to adults.
I suppose that you could say, “Companies are free to take a wider range of actions”, but my question then is this: where does it leave Ofcom, if it is trying to assess compliance with a safety duty, when a company is doing something that is not envisaged by the Act? For example, taking bot networks offline, if bot networks are thought to be a key factor in the spreading of disinformation—I see that Mr Moy is nodding. A rational response might be, “Let’s get rid of bot networks”, but that, as I read it, does not seem to be envisaged by clause 13(4).
I think that is an example of a more general problem. With “harmful but legal”, we would want to see less emphasis on takedown and more emphasis on friction, but the measures listed as envisaged do not go that far up the chain.
Minister, we have just got a couple of minutes left, so perhaps this should be your last question.
Q
“(b) restricting users’ access to the content;
(c) limiting the recommendation or promotion of the content;
(d) recommending or promoting the content.”
I would suggest that those actions are pretty wide, as drafted.
One of the witnesses—I think it was Mr Moy—talked about what were essentially content-agnostic measures to impede virality, and used the word “friction”. Can you elaborate a little bit on what you mean by that in practical terms?
William Moy: Yes, I will give a couple of quick examples. WhatsApp put a forwarding limit on WhatsApp messages during the pandemic. We knew that WhatsApp was a vector through which misinformation could spread, because forwarding is so easy. They restricted it to, I think, six forwards, and then you were not able to forward the message again. That is an example of friction. Twitter has a note whereby if you go to retweet something but you have not clicked on the link, it says, “Do you want to read the article before you share this?” You can still share it, but it creates that moment of pause for people to make a more informed decision.
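Both examples are content-neutral checks applied at the moment of sharing, which is what makes them "friction" rather than moderation. A minimal sketch of the two mechanisms follows; the forwarding limit of 5 and the prompt wording are assumptions chosen for illustration, as the real WhatsApp and Twitter thresholds and copy differ.

```python
# Content-neutral "friction" sketches. The limit of 5 prior forwards and
# the prompt wording are illustrative assumptions, not the platforms'
# actual parameters.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Message:
    text: str
    forward_count: int = 0  # how many times this message has been forwarded

FORWARD_LIMIT = 5

def can_forward(msg: Message) -> bool:
    # Forwarding limit: after enough hops the message can no longer be
    # forwarded, slowing viral spread without judging its content.
    return msg.forward_count < FORWARD_LIMIT

def retweet_prompt(link_clicked: bool) -> Optional[str]:
    # "Read before you share": the user can still share, but is asked
    # to pause first if they have not opened the article.
    if not link_clicked:
        return "Do you want to read the article before sharing?"
    return None  # no friction needed
```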
Q
William Moy: But that is not what I am suggesting you do. I am suggesting you say that this Parliament prefers interventions that are content-neutral or free speech-based, and that inform users and help them make up their own minds, to interventions that restrict what people can see and share.
Q
William Moy: I do not think it is any more challenging than most of the risk assessments, codes of practice and so on, but I am willing to spend as many hours as it takes to talk through it with you.
Order. I am afraid that we have come to the end of our allotted time for questions. On behalf of the Committee, I thank the witnesses for all their evidence.
Examination of Witnesses
Danny Stone MBE, Stephen Kinsella OBE and Liron Velleman gave evidence.
Q
Stephen Kinsella: Yes. We think they are extremely helpful. We welcome what we see in clause 14 and clause 57. There is thus a very clear right to be verified, and an ability to screen out interactions with unverified accounts, which is precisely what we asked for. The Committee will be aware that we have put forward some further proposals. I would really hesitate to describe them as amendments; I see them as shading-in areas—we are not trying to add anything. We think that it would be helpful, for instance, when someone is entitled to be verified, that verification status should also be visible to other users. We think that should be implicit, because it is meant to act as a signal to others as to whether someone is verified. We hope that would be visible, and we have suggested the addition of just a few words into clause 14 on that.
We think that the Bill would benefit from a further definition of what it means by “user identity verification”. We have put forward a proposal on that. It is such an important term that I think it would be helpful to have it as a defined term in clause 189. Finally, we have suggested a little bit more precision on the things that Ofcom should take into account when dealing with platforms. I have been a regulatory lawyer for nearly 40 years, and I know that regulators often benefit from having that sort of clarity. There is going to be negotiation between Ofcom and the platforms. If Ofcom can refer to a more detailed list of the factors it is supposed to take into account, I think that will speed the process up.
One of the reasons we particularly welcomed the structure of the Bill is that there is no wait for detailed codes of conduct, because these are duties that will apply immediately. I hope Ofcom is working on the guidance already; the guidance could come out pretty quickly, and then there would be the process of—maybe negotiating is the wrong word—to-and-fro with the platforms. I would be very reluctant to take too much on trust. I do not mean on trust from the Government; I mean on trust from the platforms—I saw the Minister look up quickly then. We have confidence in Government; it is the platforms we are a little bit wary of. I heard the frustration expressed on Tuesday.
indicated assent.
Stephen Kinsella: I think you said, “If platforms care about the users, why aren’t they already implementing this?” Another Member, who is not here today, asked, “Why do they have to be brought kicking and screaming?” Yet every time platforms were asked, we heard them say, “We will have to wait until we see the detail of—”, and then they would fill in whatever item was likely to come last in the process. So we welcome the approach. Our suggestions are very modest, and we are very happy to discuss them with you.
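For clarity on what the clause 14 duties amount to in engineering terms, here is a minimal sketch of the "screen out interactions with unverified accounts" option welcomed above. The data structures are assumptions; the Bill specifies the duty, not an implementation.

```python
# Sketch of the clause 14 "screen out unverified interactions" option.
# Data structures are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    verified: bool  # has completed the platform's identity verification

@dataclass
class UserPrefs:
    hide_unverified: bool  # the user's clause 14-style choice

def visible_replies(replies: list[tuple[Account, str]],
                    prefs: UserPrefs) -> list[tuple[Account, str]]:
    # If the user opts in, interactions from unverified accounts are
    # filtered out of their view; nothing is deleted platform-wide.
    if not prefs.hide_unverified:
        return replies
    return [(acct, text) for acct, text in replies if acct.verified]
```

Nothing is removed platform-wide: the filter only changes what the opted-in user sees, which is why campaigners describe it as a choice for users rather than a restriction on speakers.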
Q
Danny, we have had some fairly extensive discussions on the question of small but toxic platforms such as 4chan and BitChute—thank you for coming to the Department to discuss them. I heard your earlier response to the shadow Minister, but do you accept that those platforms should be subject to duties in the Bill in relation to content that is illegal and content that is already harmful to children?
Danny Stone: Yes, that is accurate. My position has always been that that is a good thing. The extent and the nature of the content that is harmful to adults on such platforms—you mentioned BitChute but there are plenty of others—require an additional level of regulatory burden and closer proximity to the regulator. Those platforms should have to account for it and say, “We are the platforms; we are happy that this harm is on our platform and”—as the Bill says—“we are promoting it.” You are right that it is captured to some degree; I think it could be captured further.
Q
“proportionate systems and processes…to ensure that…content of democratic”—
or journalistic—
“importance is taken into account”.
That is not an absolute protection; it is simply a requirement to take into account and perform a proportionate and reasonable balancing exercise. Is that not reasonable?
Liron Velleman: I have a couple of things to say on that. First, we and others in civil society have spent a decade trying to de-platform some of the most harmful actors from mainstream social media companies. What we do not want to see after the Bill becomes an Act are massive test cases where we do not know which way they will go and where it will be up to either the courts or social media companies to make their own decisions on how much regard they place in those exemptions at the same time as all the other clauses.
Secondly, one of our main concerns is the time it takes for some of that content to be removed. If there is an expedited process for complaints to be made, and journalistic content has to remain on the platform for a set time before the platform is able to take it down, the material could move far outside the realms of journalistic or democratically important content. Again, using the earlier examples, it does not take long for content such as a livestream of a terrorist attack to be up on the Sun or the Daily Mirror websites, and for lots of people to modify that video to bypass content moderation, after which it can be shared, used to recruit new terrorists, enable copycat attacks, and end up in the worst sewers of the internet. Any friction placed on platforms’ ability to take down some of that harm is definitely of particular concern to us.
Finally, as we heard on Tuesday, social media platforms—I am not sure I would agree with much of what they would say about the Bill, but I think this is true—do not really understand what they are meant to do with these clauses. Some of them are talking about flowcharts and whether this is a point-scoring system that says, “You get plus one for being a journalist, but minus two for being a racist.” I am not entirely sure that platforms will exercise the same level of regard. If, with some of the better-faith actors in the social media space, we have successfully taken down huge reams of the most harmful content and moved it away from where millions of people can see it to where only tens of thousands can see it, we do not want in any way the potential to open up the risk that hundreds of people could argue that they should be back on platforms when they are currently not there.
Q
Danny Stone: My take on this—I think people have misunderstood the Bill—is that it ultimately creates a regulated marketplace of harm. As a user, you get to determine how harmful a platform you wish to engage with—that is ultimately what it does. I do not think that it enforces content take-downs, except in relation to illegal material. It is about systems, and in some places, as you have heard today, it should be more about systems, introducing friction, risk-assessing and showing the extent to which harm is served up to people. That has its problems.
The only other thing on free speech is that we sometimes take too narrow a view of it. People are crowded out of spaces, particularly minority groups. If I, as a Jewish person, want to go on 4chan, it is highly unlikely that I will get a fair hearing there. I will be threatened or bullied out of that space. Free speech has to apply across the piece; it is not limited. We need to think about those overlapping harms when it comes to human rights—not just free speech but freedom from discrimination. We need to be thinking about free speech in its widest context.
Q
Stephen Kinsella: I agree entirely with what Danny was saying. Of course, we would say that our proposals have no implications for free speech. What we are talking about is the freedom not to be shouted at—that is really what we are introducing.
On disinformation, we did some research in the early days of our campaign that showed that a vast amount of the misinformation and disinformation around the 5G covid conspiracy was spread and amplified by anonymous or unverified accounts, so they play a disproportionate role in disseminating that. They also play a disproportionate role in disseminating abuse, and I think you may have a separate session with Kick It Out and the other football bodies. They have some very good research that shows the extent to which abusive language is from unverified or anonymous accounts. So, no, we do not have any free speech concerns at Clean up the Internet.
Q
Liron Velleman: We are satisfied that the Bill adequately protects freedom of speech. Our key view is that, if people are worried that it does not, beefing up the universal protections for freedom of speech should be the priority, instead of what we believe are potentially harmful exemptions in the Bill. We think that freedom of speech for all should be protected, and we very much agree with what Danny said—that the Bill should be about enhancing freedom of speech. There are so many communities that do not use social media platforms because of the harm that exists currently on platforms.
On children, the Bill should not be about limiting freedom of speech, but a large amount of our work covers the growth of youth radicalisation, particularly in the far right, which exists primarily online and which can then lead to offline consequences. You just have to look at the number of arrests of teenagers for far-right terrorism, and so much of that comes from the internet. Part of the Bill is about moderating online content, but it definitely serves to protect against some of the offline consequences of what exists on the platform. We would hope that if people are looking to strengthen freedom of speech, that is a universalist principle in the Bill, and not for some groups but not others.
Good. Thank you. I hope the Committee is reassured by those comments on the freedom of speech question.
Q
Danny Stone: I think that a media literacy strategy is really important. There is, for example, UCL data on the lack of knowledge of the word “antisemitism”: 68% of nearly 8,000 students were unfamiliar with the term’s meaning. Dr Tom Harrison has discussed cultivating cyber-phronesis—this was also in an article by Nicky Morgan in the “Red Box” column some time ago—which is a method of building practical knowledge over time to make the right decisions when presented with a moral challenge. We are not well geared up as a society—I am looking at my own kids—to educate young people about their interactions, about what it means when they are online in front of that box and about to type something, and about what might be received back. I have talked about some of the harms people might be directed to, even through Alexa, but some kind of wider strategy, which goes beyond what is already there from Ofcom—during the Joint Committee process, the Government said that Ofcom already has its media literacy requirements—and which, as you heard earlier, updates it to make it more fit for purpose for the modern age, would be very appropriate.
Stephen Kinsella: I echo that. We also think that that would be welcome. When we talk about media literacy, we often find ourselves with the platforms throwing all the obligation back on to the users. Frankly, that is one of the reasons why we put forward our proposal, because we think that verification is quite a strong signal. It can tell you quite a lot about how likely it is that what you are seeing or reading is going to be true if someone is willing to put their name to it. Seeing verification is just one contribution. We are really talking about trying to build or rebuild trust online, because that is what is seriously lacking. That is a system and design failure in the way that these platforms have been built and allowed to operate.
Q
Liron Velleman: If the Bill is seeking to make the UK the safest place to be on the internet, it seems to be the obvious place to put in something about media literacy. I completely agree with what Danny said earlier: we would also want to specifically ensure—although I am sure this already exists in some other parts of Ofcom and Government business—that there is much greater media literacy for adults as well as children. There are lots of conversations about how children understand use of the internet, but what we have seen, especially during the pandemic, is the proliferation of things like community Facebook groups, which used to be about bins and a fair that is going on this weekend, becoming about the worst excesses of harmful content. People have seen conspiracy theories, and that is where we have seen some of the big changes to how the far-right and other hateful groups operate, in terms of being able to use some of those platforms. That is because of a lack of media literacy not just among children, but among the adult population. I definitely would encourage that being in the Bill, as well as anywhere else, so that we can remove some of those harms.
Danny Stone: I think it will need further funding, beyond what has already been announced. That might put a smile on the faces of some Department for Education officials, who looked so sad during some of the consultation process—trying to ensure that there is proper funding. If you are going to roll this out across the country and make it fit for purpose, it is going to cost a lot of money.
Thank you. As there are no further questions from Members, I thank the witnesses for their evidence. That concludes this morning’s sitting.
Ordered, That further consideration be now adjourned. —(Steve Double.)
(2 years, 6 months ago)
Public Bill Committees
Q
Stephen Almond: Thank you very much. I will start by explaining the Digital Regulation Cooperation Forum. It is a voluntary, not statutory, forum that brings together ourselves, Ofcom, the Competition and Markets Authority and the Financial Conduct Authority—some of the regulators with the greatest interest in digital regulation—to make sure that we have a coherent approach to the regulation of digital services in the interests of the public and indeed the economy.
We are brought together through our common interest. We do not require a series of duties or statutory frameworks to make us co-operate, because the case for co-operation is very, very clear. We will deliver better outcomes by working together and by joining up where our powers align. I think that is what you are seeing in practice in some of the work we have done jointly—for example, around the implementation of the children’s code alongside Ofcom’s implementation of the video-sharing platform regime. A joined-up approach to questions about, for example, how you assure the age of children online is really important. That gives me real confidence in reassuring the Committee that the ICO, Ofcom and other digital regulators will be able to take a very joined-up approach to regulating in the context of the new online safety regime.
Q
Stephen Almond: In our view, the Bill strikes an appropriate balance between privacy and online safety. The duties in the Bill should leave service providers in no doubt that they must comply with data protection law, and that they should guard against unwarranted intrusion into privacy. In my discourse with firms, I am very clear that this is not a trade-off between online safety and privacy: it is both. We are firmly expecting that companies take that forward and work out how they are going to adopt both a “privacy by design” and a “safety by design” approach to the delivery of their services. They must deliver both.
Q
Stephen Almond: In brief, yes. We feel that the Bill has been designed to work alongside data protection law, for which we remain the statutory regulator, but with appropriate mechanisms for co-operation with the ICO—so, with this series of consultation duties where codes of practice or guidance that could be issued by Ofcom may have an impact on privacy. We think that is the best way of assuring regulatory coherence in this area.
Mr Almond, we are trying to get a pint into a half-pint pot doing this, so we are rushing a bit. If, when you leave the room, you have an “I wish I’d said that” moment, please feel free to put it in writing to us. We are indebted to you. Thank you very much indeed.
Examination of Witnesses
Sanjay Bhandari and Lynn Perry gave evidence.
Q
Lynn Perry: As a recommendation, we think that could only strengthen the protections of children.
Q
Lynn Perry: We would welcome provision to be able to bring particularly significant evidence of concern. That is certainly something that organisations, large charities in the sector and those responsible for representing the rights of children and young people would welcome. On some of these issues, we work in coalition to make representations on behalf of children and young people, as well as of parents and carers, who also raise some concerns. The ability to do that and to strengthen the response is something that would be welcomed.
Q
Sanjay Bhandari: Our beneficiaries are under-represented or minority communities in sports. I agree; I think that the Bill goes a substantial way to protecting them and to dealing with some of the issues that we saw most acutely after the Euro 2020 finals.
We have to look at the Bill in context. This is revolutionary legislation, which we are not seeing anywhere else in the world. We are going first. The basic sanctions framework and the 10% fines I have seen working in other areas—anti-trust in particular. In Europe, that has a long history. The definition of harm being in the manner of dissemination will pick up pile-ons and some forms of trolling that we see a lot of. Hate crime being designated as priority illegal content is a big one for us, because it puts the proactive duty on the platforms. That too will take away quite a lot of content, we think. The new threatening communications offence we have talked about will deal with rape and death threats. Often the focus is on, quite rightly, the experience of black professional footballers, but there are also other people who play, watch and work in the game, including our female pundits and our LGBT fan groups, who also get loads of this abuse online. The harm-based offence—communications sent to cause harm without reasonable excuse—will likely cover things such as malicious tagging and other forms of trolling. I have already talked about the identification, verification and anonymity provisions.
I think that the Bill will go a substantial way. I am still interested in what fits into that residual category of content harmful to adults, but rather than enter into an arid philosophical and theoretical debate, I will take the spirit of the Bill and try to tag it to real content.
Q
Sanjay Bhandari: I do not think it was adequate because we still see stuff coming through. They have the greatest power to stop it. One thing we are interested in is improving transparency reporting. I have asked them a number of times, “Someone does not become a troll overnight, in the same way that someone does not become a heroin addict overnight, or commit an extremist act of terrorism overnight. There is a pathway where people start off, and you have that data. Can I have it?” I have lost count of the number of times that I have asked for that data. Now I want Ofcom to ask them for it.
Q
Lynn Perry: We do. Barnardo’s really welcomes the Bill. We think it is a unique and once-in-a-generation opportunity to achieve some really long-term changes to protect children from a range of online harms. There are some areas in which the Bill could go further, which we have talked about today. The opportunity that we see here is to make the UK the safest place in the world for children to be online. There are some very important provisions that we welcome, not least on age verification, the ability to raise issues through super-complaints, which you have asked me about, and the accountability in various places throughout the Bill.
Q
Sanjay Bhandari: As I said earlier, there are no absolute rights. There is no absolute right to freedom of speech— I cannot shout “Fire!” here—and there is no absolute right to privacy; I cannot use my anonymity as a cloak for criminality. It is a question of drawing an appropriate balance. In my opinion, the Bill draws an appropriate balance between the right to freedom of speech and the right to privacy. I believe in both, but in the same way that I believe in motherhood and apple pie: of course I believe in them. It is really about the balancing exercise, and I think this is a sensible, pragmatic balancing exercise.
Ms Perry, I am very pleased that we were finally able to hear from you. Thank you very much indeed—you have been very patient. Thank you very much, Mr Bhandari. If either of you, as a result of what you have heard and been asked today, have any further thoughts that you wish to submit, please do so.
Examination of Witnesses
Eva Hartshorn-Sanders and Poppy Wood gave evidence.
Q
Eva Hartshorn-Sanders: Our “Hidden Hate” report was on DMs—direct messages—that were shared by the participants in the study. One in 15 of those broke the terms and conditions that Instagram had set out related to misogynist abuse—sexual abuse. That was in the wake of the World Cup, so after Instagram had done a big promotion about how great it was going to be in having policies on these issues going forward. We found that 90% of that content was not acted on when we reported it. This was not even them going out proactively to find the content and not doing anything with it; it was raised for their attention, using their systems.
Q
Eva Hartshorn-Sanders: That will depend on transparency, as Poppy mentioned. How much of that information can be shared? We are doing research at the moment on data that is shared personally, or is publicly available through the different tools that we have. So it is strengthening access to that data.
There is this information asymmetry that happens at the moment, where big tech is able to see patterns of abuse. In some cases, as in the misogyny report, you have situations where a woman might be subject to abuse from one person over and over again. The way that is treated in the EU is that Instagram will go back and look at the last 30 historically to see the pattern of abuse that exists. They are not applying that same type of rigorousness to other jurisdictions. So it is having access to it in the audits that are able to happen. Everyone should be safe online, so this should be a safety-by-design feature that the companies have.
Q
Eva Hartshorn-Sanders: I think it depends on who the researchers are. I personally do not have experience of it, so I cannot speak to that. On transparency, at the moment, the platforms generally choose what they share. They do not necessarily give you the data that you need. You can hear from my accent that I am originally from New Zealand. I know that in the wake of the Christchurch mosque terrorist attack, they were not prepared to provide the independent regulator with data on how many New Zealanders had seen the footage of the livestream, which had gone viral globally. That is inexcusable, really.
Q
Poppy Wood: On the point about access to data, I do not believe that the platforms go as far as they could, or even as far as they say they do. Meta have a tool called CrowdTangle, which they use to provide access to data for certain researchers who are privileged enough to have access. That does not even include comments on posts; it is only the posts themselves. The platforms pull the rug out all the time from under researchers who are investigating things that the platforms do not like. We saw that with Laura Edelson at New York University, who they just cut off—that is one of the most famous cases. I think it is quite egregious of Meta to say that they give lots of access to data.
We know from the revelations of whistleblowers that Meta do their own internal research, and when they do not like the results, they just bury it. They might give certain researchers access to data under certain provisions, but independent researchers who want to investigate a certain emergent harm or a certain problem are not being given the sort of access that they really need to get insights that move the needle. I am afraid that I just do not believe that at all.
The Bill could go much further. A provision on access to data in clause 136 states that Ofcom has two years to issue a report on whether researchers should get access to data. I think we know that researchers should have access to data, so I would, as a bare minimum, shorten the time that Ofcom has to do that report from two years to six months. You could turn that into a question of how to give researchers access to data rather than of whether they should get it. The Digital Services Act—the EU equivalent of the Bill—goes a bit further on access to data than our Bill. One result of that might be that researchers go to the EU to get their data because they can get it sooner.
Improving the Bill’s access to data provisions is a no-brainer. It is a good thing for the Government because we will see more stuff coming out of academia, and it is a good thing for the safety tech sector, because the more research is out there, the more tools can be built to tackle online harms. I certainly call on the Government to think about whether clause 136 could go further.
Q
Poppy Wood: It is not an easy problem to solve, for sure. What everybody is saying is that you do it in a content-neutral way, so that you are not talking about listing specific types of misinformation but about the risks that are built into your system and that need to be mitigated. This is a safety by design question. We have heard a lot about introducing more friction into the system, checking the virality threshold, and being more transparent. If you can get better on transparency, I think you will get better on misinformation.
If there is more of an obligation on the platforms to, first, do a broader risk assessment outside of the content that will be listed as priority content and, secondly, introduce some “harm reduction by design” mechanisms, through friction and stemming virality, that are not specific to certain types of misinformation, but are much more about safety by design features—if we can do that, we are part of the way there. You are not going to solve this problem straightaway, but you should have more friction in the system, be it through a code of practice or a duty somewhere to account for risk and build safer systems. It cannot be a content play; it has to be a systems play.
Thank you. I am sorry, but that brings us to the end of the time allotted to this session. Ladies, if either of you wishes to make a submission in writing in the light of what you have not answered or not been able to answer, please do. Ms Wood, Ms Hartshorn-Sanders, thank you very much indeed for joining us.
Examination of Witnesses
Owen Meredith and Matt Rogerson gave evidence.
Q
Owen Meredith: I do not think that would be allowable under the Bill, because of the distinction between a recognised news publisher publishing what we would all recognise as journalistic content, versus the journalistic content exemption. I think that is why they are treated differently.
Q
Owen Meredith: Yes. I think the issue is how that exemption will work in practice. I think that what the Government have said they are looking at and will bring forward does address how it will operate in practice.
Q
Owen Meredith: As I alluded to earlier, it is a real challenge to set out this legal definition in a country that believes, rightly, in the freedom of the press as a fourth pillar of democracy. It is a huge challenge to start with, and therefore we have to set out criteria that cover the vast majority of news publishers but do not end up with a backdoor licensing system for the press, which I think we are all keen to avoid. I think it meets that criterion.
On the so-called bad actors seeking to abuse that, I have listened to and read some of the evidence that you have had from others—not extensively, I must say, due to other commitments this week—and I think that it would be very hard for someone to meet all those criteria as set out in order to take advantage of this. I think that, as Matt has said, there will clearly be tests and challenges to that over time. It will rightly be challenged in court or go through the usual judicial process.
Matt Rogerson: It seems to me that the whole Bill will be an iterative process. The internet will not suddenly become safe when the Bill receives Royal Assent, so there will be this process whereby guidance and case law are developed, in terms of what a newspaper is, against the criteria. There are exemptions for news publishers in a whole range of other laws that are perfectly workable. I think that Ofcom is perfectly well equipped to create guidance that enables it to be perfectly workable.
Q
Matt Rogerson: Subject to the guidance developed by Ofcom, which we will be engaged in developing, I do think so. The other thing to bear in mind is that the platforms already have lists of trusted publishers. For example, Google has a list in relation to Google News—I think it has about 65,000 publishers—which it automates to push through Google News as trusted news publishers. Similarly, Facebook has a list of trusted news publishers that it uses as a signal for the Facebook newsfeed. So I do not buy the idea that you can’t automate the use of trusted news sources within those products.
Q
Owen Meredith: If I can speak to the point that directly relates to my members and those I represent, which is “Does it protect press freedom?”, which is perhaps an extension of your question, I would say that it is seeking to. Given the assurances you have given about the detailed amendments that you intend to bring forward—if those are correct, and I am very happy to write to the Committee and comment once we have seen the detail, if it would be helpful to do so—and everything I have heard about what you are intending to do, I believe it will. But I do not believe that the current draft properly and adequately protects press freedom, which is why, I think, you will be bringing forward amendments.
Q
Owen Meredith: Subject to seeing the drafting, but I believe the intention—yes.
Thank you. That is very helpful. Mr Rogerson?
Matt Rogerson: As we know, this is a world first: regulation of the internet, regulation of speech acts on the internet. From a news publisher perspective, I think all the principles are right in terms of what the Government are trying to do. In terms of free speech more broadly, a lot of it will come down to how the platforms implement the Bill in practice. Only time will tell in terms of the guidance that Ofcom develops and how the platforms implement that at vast scale. That is when we will see what impact the Bill actually has in practice.
Q
Matt Rogerson: Yes. With the development of the online platforms to the dominant position they are in today, that will be a big step forward. The only thing I would add is that, as well as this Bill, the other Bill that will make a massive difference when it comes through is the digital markets unit Bill. We need competition to Facebook so that consumers have a choice and so that they can decide which social network they want to be on, not just the one dominant social network that is available to them in this country.
I commend your ingenuity in levering an appeal for more digital competition into this discussion. Thank you.
Q
Tim Fassam: I believe that would be helpful. I think Ofcom is the right organisation to manage the relationship with the platforms, because it is going to be much broader than the topics we are talking about in our session, but we do think the FCA, Action Fraud and potentially the CMA should be able to direct, and be very clear with Ofcom, that action needs to be taken. Ofcom should have the ability to ask for things to be reviewed to see whether they break the rules.
The other area where we think action probably needs to be taken is where firms are under investigation, because the Bill assumes it is clear cut whether something is fraud, a scam, a breach of the regulations or not. In some circumstances, that can take six months or a year to establish through investigation. We believe that if, for example, the FCA feels that something is high risk, it should be able to ask Ofcom to suspend an advert, or a firm from advertising, pending an investigation to assess whether it is a breach of the regulation.
Rocio Concha: I agree that Ofcom is the right regulator, the main regulator, but it needs to work with the other regulators—with the FCA, ASA and CMA—to enforce the Bill effectively. There is another area. Basically, we need to make sure that Ofcom and all the regulators involved have the right resources. When the initial version of the Bill was published, Ofcom got additional resources to enable it to enforce the Bill. But the Bill has increased in scope, because now it includes fraud and fraudulent advertising. We need to make sure that Ofcom has the right resources to enforce the full Bill effectively. That is something that the Government really need to consider.
Martin Lewis: I was going to make exactly that point, but it has just been made brilliantly so I will not waste your time.
Q
I will start by agreeing with the point that Martin Lewis made a minute or two ago—that we cannot trust these companies to work on their own. Mr Lewis, I am not sure whether you have had a chance to go through clause 34, which we inserted into the Bill following your evidence to the Joint Committee last year. It imposes a duty on these companies to take steps and implement systems to
“prevent individuals from encountering content consisting of fraudulent advertisements”.
There is a clear duty to stop them from doing this, rather as you were asking a minute ago when you described the presentation. Does that strong requirement in clause 34, to stop individuals from encountering fraudulent advertisement content, meet the objective that you were asking for last year?
Martin Lewis: Let me start by saying that I am very grateful that you have put it in there and thankful that the Government have listened to our campaign. What I am about to say is not intended as criticism.
It is very difficult to know how this will work in practice. The issue is all about thresholds. How many scam adverts can we stomach? I still have, daily—even from the platform that I sued, never mind the others—tens of reports directly to me of scam adverts with my face on. Even though there is a promise that we will try to mitigate that, the companies are not doing it. We have to have a legitimate understanding that we are not going to have zero scam adverts on these platforms; unless they were to pre-vet, which I do not think they will, the way they operate means that will not happen.
I am not a lawyer but my concern is that the Bill should make it clear, and that any interpretation of the Bill from Ofcom should be clear, about exactly what threshold of scam adverts is acceptable—we know that they are going to happen—and what threshold is not acceptable. I do not have the expertise to answer your question; I have to rely on your expertise to do that. But I ask the Committee to think properly about what the threshold level should be.
What is and is not acceptable? What counts as “doing everything they can”? They are going to get big lawyers involved if you say there must be zero scam adverts—that is not going to happen. How many scam adverts are acceptable and how many are not? I am so sorry to throw that back as a question when I am a witness, but I do not have the expertise to answer. But that is my concern: I am not 100% convinced of the threshold level that you are setting.
Q
Tim Fassam: I think we are positive about the actions that have been taken regarding social media; our concern is that the clause is not applied to search and that it excludes paid-for ads that are also user-generated content—promoted tweets or promoted posts, for example. We would ensure that that applied to all paid-for adverts and that it was consistent between social media and search.
Q
Tim Fassam: You absolutely do, but to a weaker standard than in clause 34.
Q
Tim Fassam: Thank you.
Q
Mr Lewis, as you were named, I think you had better start.
Martin Lewis: Ten per cent. of the global revenue of a major social media or search player is a lot of money—it certainly would hit them in the pocket. I reiterate my previous point: it is all about the threshold at which that comes in and how rigidly Ofcom is enforcing it. There are very few organisations that have the resources, legally, to take on big institutions of state, regulators and Governments. If any does, it is the gigantic tech firms. Absolutely, 10% of global revenue sounds like a suitable wall to prevent them jumping over. That is the aim, because we want those companies to work for people; we don’t want them to do scam ads. We want them to work well, and we want them never to be fined because there is no reason to fine them.
The proof of the pudding will be in how robust Ofcom feels it can be, off the back of the Bill, taking those companies on. I go back to needing to understand how many scam ads you permit under the duty to prevent scam ads. It clearly is not zero—you are not going to tell me it is zero. So how many are allowed, what are the protocols that come into place and how quickly do they have to take the ads down? Ultimately, I think that is going to be a decision for Ofcom, but it is the level of stringency that you put on Ofcom in order for it to interpret how it takes that decision that is going to decide whether this works or not.
Rocio Concha: I completely agree with Martin. Ofcom needs to have the right resources in order to monitor how the platforms are doing that, and it needs to have the right powers. At the moment, Ofcom can ask for information in a number of areas, including fraud, but not advertising. We need to make sure that Ofcom can ask for that information so that it can monitor what the platforms are doing. We need to make sure that it has the right powers and the right resources to enforce the Bill effectively.
Tim Fassam: You would hope that 10% would certainly be a significant disincentive. Our focus would be on whether companies are contributing to compensating the victims of fraud and scams, and whether they have been brought into the architecture that is utilised to compensate victims of fraud and scams. That would be the right aim in terms of financial consequences for the firms.
Q
Secondly, clauses 140 and 141 contain a procedure for so-called super-complaints, where a body that represents users—it could be Which? or an organisation like it—is able to bring something almost like a class action or group complaint to Ofcom if it thinks a particular social media firm has systemic problems. Will those two clauses address the issue of complaints not being properly handled or, in some cases, not being dealt with at all?
Martin Lewis: Everything helps. I think the super-complaint point is really important. We must remember that many victims of scams are not so good at complaining and, by the nature of the crossover of individuals, there is a huge mental health issue at stake with scams. There is both the impact on people with mental health issues and the impact on people’s mental health of being scammed, which means that they may not be as robust and up for the fight or for complaining. As long as it works and applies to all the different categories that are repeated here, the super-complaint status is a good measure.
We absolutely need proper reporting lines. I urge you, Minister—I am not sure that this is in the Bill—to standardise this so that we can talk about what someone should do when they report: the same imagery, the same button. With that, people will know what to do. The more we can do that, the easier and better the system will be.
Q
“easy to access, easy to use (including by children) and transparent.”
The previous paragraph (b) states that the system
“provides for appropriate action to be taken by the provider of the service in response to complaints of a relevant kind”.
The Bill is saying that a complaints process must do those two things, because if it does not, Ofcom will be on the company’s back.
Martin Lewis: I absolutely support all of that. I am just pushing for that tiny bit more leadership, whether it is from you or Ofcom, that comes up with a standardised system with standardised imagery and placing, so that everybody knows that on the top left of the advert you have the button that you click to fill in a form to report it. The more we have that cross-platform and cross-search and cross-social media, the easier it will be for people. I am not sure it is a position for the Bill in itself, but Government leadership would work really well on that.
Tim Fassam: They are both welcome—the super-complaint and the new complaints process. We want to ensure that we have a system that looks not just at weight of number of complaints, but at the content. In particular, you may find on the super-complaint point that, for example, the firm that a fraudster is pretending to be is the organisation that has the best grasp of the issue, so do not forget about commercial organisations as well as consumer organisations when thinking about who is appropriate to make super-complaints.
Q
Tim Fassam: Absolutely. We suggested to Meta when we met them about 18 months ago that we could be a clearing house to identify for them whether they need to take something seriously, because our members have analysed it and consider it to represent a real risk.
Last word to Rocio Concha.
Rocio Concha: I completely agree about the super-complaint. We as a consumer organisation have super-complaint powers. As with other regulators, we would like to have it in this context as well. We have done many super-complaints representing consumers in particular areas with the regulators, so I think we need it in this Bill as well.
On reporting, I want to clarify something. At the moment, the Bill does not have a requirement for users to complain and report to platforms in relation to fraudulent advertising. It happens for priority illegal content, but our assessment of the Bill is that it is unclear whether it applies to fraudulent advertising. We probably do not have time to look at this now, but we sent you amendments to where we thought the Bill had weaknesses. We agree with you that users should have an easy and transparent way to report illegal or fraudulent advertising, and they should have an easy way to complain about it. At the moment, it is not clear that the Bill will require that for fraudulent advertising.
Q
Rocio Concha: My comment was in relation not to the super-complaints but to the requirements. We already sent you our comments with suggestions on how you can fix this in the Bill.
Ms Concha and Mr Fassam, thank you very much. Do please write in if you have further comments. Mr Lewis, we are deeply grateful to you. You can now go back to your day job and tell us whether we are going to be worse or better off as a result of the statement today—please don’t answer that now.
Martin Lewis: I am interviewing the Chancellor in 15 minutes.
Q
Frances Haugen: I think that shows a commendable level of chutzpah. Researchers have been trying to get really basic datasets out of Facebook for years. When I talk about a basic dataset, it is things as simple as, “Just show us the top 10,000 links that are distributed in any given week.” When you ask for information like that in a country like the United States, no one’s privacy is violated: every one of those links will have been viewed by hundreds of thousands, if not millions of people. Facebook will not give out even basic data like that, even though hundreds if not thousands of academics have begged for this data.
The idea that they have worked in close co-operation with researchers is a farce. The only way that they are going to give us even the most basic data that we need to keep ourselves safe is if it is mandated in the Bill. We need to not wait two years after the Bill passes—and remember, it does not even say that it will happen; Ofcom might say, “Oh, maybe not.” We need to take a page from the Digital Services Act and say, “On the day that the Bill passes, we get access to data,” or, at worst, “Within three months, we are going to figure out how to do it.” It needs to be not, “Should we do it?” but “How will we do it?”
Q
Frances Haugen: First, I left the company a year ago. Because we have no transparency with these companies, they do not have to publish their algorithms or the consequences of their algorithms, so who knows? Maybe they use astrology now to rank the content. We have no idea. All I know is that Meta definitely still uses signals—did users click on it, did they dwell on it, did they re-share it, or did they put a comment on it? There is no way it is not using those. It is very unlikely that they do not still use engagement in their ranking.
The secondary question is, do they optimise for engagement? Are they trying to maximise it? It is possible that they might interpret that and say, “No, we have multiple things we optimise for,” because that is true. They look at multiple metrics every single time they try to decide whether or not to shift things. But I think it is very likely that they are still trying to optimise for engagement, either as their top metric or as one of their top metrics.
Remember, Meta is not trying to optimise for engagement to keep you there as long as possible; it is optimising for engagement to get you and your friends to produce as much content as possible, because without content production, there can be no content consumption. So that is another thing. They might say, “No, we are optimising for content production, not engagement,” but that is one step off.
Q
Frances Haugen: I have a feeling that there is going to be a period of growing pains after the first time these risk assessments happen. I can almost entirely guarantee you that Facebook will try to give you very little. It will likely be a process of back and forth with the regulator, where you are going to have to have very specific standards for the level of transparency, because Facebook is always going to try to give you the least possible.
One of the things that I am actually quite scared about is that, in things like the Digital Services Act, penalties go up to 10% of global profits. Facebook as a company has something like 35% profit margins. One of the things I fear is that these reports may be so damning—we may have such strong opinions after we see the real, hard consequences of what they are doing—that Facebook might say, “This isn’t worth the risk. We’re just going to give you 10% of our profits.” That is one of the things I worry about: that they may just say, “Okay, now we’re 25% profitable instead of 35% profitable. We’re that ashamed.”
Q
Frances Haugen: Oh, good. That’s wonderful.
We had a case last year where Facebook—it was actually Facebook—failed to provide some information to the CMA in a takeover case, and it paid a £50 million fine rather than provide the information, hence the provision for personal criminal liability for failing to provide information that is now in this Bill.
My final question is a simple one. From your perspective, at the moment, when online tech companies are making product design decisions, what priority do they give to safety versus profit?
Frances Haugen: What I saw when I was at Facebook was that there was a culture that encouraged people to always have the most positive interpretation of things. If things are still the same as when I left—like I said, I do not know; I left last May—what I saw was that people routinely had to weigh little changes in growth versus changes in safety metrics, and unless they were major changes in safety metrics, they would continue to pursue growth. The only problem with a strategy like that is that those little deficits add up to very large harms over time, so we must have mandated transparency. The public have to have access to data, because unless Facebook has to factor in the public cost of the harm of its products, it is not going to prioritise those little incremental harms enough as they add up.
Ms Haugen, thank you very much indeed for joining us today, and thank you also for the candour with which you have answered our questions. We are very grateful to you indeed.
The Committee will meet again on Tuesday 7 June at 9.25 am for the start of its line-by-line consideration of the Bill. That session will be in Committee Room 14.
Ordered, That further consideration be now adjourned. —(Steve Double.)
(2 years, 6 months ago)
Public Bill Committees
Good morning, ladies and gentlemen. If anybody wishes to take their jacket off, they are at liberty to do so when I am in the Chair—my co-Chairman is joining us, and I am sure she will adopt the same procedure. I have a couple of preliminary announcements. Please make sure that all mobile phones are switched off. Tea and coffee are not allowed in the Committee, I am afraid. I think they used to be available outside in the corridor, but I do not know whether that is still the case.
We now start line-by-line consideration of the Bill. The selection and grouping list for the sitting is available on the table in the room for anybody who does not have it. It shows how the clauses and selected amendments have been grouped for debate. Grouped amendments are generally on the same subject or a similar issue.
Now for a slight tutorial to remind me and anybody else who is interested, including anybody who perhaps has not engaged in this arcane procedure before, of the proceedings. Each group has a lead amendment, and that amendment is moved first. The other grouped amendments may be moved later, but they are not necessarily voted on at that point, because some of them relate to matters that appear later in the Bill. Do not panic; that does not mean that we have forgotten them, but that we will vote on them—if anybody wants to press them to a Division—when they are reached in order in the Bill. However, if you are in any doubt and feel that we have missed something—occasionally I do; the Clerks never do—just let us know. I am relaxed about this, so if anybody wants to ask a question about anything that they do not understand, please interrupt and ask, and we will endeavour to confuse you further.
The Member who has put their name to the lead amendment, and only the lead amendment, is usually called to speak first. At the end of the debate, the Minister will wind up, and the mover of the lead amendment—that might be the Minister if it is a Government amendment, or it might be an Opposition Member—will indicate whether they want a vote on that amendment. We deal with that first, then we deal with everything else in the order in which it arises. I hope all that is clear, but as I said, if there are any questions, please interrupt and ask.
We start consideration of the Bill with clause 1, to which there are no amendments. Usually, the Minister would wind up at the end of each debate, but as there are no amendments to clause 1, the Minister has indicated that he would like to say a few words about the clause.
Clause 1
Overview of Act
Question proposed, That the clause stand part of the Bill.
Thank you, Sir Roger; it is a pleasure to serve under your chairmanship once again. It may be appropriate to take this opportunity to congratulate my right hon. Friend the Member for Basingstoke on her damehood in the Queen’s birthday honours, which was very well deserved indeed.
This simple clause provides a high-level overview of the different parts of the Bill and how they come together to form the legislation.
The Minister was completely out of order in congratulating the right hon. Lady, but I concur with him. I call the shadow Minister.
This part of the Bill deals with the definitions of services and which services would be exempt. I consider myself a millennial; most people my age or older are Facebook and Twitter users, and people a couple of years younger might use TikTok and other services. The way in which the online space is used by different generations, particularly by young people, changes rapidly. Given the definitions in the Bill, how does the Minister intend to keep pace with the changing ways in which people communicate? Most online games now allow interaction between users in different places, which was not the case a few years ago. Understanding how the Government intend the Bill to keep up with such changes is important. Will the Minister tell us about that?
Let me briefly speak to the purpose of these clauses and then respond to some of the points made in the debate.
As the shadow Minister, the hon. Member for Pontypridd, touched on, clauses 2 and 3 define some of the key terms in the Bill, including “user-to-user services” and “search services”—key definitions that the rest of the Bill builds on. As she said, schedule 1 and clause 4 contain specific exemptions where we believe the services concerned present very low risk of harm. Schedule 2 sets out exemptions relating to the new duties that apply to commercial providers of pornography. I thank the shadow Minister and my right hon. Friend the Member for Basingstoke for noting the fact that the Government have substantially expanded the scope of the Bill to now include commercial pornography, in response to widespread feedback from Members of Parliament across the House and the various Committees that scrutinised the Bill.
The shadow Minister is quite right to say that the number of platforms to which the Bill applies is very wide. [Interruption.] Bless you—or bless my hon. Friend the Member for North West Durham, I should say, Sir Roger, although he is near sanctified already. As I was saying, we are necessarily trying to protect UK users, and with many of these platforms not located in the UK, we are seeking to apply these duties to those companies as well as ones that are domestically located. When we come to discuss the enforcement powers, I hope the Committee will see that those powers are very powerful.
The shadow Minister, the hon. Member for Liverpool, Walton and others asked about future technologies and whether the Bill will accommodate technologies that we cannot even imagine today. The metaverse is a good example: it did not exist when the Bill was first contemplated and the White Paper produced. Actually, I think Snapchat did not exist when the White Paper that preceded the Bill was first conceived. For that reason, the Bill is tech agnostic. We do not talk about specific technologies; we talk about the duties that apply to companies and the harms they are obligated to prevent.
The whole Bill is tech agnostic because we as parliamentarians today cannot anticipate future developments. When those future developments arise, as they inevitably will, the duties under the Bill will apply to them as well. The metaverse is a good example, because even though it did not exist when the structure of the Bill was conceived, anything happening in the metaverse is none the less covered by the Bill. Anything that happens in the metaverse that is illegal or harmful to children, falls into the category of legal but harmful to adults, or indeed constitutes pornography will be covered because the Bill is tech agnostic. That is an extremely important point to make.
The hon. Member for Aberdeen North asked about gaming. Parents are concerned because lots of children, including quite young children, use games. My own son has started playing Minecraft even though he is very young. To the extent that those games have user-to-user features—for example, user-to-user messaging, particularly where those messages can be sent widely and publicly—those user-to-user components are within the scope of the Bill.
The hon. Member for Aberdeen North also asked about the App Store. I will respond quickly to her question now rather than later, to avoid leaving the Committee in a state of tingling anticipation and suspense. The App Store, or app stores generally, are not in the scope of the Bill, because they are not providing, for example, user-to-user services, and the functionality they provide to basically buy apps does not count as a search service. However, any app that is purchased in an app store, to the extent that it has either search functionality, user-to-user functionality or purveys or conveys pornography, is in scope. If an app that is sold on one of these app stores turns out to provide a service that breaks the terms of the Bill, that app will be subject to regulatory enforcement directly by Ofcom.
The hon. Members for Aberdeen North and for Liverpool, Walton touched on media literacy, noting that there has been a change to the Bill since the previous version. We will probably debate this later, so I will be brief. The Government published a media literacy strategy, backed by funding, to address this point. It was launched about a year ago. Ofcom also has existing statutory duties—arising under the Communications Act 2003, I believe. The critical change made since the previous draft of the Bill—it was made in December last year, I believe—is that Ofcom published an updated set of policy intentions around media literacy that went even further than we had previously intended. That is the landscape around media literacy.
I am sure we will discuss this topic a bit more as the Bill progresses.
I will make a few points on disinformation. The first is that, non-legislatively, the Government have a counter-disinformation unit, which sits within the Department for Digital, Culture, Media and Sport. It basically scans for disinformation incidents. For the past two years it has been primarily covid-focused, but in the last three or four months it has been primarily Russia/Ukraine-focused. When it identifies disinformation being spread on social media platforms, the unit works actively with the platforms to get it taken down. In the course of the Russia-Ukraine conflict, and as a result of the work of that unit, I have personally called in some of the platforms to complain about the stuff they have left up. I did not have a chance to make this point in the evidence session, but when the person from Twitter came to see us, I said that there was some content on Russian embassy Twitter accounts that, in my view, was blatant disinformation—denial of the atrocities that have been committed in Bucha. Twitter had allowed it to stay up, which I thought was wrong. Twitter often takes down such content, but in that example, wrongly and sadly, it did not. We are doing that work operationally.
Secondly, to the extent that disinformation can cause harm to an individual, which I suspect includes a lot of covid disinformation—drinking bleach is clearly not very good for people—that would fall under the terms of the legal but harmful provisions in the Bill.
Thirdly, when it comes to state-sponsored disinformation of the kind that we know Russia engages in on an industrial scale via the St Petersburg Internet Research Agency and elsewhere, the Home Office has introduced the National Security Bill—in fact, it had its Second Reading yesterday afternoon, when some of us were slightly distracted. One of the provisions in that Bill is a foreign interference offence. It is worth reading, because it is very widely drawn and it criminalises foreign interference, which includes disinformation. I suggest the Committee has a look at the foreign interference offence in the National Security Bill.
I am grateful for the Minister’s intervention in bringing in the platforms to discuss disinformation put out by hostile nation states. Does he accept that if Russia Today had put out some of that disinformation, the platforms would be unable to take such content down as a result of the journalistic exemption in the Bill?
We will no doubt discuss in due course clauses 15 and 50, which are the two that I think the shadow Minister alludes to. If a platform is exempt from the duties of the Bill owing to its qualification as a recognised news publisher under clause 50, it removes the obligation to act under the Bill, but it does not prevent action. Social media platforms can still choose to act. Also, it is not a totally straightforward matter to qualify as a regulated news publisher under clause 50. We saw the effect of sanctions: when Russia Today was sanctioned, it was removed from many platforms as a result of the sanctioning process. There are measures outside the Bill, such as sanctions, that can help to address the shocking disinformation that Russia Today was pumping out.
The last point I want to pick up on was rightly raised by my right hon. Friend the Member for Basingstoke and the hon. Member for Aberdeen North. It concerns child sexual exploitation and abuse images, and particularly the ability of platforms to scan for those. Many images are detected as a result of scanning messages, and many paedophiles or potential paedophiles are arrested as a result of that scanning. We saw a terrible situation a little while ago, when—for a limited period, owing to a misconception of privacy laws—Meta, or Facebook, temporarily suspended scanning in the European Union; as a result, loads of images that would otherwise have been intercepted were not.
I agree with the hon. Member for Aberdeen North that privacy concerns, including end-to-end encryption, should not trump the ability of organisations to scan for child sexual exploitation and abuse images. Speaking as a parent—I know she is, too—there is, frankly, nothing more important than protecting children from sexual exploitation and abuse. Some provisions in clause 103 speak to this point, and I am sure we will debate those in more detail when we come to that clause. I mention clause 103 to put down a marker as the place to go for the issue being raised. I trust that I have responded to the points raised in the debate, and I commend the clause to the Committee.
Question put and agreed to.
Clause 2 accordingly ordered to stand part of the Bill.
Clause 3 ordered to stand part of the Bill.
Schedules 1 and 2 agreed to.
Clause 4 ordered to stand part of the Bill.
Before we move on, we have raised the issue of the live feed. The audio will be online later today. There is a problem with the feed—it is reaching the broadcasters, but it is not being broadcast at the moment.
As we are not certain we can sort out the technicalities between now and this afternoon, the Committee will move to Committee Room 9 for this afternoon’s sitting to ensure that the live stream is available. Mr Double, if Mr Russell intends to be present—he may not; that is up to you—it would be helpful if you would let him know. Ms Blackman, if John Nicolson intends to be present this afternoon, would you please tell him that Committee Room 9 will be used?
It would normally be possible to leave papers and other bits and pieces in the room, because it is usually locked between the morning and afternoon sittings. Clearly, because we are moving rooms, you will all need to take your papers and laptops with you.
Clause 5
Overview of Part 3
Question proposed, That the clause stand part of the Bill.
I want to add my voice to the calls for ways to monitor the success or failures of this legislation. We are starting from a position of self-regulation where companies write the rules and regulate themselves. It is right that we are improving on that, but with it comes further concerns around the powers of the Secretary of State and the effectiveness of Ofcom. As the issues are fundamental to freedom of speech and expression, and to the protection of vulnerable and young people, will the Minister consider how we better monitor whether the legislation does what it says on the tin?
Clause 5 simply provides an overview of part 3 of the Bill. Several good points have been raised in the course of this discussion. I will defer replying to the substance of a number of them until we come to the relevant clause, but I will address two or three of them now.
The shadow Minister said that the Bill is complex, and she is right; it is 193-odd clauses long and a world-leading piece of legislation. The duties that we are imposing on social media firms and internet companies do not already exist; we have no precedent to build on. Most matters on which Parliament legislates have been considered and dealt with before, so we build on an existing body of legislation that has been built up over decades or, in some cases in the criminal law, over centuries. In this case, we are constructing a new legislative edifice from the ground up. Nothing precedes this piece of legislation—we are creating anew—and the task is necessarily complicated by virtue of its novelty. However, I think we have tried to frame the Bill in a way that keeps it as straightforward and as future-proof as possible.
The shadow Minister is right to point to the codes of practice as the source of practical guidance to the public and to social media firms on how the obligations operate in practice. We are working with Ofcom to ensure that those codes of practice are published as quickly as possible and, where possible, prepared in parallel with the passage of the legislation. That is one reason why we have provided £88 million of up-front funding to Ofcom in the current and next financial years: to give it the financial resources to do precisely that.
My officials have just confirmed that my recollection of the Ofcom evidence session on the morning of Tuesday 24 May was correct: Ofcom confirmed to the Committee that it will publish, before the summer, what it described as a “road map” providing details on the timing of when and how those codes of practice will be created. I am sure that Ofcom is listening to our proceedings and will hear the views of the Committee and of the Government. We would like those codes of practice to be prepared and introduced as quickly as possible, and we certainly provided Ofcom with the resources to do precisely that.
There was a question about the Scottish offences and, I suppose, about the Northern Irish offences as well—we do not want to forget any part of the United Kingdom.
We are in agreement on that. I can confirm that the Government have tabled amendments 116 to 126—the Committee will consider them in due course—to place equivalent Scottish offences, which the hon. Member for Aberdeen North asked about, in the Bill. We have done that in close consultation with the Scottish Government to ensure that the relevant Scottish offences equivalent to the England and Wales offences are inserted into the Bill. If the Scottish Parliament creates any new Scottish offences that should be inserted into the legislation, that can be done under schedule 7 by way of statutory instrument. I hope that answers the question.
The other question to which I will briefly reply was about parliamentary scrutiny. The Bill already contains a standard mechanism that provides for the Bill to be reviewed after a two to five-year period. That provision appears at the end of the Bill, as we would expect. Of course, there are the usual parliamentary mechanisms—Backbench Business debates, Westminster Hall debates and so on—as well as the DCMS Committee.
I heard the points about a standing Joint Committee. Obviously, I am mindful of the excellent prelegislative scrutiny work done by the previous Joint Committee of the Commons and the Lords. Equally, I am mindful that standing Joint Committees, outside the regular Select Committee structure, are unusual. The only two that spring immediately to mind are the Intelligence and Security Committee, which is established by statute, and the Joint Committee on Human Rights, chaired by the right hon. and learned Member for Camberwell and Peckham (Ms Harman), which is established by Standing Orders of the House. I am afraid I am not in a position to make a definitive statement about the Government’s position on this. It is of course always open to the House to regulate its own business. There is nothing I can say today from a Government point of view, but I know that hon. Members’ points have been heard by my colleagues in Government.
We have gone somewhat beyond the scope of clause 5. You have been extremely generous, Sir Roger, in allowing me to respond to such a wide range of points. I commend clause 5 to the Committee.
Question put and agreed to.
Clause 5 accordingly ordered to stand part of the Bill.
Clause 6
Providers of user-to-user services: duties of care
Before we proceed, perhaps this is the moment to explain what should happen and what is probably going to happen. Ordinarily, a clause is taken with amendments. This Chairman takes a fairly relaxed view of stand part debates. Sometimes it is convenient to have a very broad-ranging debate on the first group of amendments because it covers matters relating to the whole clause. The Chairman would then normally say, “Well, you’ve already had your stand part debate, so I’m not going to allow a further stand part debate.” It is up to hon. Members to decide whether to confine themselves to the amendment under discussion and then have a further stand part debate, or whether to go free range, in which case the Chairman would almost certainly say, “You can’t have a stand part debate as well. You can’t have two bites of the cherry.”
This is slightly more complex. It is a very complex Bill, and I think I am right in saying that it is the first time in my experience that we are taking other clause stand parts as part of the groups of amendments, because there is an enormous amount of crossover between the clauses. That will make it, for all of us, slightly harder to regulate. It is for that reason—the Minister was kind enough to say that I was reasonably generous in allowing a broad-ranging debate—that I think we are going to have to do that with this group.
I, and I am sure Ms Rees, will not wish to be draconian in seeking to call Members to order if you stray slightly outside the boundaries of a particular amendment. However, we have to get on with this, so please try not to be repetitive if you can possibly avoid it, although I accept that there may well be some cases where it is necessary.
That is a huge concern for us. As was brought up in our evidence sessions with Ofcom, it is recruiting, effectively, a fundraising officer for the regulator. That throws into question the potential longevity of the regulator’s funding and whether it is resourced effectively to properly scrutinise and regulate the online platforms. If that long-term resource is not available, how can the regulator effectively scrutinise and bring enforcement to bear against companies for enabling illegal activity?
Just to reassure the shadow Minister and her hon. Friend the Member for Liverpool, Walton, the Bill confers powers on Ofcom to levy fees and charges on the sector that it is regulating—so, on social media firms—to recoup its costs. We will debate that in due course—I think it is in clause 71, but that power is in the Bill.
I am grateful to the Minister for that clarification and I look forward to debating that further as the Bill progresses.
Returning to the senior managers and certification regime in the financial services industry: under that regime, senior managers must be preapproved by the regulator, have their responsibilities set out in a statement of responsibilities and be subject to enhanced conduct standards. Those in banks are also subject to regulatory requirements on their remuneration. Again, it baffles me that we are not asking the same for child safety from online platforms and companies.
The money laundering regulations also use the threat of criminal offences to drive culture change. Individuals can be culpable for failures of process, as well as for intent. I therefore hope that the Minister will carefully consider the need for the same to apply to our online space, to keep children safe.
Amendment 70 is a technical amendment that we will be discussing later on in the Bill. However, I am happy to move it in the name of the official Opposition.
I congratulate my own Front Bench on this important amendment. I would like the Minister to respond to the issue of transparency and the reason why only the regulator would have sight of these risk assessments. It is fundamental that civil society groups and academics have access to them. Her Majesty’s Revenue and Customs is an example of where that works very well: HMRC publishes a lot of its data, which is then used by academics and researchers to produce reports and documents that feed back into the policymaking process and HMRC’s work. It would be a missed opportunity if the information and data gathered by Ofcom were not widely available for public scrutiny.
I would reinforce the earlier points about accountability. There are too many examples—whether in the financial crash or the collapse of companies such as Carillion—where accountability was never there. Without this amendment and the ability to hold individuals to account for the failures of companies that are faceless to many people, the legislation risks being absolutely impotent.
Finally, I know that we will get back to the issue of funding in a later clause but I hope that the Minister can reassure the Committee that funding for the enforcement of these regulations will be properly considered.
Let me start by speaking to clauses 6, 7, 21 and 22 stand part. I will then address the amendments moved by the shadow Minister.
Order. I apologise for interrupting, Minister, but the stand part debates on clauses 7, 21 and 22 are part of the next grouping, not this one. I am fairly relaxed about it, but just be aware that you cannot have two debates on this.
The grouping sheet I have here suggests that clause 7 stand part and clauses 21 and 22 stand part are in this grouping, but if I have misunderstood—
No, there are two groups. Let me clarify this for everyone, because it is not as straightforward as it normally is. At the moment we are dealing with amendments 69 and 70. The next grouping, underneath this one on your selection paper, is the clause stand part debates—which is peculiar, as effectively we are having the stand part debate on clause 6 now. For the convenience of the Committee, and if the shadow Minister is happy, I am relaxed about taking all this together.
The hon. Lady can be called again. The Minister is not winding up at this point.
In the interests of simplicity, I will stick to the selection list and adapt my notes accordingly to confine my comments to amendments 69 and 70, and then we will come to the stand part debates in due course. I am happy to comply, Sir Roger.
Speaking of compliance, that brings us to the topic of amendments 69 and 70. It is worth reminding ourselves of the current enforcement provisions in the Bill, which are pretty strong. I can reassure the hon. Member for Liverpool, Walton that the enforcement powers here are far from impotent. They are very potent. As the shadow Minister acknowledged in her remarks, we are for the first time ever introducing senior management liability, which relates to non-compliance with information notices and offences of falsifying, encrypting or destroying information. It will be punishable by a prison sentence of up to two years. That is critical, because without that information, Ofcom is unable to enforce.
We have had examples of large social media firms withholding information and simply paying a large fine. There was a Competition and Markets Authority case a year or two ago where a large social media firm did not provide information repeatedly requested over an extended period and ended up paying a £50 million fine rather than providing the information. Let me put on record now that that behaviour is completely unacceptable. We condemn it unreservedly. It is because we do not want to see that happen again that there will be senior manager criminal liability in relation to providing information, with up to two years in prison.
In addition, for the other duties in the Bill there are penalties that Ofcom can apply for non-compliance. First, there are fines of up to 10% of global revenue. For the very big American social media firms, the UK market is somewhere just below 10% of their global revenue, so 10% of their global revenue is getting on for 100% of their UK revenue. That is a very significant financial penalty, running in some cases into billions of pounds.
In extreme circumstances—if those measures are not enough to ensure compliance—there are what amount to denial of service powers in the Bill, where essentially Ofcom can require internet service providers and others, such as payment providers, to disconnect the companies in the UK so that they cannot operate here. Again, that is a very substantial measure. I hope the hon. Member for Liverpool, Walton would agree that those measures, which are in the Bill already, are all extremely potent.
The question prompted by the amendment is whether we should go further. I have considered that issue as we have been thinking about updating the Bill—as hon. Members can imagine, it is a question that I have been debating internally. The question is whether we should go further and say there is personal criminal liability for breaches of the duties that go beyond information provision. There are arguments in favour, which we have heard, but there are arguments against as well. One is that if we introduce criminal liability for those other duties, that introduces a risk that the social media firms, fearing criminal prosecution, will become over-zealous and just take everything down because they are concerned about being personally liable. That could end up having a chilling effect on content available online and goes beyond what we in Parliament would intend.
In a moment.
For those reasons, I think we have drawn the line in the right place. There is personal criminal liability for information provision, with fines of up to 10% of global revenue and service disruption—unplugging powers—as well. Having thought about it quite carefully, I think we have struck the balance in the right place. We do not want to deter people from offering services in the UK; if they feared that they might too readily go to prison, it might deter them from locating here. I fully recognise that there is a balance to strike, and I feel that it is being struck in the right place.
I will go on to comment on a couple of examples we heard about Carillion and the financial crisis, but before I do so, I will give way as promised.
I appreciate that the Minister says he has been swithering on this point—he has been trying to work out the correct place to draw the line. Given that we do not yet have a commitment for a standing committee—again, that is potentially being considered—we do not know how the legislation is going to work. Will the Minister, rather than accepting the amendment, give consideration to including the ability to make changes via secondary legislation so that there is individual criminal liability for different breaches? That would allow him the flexibility in the future, if the regime is not working appropriately, to add through secondary legislation individual criminal liability for breaches beyond those that are currently covered.
I have not heard that idea suggested. I will think about it. I do not want to respond off the cuff, but I will give consideration to the proposal. Henry VIII powers, which are essentially what the hon. Lady is describing—an ability through secondary legislation effectively to change primary legislation—are obviously viewed askance by some colleagues if too wide in scope. We do use them, of course, but normally in relatively limited circumstances. Creating a brand new criminal offence via what amounts to a Henry VIII power would be quite a wide application of the power, but it is an idea that I am perfectly happy to go away and reflect on. I thank her for mentioning the idea.
A couple of examples were given about companies that have failed in the past. Carillion was not a financial services company and there was no regulatory oversight of the company at all. In relation to financial services regulation, despite the much stricter regulation that existed in the run-up to the 2008 financial crisis, that crisis occurred none the less. [Interruption.] We were not in government at the time. We should be clear-eyed about the limits of what regulation alone can deliver, but that does not deter us from taking the steps we are taking here, which I think are extremely potent, for all the reasons that I mentioned and will not repeat.
Question put, That the amendment be made.
On clause 7, as I have previously mentioned, we were all pleased to see the Government bring in more provisions to tackle pornographic content online, much of which is easily accessible and can cause harm to those viewing it and potentially to those involved in it.
As we have previously outlined, a statutory duty of care for social platforms online has been missing for far too long, but we made it clear on Second Reading that such a duty will only be effective if we consider the systems, business models and design choices behind how platforms operate. For too long, platforms have been abuse-enabling environments, but it does not have to be this way. The amendments that we will shortly consider are largely focused on transparency, as we all know that the duties of care will only be effective if platforms are compelled to proactively supply their assessments to Ofcom.
On clause 21, the duty of care approach is one that the Opposition support and it is fundamentally right that search services are subject to duties including illegal content risk assessments, illegal content assessments more widely, content reporting, complaints procedures, duties about freedom of expression and privacy, and duties around record keeping. Labour has long held the view that search services, while not direct hosts of potentially damaging content, should have responsibilities that see them put a duty of care towards users first, as we heard in our evidence sessions from HOPE not hate and the Antisemitism Policy Trust.
It is also welcome that the Government have committed to introducing specific measures for regulated search services that are likely to be accessed by children. However, those measures can and must go further, so we will be putting forward some important amendments as we proceed.
Labour does not oppose clause 22, either, but I would like to raise some important points with the Minister. We do not want to be in a position whereby those designing, operating and using a search engine in the United Kingdom are subject to a second-rate internet experience. We also do not want to be in a position where we are forcing search services to choose what is an appropriate design for people in the UK. It would be worrying indeed if our online experience vastly differed from that of, let us say, our friends in the European Union. How exactly will clause 22 ensure parity? I would be grateful if the Minister could confirm that before we proceed.
The shadow Minister has already touched on the effect of these clauses: clause 6 sets out duties applying to user-to-user services in a proportionate and risk-based way; clause 7 sets out the scope of the various duties of care; and clauses 21 and 22 do the same in relation to search services.
In response to the point about whether the duties on search will end up providing a second-rate service in the United Kingdom, I do not think that they will. The duties have been designed to be proportionate and reasonable. Throughout the Bill, Members will see that there are separate duties for search and for user-to-user services. That is reflected in the symmetry—which appears elsewhere, too—of clauses 6 and 7, and clauses 21 and 22. We have done that because we recognise that search is different. It indexes the internet; it does not provide a user-to-user service. We have tried to structure these duties in a way that is reasonable and proportionate, and that will not adversely impair the experience of people in the UK.
I believe that we are ahead of the European Union in bringing forward this legislation and debating it in detail, but the European Union is working on its Digital Services Act. I am confident that there will be no disadvantage to people conducting searches in United Kingdom territory.
Question put and agreed to.
Clause 6 accordingly ordered to stand part of the Bill.
Clause 7 ordered to stand part of the Bill.
Clause 8
Illegal content risk assessment duties
I beg to move amendment 10, in clause 8, page 6, line 33, at end insert—
“(4A) A duty to publish the illegal content risk assessment and proactively supply this to OFCOM.”
This amendment creates a duty to publish an illegal content risk assessment and supply it to Ofcom.
Clause 8 sets out the risk assessment duties for illegal content, as already discussed, that apply to user-to-user services. Ofcom will issue guidance on how companies can undertake those assessments. To comply with those duties, companies will need to take proportionate measures to mitigate the risks identified in those assessments. The clause lists a number of potential risk factors that providers must assess, including how likely it is that users will encounter illegal content, as defined later in the Bill,
“by means of the service”.
That phrase is quite important, and I will come back to it when discussing some of the amendments, because it does not necessarily mean just on the service itself; it also covers—this is the cross-platform point—other sites that users might reach via the service. That phrase is important in the context of some of the reasonable queries about cross-platform risks.
Moving on, companies will also need to consider how the design and operation of their service may reduce or increase the risks identified. Under schedule 3, which we will vote on, or at least consider, later on, companies will have three months to carry out risk assessments, which must be kept up to date so that fresh risks that may arise from time to time can be accommodated. Therefore, if changes are made to the service, the risks can be considered on an ongoing basis.
Amendment 10 relates to the broader question that the hon. Member for Liverpool, Walton posed about transparency. The Bill already contains obligations to publish summary risk assessments on legal but harmful content. That refers to some of the potentially contentious or ambiguous types of content for which public risk assessments would be helpful. The companies are also required to make those risk assessments available to Ofcom on request. That raises a couple of questions, as both the hon. Member for Liverpool, Walton and some of the amendments highlighted. Should companies be required to proactively serve up their risk assessments to Ofcom, rather than wait to be asked? Also, should those risk assessments all be published—probably online?
In considering those two questions, there are a couple of things to think about. The first is Ofcom’s capacity. As we have discussed, 25,000 services are in scope. If all those services proactively delivered a copy of their risk assessment, even if they are very low risk and of no concern to Ofcom or, indeed, any of us, they would be in danger of overwhelming Ofcom. The approach contemplated in the Bill is that, where Ofcom has a concern or the platform is risk assessed as being significant—to be clear, that would apply to all the big platforms—it will proactively make a request, which the platform will be duty bound to meet. If the platform does not do that, the senior manager liability and the two years in prison that we discussed earlier will apply.
The Minister mentioned earlier that Ofcom would be adequately resourced and funded to cope with the regulatory duty set out in the Bill. If Ofcom is not able to receive risk assessments for all the platforms potentially within scope, even if those platforms are not deemed to be high risk, does that not call into question whether Ofcom has the resource needed to actively carry out its duties in relation to the Bill?
Of course, Ofcom is able to request any of them if it wants to—if it feels that to be necessary—but receiving 25,000 risk assessments, including from tiny companies that basically pose pretty much no risk at all and hardly anyone uses, would, I think, be an unreasonable and disproportionate requirement to impose. I do not think it is a question of the resources being inadequate; it is a question of being proportionate and reasonable.
The point I was trying to get the Minister to think about was the action of companies in going through the process of these assessments and then making that information publicly available to civil society groups; it is about transparency. It is what the sector needs; it is the way we will find and root out the problems, and it is a great missed opportunity in this Bill.
To reassure the hon. Member on the point about doing the risk assessment, all the companies have to do the risk assessment. That obligation is there. Ofcom can request any risk assessment. I would expect, and I think Parliament would expect, it to request risk assessments either where it is concerned about risk or where the platform is particularly large and has a very high reach—I am thinking of Facebook and companies like that. But hon. Members are talking here about requiring Ofcom to receive and, one therefore assumes, to consider, because what is the point of receiving an assessment unless it considers it? Receiving it and just putting it on a shelf without looking at it would be pointless, obviously. Requiring Ofcom to receive and look at potentially 25,000 risk assessments strikes me as a disproportionate burden. We should be concentrating Ofcom’s resources—and it should concentrate its activity, I submit—on those companies that pose a significant risk and those companies that have a very high reach and large numbers of users. I suggest that, if we imposed an obligation on it to receive and to consider risk assessments for tiny companies that pose no risk, that would not be the best use of its resources, and it would take away resources that could otherwise be used on those companies that do pose risk and that have larger numbers of users.
Just to be clear, we are saying that the only reason why we should not be encouraging the companies to do the risk assessment is that Ofcom might not be able to cope with dealing with all the risk assessments. But surely that is not a reason not to do it. The risk assessment is a fundamental part of this legislation. We have to be clear that there is no point in the companies having those risk assessments if they are not visible and transparent.
All the companies have to do the risk assessment, for example for the “illegal” duties, where they are required to by the Bill. For the “illegal” duties, that is all of them; they have to do those risk assessments. The question is whether they have to send them to Ofcom—all of them—even if they are very low risk or have very low user numbers, and whether Ofcom, by implication, then has to consider them, because it would be pointless to require them to be sent if they were not then looked at. We want to ensure that Ofcom’s resources are pointed at the areas where the risks arise. Ofcom can request any of these. If Ofcom is concerned—even a bit concerned—it can request them.
Hon. Members are then making a slightly adjacent point about transparency—about whether the risk assessments should be made, essentially, publicly available. In relation to comprehensive public disclosure, there are legitimate questions about public disclosure and about getting to the heart of what is going on in these companies in the way in which Frances Haugen’s whistleblower disclosures did. But we also need to be mindful of what we might call malign actors—people who are trying to circumvent the provisions of the Bill—in relation to some of the “illegal” provisions, for example. We do not want to give them so much information that they know how they can circumvent the rules. Again, there is a balance to strike between ensuring that the rules are properly enforced and having such a high level of disclosure that people seeking to circumvent the rules are able to work out how to do so.
If the rules are so bad that people can circumvent them, they are not good enough anyway and they need to be updated, but I have a specific question on this. The Minister says that Ofcom will be taking in the biggest risk assessments, looking at them and ensuring that they are adequate. Will he please give consideration to asking Ofcom to publish the risk assessments from the very biggest platforms? Then they will all be in one place. They will be easy for people to find and people will not have to rake about in the bottom sections of a website. And it will apply only in the case of the very biggest, most at risk platforms, which should be regularly updating their risk assessments and changing their processes on a very regular basis in order to ensure that people are kept safe.
Order. I am sorry to interrupt the Minister, but I now have to adjourn the sitting until this afternoon, when the Committee will meet again, in Room 9 and with Ms Rees in the Chair.
I remind the Committee that with this we are discussing the following:
Amendment 14, in clause 8, page 6, line 33, at end insert—
“(4A) A duty for the illegal content risk assessment to be approved by either—
(a) the board of the entity; or, if the organisation does not have a board structure,
(b) a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the illegal content risk assessment duties, and reports directly into the most senior employee of the entity.”
This amendment seeks to ensure that regulated companies’ boards or senior staff have responsibility for illegal content risk assessments.
Amendment 25, in clause 8, page 7, line 3, after the third “the” insert “production,”.
This amendment requires the risk assessment to take into account the risk of the production of illegal content, as well as the risk of its presence and dissemination.
Amendment 19, in clause 8, page 7, line 14, at end insert—
“(h) how the service may be used in conjunction with other regulated user-to-user services such that it may—
(i) enable users to encounter illegal content on other regulated user-to-user services, and
(ii) constitute part of a pathway to harm to individuals who are users of the service, in particular in relation to CSEA content.”
This amendment would incorporate into the duties a requirement to consider cross-platform risk.
Clause stand part.
Amendment 20, in clause 9, page 7, line 30, at end insert—
“, including by being directed while on the service towards priority illegal content hosted by a different service;”.
This amendment aims to include within companies’ safety duties a duty to consider cross-platform risk.
Amendment 26, in clause 9, page 7, line 30, at end insert—
“(aa) prevent the production of illegal content by means of the service;”.
This amendment incorporates a requirement to prevent the production of illegal content within the safety duties.
Amendment 18, in clause 9, page 7, line 35, at end insert—
“(d) minimise the presence of content which reasonably foreseeably facilitates or aids the discovery or dissemination of priority illegal content, including CSEA content.”
This amendment brings measures to minimise content that may facilitate or aid the discovery of priority illegal content within the scope of the duty to maintain proportionate systems and processes.
Amendment 21, in clause 9, page 7, line 35, at end insert—
“(3A) A duty to collaborate with other companies to take reasonable and proportionate measures to prevent the means by which their services can be used in conjunction with other services to facilitate the encountering or dissemination of priority illegal content, including CSEA content,”.
This amendment creates a duty to collaborate in cases where there is potential cross-platform risk in relation to priority illegal content and CSEA content.
Clause 9 stand part.
Amendment 30, in clause 23, page 23, line 24, after “facilitating” insert—
“the production of illegal content and”.
This amendment requires the illegal content risk assessment to consider the production of illegal content.
Clause 23 stand part.
Amendment 31, in clause 24, page 24, line 2, after “individuals” insert “producing or”.
This amendment expands the safety duty to include the need to minimise the risk of individuals producing certain types of search content.
Clause 24 stand part.
It is a great pleasure to serve under your chairmanship, Ms Rees, and I am glad that this afternoon’s Committee proceedings are being broadcast to the world.
Before we adjourned this morning, I was in the process of saying that one of the challenges with publishing the full risk assessment, even for larger companies, is that the vulnerabilities in their systems, or the potential opportunities to exploit those systems for criminal purposes, would then be publicly exposed in a way that may not serve the public interest, and that is a reason for not requiring complete disclosure of everything.
However, I draw the Committee’s attention to the existing transparency provisions in clause 64. We will come on to them later, but I want to mention them now, given that they are relevant to amendment 10. The transparency duties state that, once a year, Ofcom must serve notice on the larger companies—those in categories 1, 2A and 2B—requiring them to produce a transparency report. That is not a power for Ofcom—it is a requirement. Clause 64(1) states that Ofcom
“must give every provider…a notice which requires the provider to produce…(a ‘transparency report’).”
The content of the transparency report is specified by Ofcom, as set out in subsection (3). As Members will see, Ofcom has wide powers to specify what must be included in the report. On page 186, schedule 8—I know that we will debate it later, but it is relevant to the amendment—sets out the scope of what Ofcom can require. It is an extremely long list that covers everything we would wish to see. Paragraph 1, for instance, states:
“The incidence of illegal content, content that is harmful to children and priority content that is harmful to adults on a service.”
Therefore, the transparency reporting requirement—it is not an option but a requirement—in clause 64 addresses the transparency point that was raised earlier.
Amendment 14 would require a provider’s board members or senior manager to take responsibility for the illegal content risk assessment. We agree with the Opposition’s point. Indeed, we agree with what the Opposition are trying to achieve in a lot of their amendments.
There is a “but” coming. We think that, in all cases apart from one, the Bill as drafted already addresses the matter. In the case of amendment 14, the risk assessment duties as drafted already explicitly require companies to consider how their governance structures may affect the risk of harm to users arising from illegal content. Ofcom will provide guidance to companies about how they can comply with those duties, which is very likely to include measures relating to senior-level engagement. In addition, Ofcom can issue confirmation decisions requiring companies to take specific steps to come into compliance. To put that simply, if Ofcom thinks that there is inadequate engagement by senior managers in relation to the risk assessment duties, it can require—it has the power to compel—a change of behaviour by the company.
I come now to clause 9—I think this group includes clause 9 stand part as well. The shadow Minister has touched on this. Clause 9 contains safety duties in relation to—
I think the group includes clause 9 stand part, but I will of course be guided by you, Ms Rees.
Very well; we will debate clause 9 separately. In that case, I will move on to amendments 19 and 20, which seek to address cross-platform risk. Again, we completely agree with the Opposition that cross-platform risk is a critical issue. We heard about it in evidence. It definitely needs to be addressed and covered by the Bill. We believe that it is covered by the Bill, and our legal advice is that it is covered by the Bill, because in clause 8 as drafted—[Interruption.] Bless you—or rather, I bless the shadow Minister, following Sir Roger’s guidance earlier, lest I inadvertently bless the wrong person.
Clause 8 already includes the phrase to which I alluded previously. I am talking about the requirement that platforms risk-assess illegal content that might be encountered
“by means of the service”.
That is a critical phrase, because it means not just on that service itself; it also means, potentially, via that service if, for example, that service directs users onward to illegal content on another site. By virtue of the words,
“by means of the service”,
appearing in clause 8 as drafted, the cross-platform risk that the Opposition and witnesses have rightly referred to is covered. Of course, Ofcom will set out further steps in the code of practice as well.
I was listening very closely to what the Minister was saying and I was hoping that he might be able to comment on some of the evidence that was given, particularly by Professor Lorna Woods, who talked about the importance of risk assessments being about systems, not content. Would the Minister pick up on that point? He was touching on it in his comments, and I was not sure whether this was the appropriate point in the Bill at which to bring it up.
I thank my right hon. Friend for raising that. The risk assessments and, indeed, the duties arising under this Bill all apply to systems and processes—setting up systems and processes that are designed to protect people and to prevent harmful and illegal content from being encountered. We cannot specify in legislation every type of harmful content that might be encountered. This is about systems and processes. We heard the Chairman of the Joint Committee on the draft Online Safety Bill, our hon. Friend the Member for Folkestone and Hythe (Damian Collins), confirm to the House on Second Reading his belief—his accurate belief—that the Bill takes a systems-and-processes approach. We heard some witnesses saying that as well. The whole point of this Bill is that it is tech-agnostic—to future-proof it, as hon. Members mentioned this morning—and it is based on systems and processes. That is the core architecture of the legislation that we are debating.
Amendments 25 and 26 seek to ensure that user-to-user services assess and mitigate the risk of illegal content being produced via functions of the service. That is covered, as it should be—the Opposition are quite right to raise the point—by the illegal content risk assessment and safety duties in clauses 8 and 9. Specifically, clause 8(5)(d), on page 7 of the Bill—goodness, we are only on page 7 and we have been going for over half a day already—requires services to risk-assess functionalities of their service being used to facilitate the presence of illegal content. I stress the word “presence” in clause 8(5)(d). Where illegal content is produced by a functionality of the service—for example, by being livestreamed—that content will be present on the service and companies must mitigate that risk. The objective that the Opposition are seeking to achieve, and with which we completely agree, is covered in clause 8(5)(d) by the word “presence”. If the content is present, it is covered by that section.
Specifically on that, I understand the point the hon. Gentleman is making and appreciate his clarification. However, on something such as Snapchat, if somebody takes a photo, it is sent to somebody else, then disappears immediately, because that is what Snapchat does—the photo is no longer present. It has been produced and created there, but it is not present on the platform. Can the Minister consider whether the Bill adequately covers all the instances he hopes are covered?
The hon. Lady raises an interesting point about time. However, clause 8(5)(d) uses the wording,
“the level of risk of functionalities of the service facilitating the presence or dissemination of illegal content”
and so on. That presence can happen at any time, even fleetingly, as with Snapchat. Even when the image self-deletes after a certain period—so I am told, I have not actually used Snapchat—the presence has occurred. Therefore, that would be covered by clause 8(5)(d).
Will the Minister explain how we would be able to prove, once the image is deleted, that it was present on the platform?
The question of proof is a separate one, and that would apply however we drafted the clause. The point is that the clause provides that any presence of a prohibited image would fall foul of the clause. There are also duties on the platforms to take reasonable steps. In the case of matters such as child sexual exploitation and abuse images, there are extra-onerous duties that we have discussed before, for obvious and quite correct reasons.
Will the Minister stress again that in this clause specifically he is talking about facilitating any presence? That is the wording that he has just used. Can he clarify exactly what he means? If the Minister were to do so, it would be an important point for the Bill as it proceeds.
I am happy to follow your direction, Ms Rees. I find that that is usually the wisest course of action.
I will speak to amendment 18, which is definitely on the agenda for this grouping and which the shadow Minister addressed earlier. It would oblige service providers to put in place systems and processes
“to minimise the presence of content which reasonably foreseeably facilitates or aids the discovery or dissemination of priority illegal content, including CSEA content.”
The Government completely support that objective, quite rightly promoted by the Opposition, but it is set out in the Bill as drafted. The companies in scope are obliged to take comprehensive measures to tackle CSEA content, including where one service directs users to a second service.
Amendment 21, in a similar spirit, talks about cross-platform collaboration. I have already mentioned the way in which the referral of a user from one platform to another is within the scope of the Bill. Again, under its provisions, service providers must put in place proportionate systems and processes to mitigate identified cross-platform harms and, where appropriate to achieve that objective, would be expected to collaborate and communicate with one another. If Ofcom finds that they are not engaging in appropriate collaborative behaviour—which means they are not discharging their duty to protect people and children—it can intervene. While I agree completely with the objective sought, the Bill already addresses it.
Obviously, I encourage the Committee to support those clauses standing part of the Bill. They impose duties on search services—we touched on search a moment ago—to assess the nature and risk to individuals of accessing illegal content via their services, and to minimise the risk of users encountering that illegal content. They are very similar duties to those we discussed for user-to-user services, but applied in the search context. I hope that that addresses all the relevant provisions in the group that we are debating.
I am grateful for the opportunity to speak to amendments to clause 9 and to clauses 23 and 24, which I did not speak on earlier. I am also very grateful that we are being broadcast live to the world and welcome that transparency for all who might be listening.
On clause 9, it is right that the user-to-user services will be required to have specific duties and to take appropriate measures to mitigate and manage the risk of harm to individuals and their likelihood of encountering priority illegal content. Again, however, the Bill does not go far enough, which is why we are seeking to make these important amendments. On amendment 18, it is important to stress that the current scope of the Bill does not capture the range of ways in which child abusers use social networks to organise abuse, including to form offender networks. They post digital breadcrumbs that signpost to illegal content on third-party messaging apps and the dark web, and they share child abuse videos that are carefully edited to fall within content moderation guidelines. This range of techniques, known as child abuse breadcrumbing, is a significant enabler of online child abuse.
Our amendment would give the regulator powers to tackle breadcrumbing and ensure a proactive upstream response. The amendment would ensure that tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material will be brought into regulatory scope. It will not leave that as ambiguous. The amendment will also ensure that companies must tackle child abuse at the earliest possible stage. As it stands, the Bill would reinforce companies’ current focus only on material that explicitly reaches the criminal threshold. Because companies do not focus their approach on other child abuse material, abusers can exploit this knowledge to post carefully edited child abuse images and content that enables them to connect and form networks with other abusers. Offenders understand and can anticipate that breadcrumbing material will not be proactively identified or removed by the host site, so they are able to organise and link to child abuse in plain sight.
We all know that child abuse breadcrumbing takes many forms, but techniques include tribute sites, where users create social media profiles using misappropriated identities of known child abuse survivors. These are used by offenders to connect with like-minded perpetrators to exchange contact information, form offender networks and signpost child abuse material elsewhere online. In the first quarter of 2021, there were 6 million interactions with such accounts.
Abusers may also use Facebook groups to build offender groups and signpost to child abuse hosted on third-party sites. Those groups are thinly veiled in their intentions; for example, as we heard in evidence sessions, groups are formed for those with an interest in children celebrating their 8th, 9th or 10th birthdays. Several groups with over 50,000 members remained alive despite being reported to Meta, and algorithmic recommendations quickly suggested additional groups for those members to join.
Lastly, abusers can signpost to content on third-party sites. Abusers are increasingly using novel forms of technology to signpost to online child abuse, including QR codes, immersive technologies such as the metaverse, and links to child abuse hosted on the blockchain. Given the highly agile nature of the child abuse threat and the demonstrable ability of sophisticated offenders to exploit new forms of technology, this amendment will ensure that the legislation is effectively futureproofed. Technological change makes it increasingly important that the ability of child abusers to connect and form offender networks can be disrupted at the earliest possible stage.
Turning to amendment 21, we know that child abuse is rarely siloed on a single platform or app. Well-established grooming pathways see abusers exploit the design features of social networks to contact children before they move communication across to other platforms, including livestreaming sites, as we have already heard, and encrypted messaging services. Offenders manipulate features such as Facebook’s algorithmic friend suggestions to make initial contact with a large number of children. They can then use direct messages to groom them and coerce children into sending sexual images via WhatsApp. Similarly, as we heard earlier, abusers can groom children through playing video games and then bringing them on to another ancillary platform, such as Discord.
The National Society for the Prevention of Cruelty to Children has shared details of an individual whose name has been changed, and whose case particularly highlights the problems that children are facing in the online space. Ben was 14 when he was tricked on Facebook into thinking he was speaking to a female friend of a friend, who turned out to be a man. Using threats and blackmail, he coerced Ben into sending abuse images and performing sex acts live on Skype. Those images and videos were shared with five other men, who then bombarded Ben with further demands. His mum, Rachel, said:
“The abuse Ben suffered had a devastating impact on our family. It lasted two long years, leaving him suicidal.
It should not be so easy for an adult to meet and groom a child on one site then trick them into livestreaming their own abuse on another app, before sharing the images with like-minded criminals at the click of a button.
Social media sites should have to work together to stop this abuse happening in the first place, so other children do not have to go through what Ben did.”
The current drafting of the Bill does not place sufficiently clear obligations on platforms to co-operate on the cross-platform nature of child abuse. Amendment 21 would require companies to take reasonable and proportionate steps to share threat assessments, develop proportionate mechanisms to share offender intelligence, and create a rapid response arrangement to ensure that platforms develop a coherent, systemic approach to new and emerging threats. Although the industry has developed a systemic response to the removal of known child abuse images, these are largely ad hoc arrangements that share information on highly agile risk profiles. The cross-platform nature of grooming and the interplay of harms across multiple services need to be taken into account. If it is not addressed explicitly in the Bill, we are concerned that companies may be able to cite competition concerns to avoid taking action.
I completely agree with the hon. Member, and appreciate her intervention. It is fundamental for this point to be captured in the Bill because, as we are seeing, this is happening more and more. More and more victims are coming forward who have been subject to livestreaming that is not picked up by the technology available, and is then recorded and posted elsewhere on smaller platforms.
Legal advice suggests that cross-platform co-operation is likely to be significantly impeded by the negative interplay with competition law unless there is a clear statutory basis for enabling or requiring collaboration. Companies may legitimately have different risk and compliance appetites, or may simply choose to hide behind competition law to avoid taking a more robust form of action.
New and emerging technologies are likely to produce an intensification of cross-platform risks in the years ahead, and we are particularly concerned about the child abuse impacts in immersive virtual reality and alternative-reality environments, including the metaverse. A number of high-risk immersive products are already designed to be platform-agnostic, meaning that in-product communication takes place between users across multiple products and environments. There is a growing expectation that these environments will be built along such lines, with an incentive for companies to design products in this way in the hope of blunting the ability of Governments to pursue user safety objectives.
Separately, regulatory measures that are being developed in the EU, but are highly likely to impact service users in the UK, could result in significant unintended safety consequences. Although the interoperability provisions in the Digital Markets Act are strongly beneficial when viewed through a competition lens—they will allow multiple platforms to communicate and compete with one another—they could, without appropriate safety mitigations, provide new means for abusers to contact children across multiple platforms, significantly increase the overall profile of cross-platform risk, and actively frustrate a broad number of current online safety responses. Amendment 21 will provide corresponding safety requirements that can mitigate the otherwise significant potential for unintended consequences.
The Minister referred to clauses 23 and 24 in relation to amendments 30 and 31. We think a similar consideration should apply to search services as well as to user-to-user services. We urge that the amendments be made, in order to prevent those harms from occurring.
I have already commented on most of those amendments, but one point that the shadow Minister made that I have not addressed was about acts that are essentially preparatory to acts of child abuse or the exchange of child sexual exploitation and abuse images. She was quite right to raise that issue as a matter of serious concern that we would expect the Bill to prevent, and I offer the Committee the reassurance that the Bill, as drafted, does so.
Schedule 6 sets out the various forms of child sexual exploitation and abuse that are designated as priority offences and that platforms have to take proactive steps to prevent. On the cross-platform point, that includes, as we have discussed, things that happen through a service as well as on a service. Critically, paragraph 9 of schedule 6 includes “inchoate offences”, which means someone not just committing the offence but engaging in acts that are preparatory to committing the offence, conspiring to commit the offence, or procuring, aiding or abetting the commission of the offence. The preparatory activities that the shadow Minister referred to are covered under schedule 6, particularly paragraph 9.
I thank the Minister for giving way. I notice that schedule 6 includes provision on the possession of indecent photographs of children. Can he confirm that that provision encapsulates the livestreaming of sexual exploitation?
As this is the first time I have spoken in the Committee, may I say that it is a pleasure to serve with you in the Chair, Ms Rees? I agree with my hon. Friend the Member for Pontypridd that we are committed to improving the Bill, despite the fact that we have some reservations, which we share with many organisations, about some of the structure of the Bill and some of its provisions. As my hon. Friend has detailed, there are particular improvements to be made to strengthen the protection of children online, and I think the Committee’s debate on this section is proving fruitful.
Amendment 28 is a good example of where we must go further if we are to achieve the goal of the Bill and protect children from harm online. The amendment seeks to require regulated services to assess their level of risk based, in part, on the frequency with which they are blocking, detecting and removing child sexual exploitation and abuse content from their platforms. By doing so, we will be able to ascertain the reality of their overall risk and the effectiveness of their existing response.
The addition of livestreamed child sexual exploitation and abuse content not only acknowledges first-generation CSEA content, but recognises that livestreamed CSEA content happens on both public and private channels, and that they require different methods of detection.
Furthermore, amendment 28 details the practical information needed to assess whether the action being taken by a regulated service is adequate in countering the production and dissemination of CSEA content, in particular first-generation CSEA content. Separating the rates of terminated livestreams of CSEA in public and private channels is important, because those rates may vary widely depending on how CSEA content is generated. By specifying tools, strategies and interventions, the amendment would ensure that the systems in place to detect and report CSEA are adequate, and that is why we would like it to be part of the Bill.
The Government support the spirit of amendments 17 and 28, which seek to achieve critical objectives, but the Bill as drafted delivers those objectives. In relation to amendment 17 and cross-platform risk, clause 8 already sets out harms and risks—including CSEA risks—that arise by means of the service. That means through the service to other services, as well as on the service itself, so that is covered.
Amendment 28 calls for the risk assessments expressly to cover illegal child sexual exploitation content, but clause 8 already requires that to happen. Clause 8(5) states that the risk assessment must cover the
“risk of individuals who are users of the service encountering…each kind of priority illegal content”.
If we follow through the definition of priority illegal content, we find all those CSEA offences listed in schedule 6. The objective of amendment 28 is categorically delivered by clause 8(5)(b), referencing onwards to schedule 6.
The amendment specifically mentions the level and rates of those images. I did not quite manage to follow through all the things that the Minister just spoke about, but does the clause specifically talk about the level of those things, rather than individual incidents, the possibility of incidents or some sort of threshold for incidents, as in some parts of the Bill?
The risk assessments that clause 8 requires have to be suitable and sufficient; they cannot be perfunctory and inadequate in nature. I would say that suitable and sufficient means they must go into the kind of detail that the hon. Lady requests. More details, most of which relate to timing, are set out in schedule 3. Ofcom will be making sure that these risk assessments are not perfunctory.
Importantly, in relation to CSEA reporting, clause 59, which we will come to, places a mandatory requirement on in-scope companies to report to the National Crime Agency all CSEA content that they detect on their platforms, if it has not already been reported. Not only is that covered by the risk assessments, but there is a criminal reporting requirement here. Although the objectives of amendments 17 and 28 are very important, I submit to the Committee that the Bill delivers the intention behind them already, so I ask the shadow Minister to withdraw them.
Question put, That the amendment be made.
I will speak to other amendments in this group as well as amendment 15. The success of the Bill’s regulatory framework relies on regulated companies carefully risk-assessing their platforms. Once risks have been identified, the platform can concentrate on developing and implementing appropriate mitigations. However, up to now, boards and top executives have not taken the risk to children seriously. Services have either not considered producing risk assessments at all or, where they have produced them, those assessments have been of limited efficacy and have failed to identify and respond to harms to children.
In evidence to the Joint Committee, Frances Haugen explained that many of the corporate structures involved are flat, and accountability for decision making can be obscure. At Meta, that means teams will focus only on delivering against key commercial metrics, not on safety. Children’s charities have also noted that corporate structures in the large technology platforms reward employees who move fast and break things. Those companies place incentives on increasing return on investment rather than child safety. An effective risk assessment and risk mitigation plan can impact on profit, which is why we have seen so little movement from companies to take the measures themselves without the duty being placed on them by legislation.
It is welcome that clause 10 introduces a duty to risk-assess user-to-user services that are likely to be accessed by children. But, as my hon. Friend the Member for Pontypridd said this morning, it will become an empty, tick-box exercise if the Bill does not also introduce the requirement for boards to review and approve the risk assessments.
The Joint Committee scrutinising the draft Bill recommended that the risk assessment be approved at board level. The Government rejected that recommendation on the grounds that Ofcom could include that in its guidance on producing risk assessments. As with much of the Bill, it is difficult to blindly accept promised safeguards when we have not seen the various codes of practice and guidance materials. The amendments would make sure that decisions about and awareness of child safety went right to the top of regulated companies. The requirement to have the board or a senior manager approve the risk assessment will hardwire the safety duties into decision making and create accountability and responsibility at the most senior level of the organisation. That should trickle down the organisation and help embed a culture of compliance across it. Unless there is a commitment to child safety at the highest level of the organisation, we will not see the shift in attitude that is urgently needed to keep children safe, and which I believe every member of the Committee subscribes to.
On amendments 11 and 13, it is welcome that risk assessments for children are included in the Bill, but the effectiveness of that duty will be undermined unless the risk assessments are available for scrutiny by the public and charities. In the current version of the Bill, risk assessments will be made available only to the regulator, a point we debated on an earlier clause. Companies will be incentivised to play down the likelihood of emerging risks because of the implications of having to mitigate them, which may run counter to their business interests. Unless the risk assessments are published, there will be no way to hold regulated companies to account, nor will there be any way for companies to learn from one another’s best practice, which is a very desirable aim.
The current situation shows that companies are unwilling to share risk assessments even when requested. In October 2021, following the whistleblower disclosures made by Frances Haugen, the National Society for the Prevention of Cruelty to Children led a global coalition of 60 child protection organisations that urged Meta to publish its risk assessments, including its data privacy impact assessments, which are a legal requirement under data protection law. Meta refused to share any of its risk assessments, even in relation to child sexual abuse and grooming. The company argued that risk assessments were live documents and it would not be appropriate for it to share them with any organisation other than the Information Commissioner’s Office, to whom it has a legal duty to disclose. As a result, civil society organisations and the charities that I talked about continue to be in the dark about whether and how Meta has appropriately identified online risk to children.
Making risk assessments public would support the smooth running of the regime and ensure its broader effectiveness. Civil society and other interested groups would be able to assess and identify any areas where a company might not be meeting its safety duties and make full, effective use of the proposed super-complaints mechanism. It would also help civil society organisations to hold the regulated companies and the regulator, Ofcom, to account.
As we have seen from the evidence sessions, civil society organisations are often at the forefront of understanding and monitoring the harms that are occurring to users. They have an in-depth understanding of what mitigations may be appropriate, and they may be able to support the regulator in identifying any obvious omissions. The success of the systemic risk assessment process will depend significantly on the regulator’s being able to identify new and emerging harms rapidly and effectively, and it is highly likely that the regulator will want to draw on civil society expertise to ensure that it has highly effective early warning functions in place.
However, civil society organisations will be hampered in that role if they remain unable to determine what, if anything, companies are doing to respond to online threats. If Ofcom is unable to rapidly identify new and emerging harms, the resulting delays could mean entire regulatory cycles where harms were not captured in risk profiles or company risk assessments, and an inevitable lag between harms being identified and companies being required to act upon them. It is therefore clear that there is a significant public value to publishing risk assessments.
Amendments 27 and 32 are almost identical to the suggested amendments to clause 8 that we discussed earlier. As my hon. Friend the Member for Pontypridd said in our discussion about amendments 25, 26 and 30, the duty to carry out a suitable and sufficient risk assessment could be significantly strengthened by preventing the creation of illegal content, not only preventing individuals from encountering it. I know the Minister responded to that point, but the Opposition did not think that response was fully satisfactory. This is just as important for children’s risk assessments as it is for illegal content risk assessments.
Online platforms are not just where abusive material is published. Sex offenders use mainstream web platforms and services as tools to commit child sexual abuse. This can be seen particularly in the livestreaming of child sexual exploitation. Sex offenders pay to direct and watch child sexual abuse in real time. The Philippines is a known hotspot for such abuse, and the UK has been identified by police leads as the third-largest consumer of livestreamed abuse in the world. What a very sad statistic that is for our society.
Ruby, which is not her real name, is a survivor of online sexual exploitation in the Philippines; she recently addressed a group of MPs about her experiences. She told Members how she was trafficked into sexual exploitation aged 16 after being tricked and lied to about the employment opportunities she thought she would be getting. She was forced to perform for paying customers online. Her story is harrowing. She said:
“I blamed myself for being trapped. I felt disgusted by every action I was forced to do, just to satisfy customers online. I lost my self-esteem and I felt very weak. I became so desperate to escape that I would shout whenever I heard a police siren go by, hoping somebody would hear me. One time after I did this, a woman in the house threatened me with a knife.”
Eventually, Ruby was found by the Philippine authorities and, after a four-year trial, the people who imprisoned her and five other girls were convicted. She said it took many years to heal from the experience, and at one point she nearly took her own life.
It should be obvious that if we are to truly improve child protection online we need to address the production of new child abuse material. In the Bill, we have a chance to address not only what illegal content is seen online, but how online platforms are used to perpetrate abuse. It should not be a case of waiting until the harm is done before taking action.
As the hon. Lady said, we discussed in the groupings for clauses 8 and 9 quite a few of the broad principles relating to children, but I will none the less touch on some of those points again because they are important.
On amendment 27, under clause 8 there is already an obligation on platforms to put in place systems and processes to reduce the risk that their services will be used to facilitate the presence of illegal content. As that includes the risk of illegal content being present, including that produced via the service’s functionality, the terrible example that the hon. Lady gave is already covered by the Bill. She is quite right to raise that example, because it is terrible when such content involving children is produced, but such cases are expressly covered in the Bill as drafted, particularly in clause 8.
Amendment 31 covers a similar point in relation to search. As I said for the previous grouping, search does not facilitate the production of content; it helps people to find it. Clearly, there is already an obligation on search firms to stop people using search engines to find illegal content, so the relevant functionality in search is already covered by the Bill.
Amendments 15 and 16 would expressly require board member sign-off for risk assessments. I have two points to make on that. First, the duties set out in clause 10(6)(h) in relation to children’s risk assessments already require the governance structures to be properly considered, so governance is directly addressed. Secondly, subsection (2) states that the risk assessment has to be "suitable and sufficient", so it cannot be done in a perfunctory or slipshod way. Again, Ofcom must be satisfied that those governance arrangements are appropriate. We could invent all the governance arrangements in the world, but what matters is that the outcome is delivered and, in this case, that children are protected.
Beyond governance, the most important things are the sanctions and enforcement powers that Ofcom can use if those companies do not protect children. As the hon. Lady said in her speech, we know that those companies are not doing enough to protect children and are allowing all kinds of terrible things to happen. If those companies continue to allow those things to happen, the enforcement powers will be engaged, and they will be fined up to 10% of their global revenue. If they do not sort it out, they will find that their services are disconnected. Those are the real teeth that will ensure that those companies comply.
I know that the Minister listened to Frances Haugen and to the members of charities. The charities and civil society organisations that are so concerned about this point do not accept that the Bill addresses it. I cannot see how his point addresses what I said about board-level acceptance of that role in children’s risk assessments. We need to change the culture of those organisations so that they become different from how they were described to us. He, like us, was sat there when we heard from the big platform providers, and they are not doing enough. He has had meetings with Frances Haugen; he knows what they are doing. It is good and welcome that the regulator will have the powers that he mentions, but that is just not enough.
I agree with the hon. Lady that, as I said a second ago, those platforms are not doing enough to protect children. There is no question about that at all, and I think there is unanimity across the House that they are not doing enough to protect children.
I do not think the governance point is a panacea. Frankly, I think the boards of these companies are aware of what is going on. When these big questions arise, they go all the way up to Mark Zuckerberg. It is not as if Mark Zuckerberg and the directors of companies such as Meta are unaware of these risks; they are extremely aware of them, as Frances Haugen’s testimony made clear.
We do address the governance point. As I say, the risk assessments do need to explain how governance matters are deployed to consider these things—that is in clause 10(6)(h). But for me, it is the sanctions—the powers that Ofcom will have to fine these companies billions of pounds and ultimately to disconnect their service if they do not protect our children—that will deliver the result that we need.
The Minister is talking about companies of such scale that even fines of billions will not hurt them. I refer him to the following wording in the amendments:
“a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the children’s risk assessment duties”.
That is the minimum we should be asking. We should be asking these platforms, which are doing so much damage and have had to be dragged to the table to do anything at all, to be prepared to appoint somebody who is responsible. The Minister tries to gloss over things by saying, “Oh well, they must be aware of it.” The named individual would have to be aware of it. I hope he understands the importance of his role and the Committee’s role in making this happen. We could make this happen.
As I say, clause 10 already references the governance arrangements, but my strong view is that the only thing that will make these companies sit up and take notice—the only thing that will make them actually protect children in a way they are currently not doing—is the threat of billions of pounds of fines and, if they do not comply even after being fined at that level, the threat of their service being disconnected. Ultimately, that is the sanction that will make these companies protect our children.
As my hon. Friend the Member for Worsley and Eccles South has said, the point here is about cultural change, and the way to do that is through leadership. It is not about shutting the gate after the horse has bolted. Fining the companies might achieve something, but it does not tackle the root of the problem. It is about cultural change and leadership at these organisations. We all agree across the House that they are not doing enough, so how do we change that culture? It has to come from leadership.
Yes, and that is why governance is addressed in the clause as drafted. But the one thing that will really change the way the leadership of these companies thinks about this issue is the one thing they ultimately care about—money. The reason they allow unsafe content to circulate and do not rein in or temper their algorithms, and the reason we are in this situation, which has arisen over the last 10 years or so, is that these companies have consistently prioritised profit over protection. Ultimately, that is the only language they understand—it is that and legal compulsion.
While the Bill rightly addresses governance in clause 10 and in other clauses, as I have said a few times, what has to happen to make this change occur is the compulsion that is inherent in the powers to fine and to deny service—to pull the plug—that the Bill also contains. The thing that will give reassurance to our constituents, and to me as a parent, is knowing that for the first time ever these companies can properly be held to account. They can be fined. They can have their connection pulled out of the wall. Those are the measures that will protect our children.
The Minister is being very generous with his time, but I do not think he appreciates the nature of the issue. Mark Zuckerberg’s net worth is $71.5 billion. Elon Musk, who is reported to be purchasing Twitter, is worth $218 billion. Bill Gates is worth $125 billion. Money does not matter to these people.
The Minister discusses huge fines for the companies and the potential sanction of bringing down their platforms. They will just set up another one. That is what we are seeing with the smaller platforms: they are closing down and setting up new platforms. These measures do not matter. What matters and will actually make a difference to the safety of children and adults online is personal liability—holding people personally responsible for the direct harm they are causing to people here in the United Kingdom. That is what these amendments seek to do, and that is why we are pushing them so heavily. I urge the Minister to respond to that.
We discussed personal liability extensively this morning. As we discussed, there is personal liability in relation to providing information, with a criminal penalty of up to two years’ imprisonment, to avoid situations like the one we saw a year or two ago, where one of these companies failed to provide the Competition and Markets Authority with the information that it required.
The shadow Minister pointed out the very high levels of global turnover—$71.5 billion—that these companies have. That means that ultimately they can be fined up to $7 billion for each set of breaches. That is a vast amount of money, particularly if those breaches happen repeatedly. She said that such companies will just set up again if we deny their service. Clearly, small companies can close down and set up again the next day, but gigantic companies, such as Meta—Facebook—cannot do that. That is why I think the sanctions I have pointed to are where the teeth really lie.
I accept the point about governance being important as well; I am not dismissing that. That is why we have personal criminal liability for information provision, with up to two years in prison, and it is why governance is referenced in clause 10. I accept the spirit of the points that have been made, but I think the Bill delivers these objectives as drafted.
One last time, because I am conscious that we need to make some progress this afternoon.
I have huge sympathy with the point that the Minister is making on this issue, but the hon. Member for Pontypridd is right to drive the point home. The Minister says there will be huge fines, but I think there will also be huge court bills. There will be an awful lot of litigation about how things are interpreted, because so much money will come into play. I just reiterate the importance of the guidance and the codes of practice, because if we do not get those right then the whole framework will be incredibly fragile. We will need ongoing scrutiny of how the Bill works or there will be a very difficult situation.
My right hon. Friend, as always, makes a very good point. The codes of practice will be important, particularly to enable Ofcom to levy fines where appropriate and then successfully defend them. This is an area that may get litigated. I hope that, should lawyers litigating these cases look at our transcripts in the future, they will see how strongly those on both sides of the House feel about this point. I know that Ofcom will ensure that the codes of practice are properly drafted. We touched this morning on the point about timing; we will follow up with Ofcom to make sure that the promise it made us during the evidence session about the road map is followed through and that those get published in good time.
On the point about the Joint Committee, I commend my right hon. Friend for her persistence—[Interruption.] Her tenacity—that is the right word. I commend her for her tenacity in raising that point. I mentioned it to the Secretary of State when I saw her at lunchtime, so the point that my right hon. Friend made this morning has been conveyed to the highest levels in the Department.
I must move on to the final two amendments, 11 and 13, which relate to transparency. Again, we had a debate about transparency earlier, when I made the point about the duties in clause 64, which I think cover the issue. Obviously, we are not debating clause 64 now but it is relevant because it requires Ofcom—it is not an option but an obligation; Ofcom must do so—to require providers to produce a transparency report every year. Ofcom can say what is supposed to be in the report, but the relevant schedule lists all the things that can be in it, and covers absolutely everything that the shadow Minister and the hon. Member for Worsley and Eccles South want to see in there.
That requirement to publish transparently and publicly is in the Bill, but it is to be found in clause 64. While I agree with the Opposition’s objectives on this point, I respectfully say that those objectives are delivered by the Bill as drafted, so I politely and gently request that the amendments be withdrawn.
I have a couple of comments, particularly about amendments 15 and 16, which the Minister has just spoken about at some length. I do not agree with the Government’s assessment that the governance subsection is adequate. It states that the risk assessment must take into account
“how the design and operation of the service (including the business model, governance, use of proactive technology…) may reduce or increase the risks identified.”
It is actually an assessment of whether the governance structure has an impact on the risks identified. It has no bearing whatever on the level at which the risk assessment is approved or not approved; it is about the risks that the governance structure poses to children or adults, depending on which section of the Bill we are looking at.
The Minister should consider what is being asked in the amendment, which is about the decision-making level at which the risk assessments are approved. I know the Minister has spoken already, but some clarification would be welcome. Does he expect a junior tech support member of staff, or a junior member of the legal team, to write the risk assessment and then put it in a cupboard? Or perhaps they approve it themselves and then nothing happens with it until Ofcom asks for it. Does he think that Ofcom would look unfavourably on behaviour like that? If he was very clear with us about that, it might put our minds at rest. Does he think that someone in a managerial position or a board member, or the board itself, should take decisions, rather than a very junior member of staff? There is a big spread of people who could be taking decisions. If he could give us an indication of what Ofcom might look favourably on, it would be incredibly helpful for our deliberations.
I am anxious about time, but I will respond to that point because it is an important one. The hon. Lady is right to say that clause 10(6)(h) looks to identify the risks associated with governance. That is correct—it is a risk assessment. However, in clause 11(2)(a), there is a duty to mitigate those risks, having identified what the risks are. If, as she hypothesised, a very junior person was looking at these matters from a governance point of view, that would be identified as a risk. If it was not, Ofcom would find that that was not sufficient or suitable. That would breach clause 10(2), and the service would then be required to mitigate. If it did not mitigate the risks by having a more senior person taking the decision, Ofcom would take enforcement action for its failure under clause 11(2)(a).
For the record, should Ofcom or lawyers consult the transcript to ascertain Parliament’s intention in the course of future litigation, it is absolutely the Government’s view, as I think it is the hon. Lady’s, that a suitable level of decision making for a children’s risk assessment would be a very senior level. The official Opposition clearly think that, because they have put it in their amendment. I am happy to confirm that, as a Minister, I think that. Obviously the hon. Lady, who speaks for the SNP, does too. If the transcripts of the Committee’s proceedings are examined in the future to ascertain Parliament’s intention, Parliament’s intention will be very clear.
All I have to add is the obvious point—I am sure that we are going to keep running into this—that people should not have to look to a transcript to see what the Minister’s and Parliament’s intention was. It is clear what the Opposition’s intention is—to protect children. I cannot see why the Minister will not specify who in an organisation should be responsible. It should not be a question of ploughing through transcripts of what we have talked about here in Committee; it should be obvious. We have the chance here to do something different and better. The regulator could specify a senior level.
Clearly, we are legislating here to cover, as I think we said this morning, 25,000 different companies. They all have different organisational structures, different personnel and so on. To anticipate the appropriate level of decision making in each of those companies and put it in the Bill in black and white, in a very prescriptive manner, might not adequately reflect the range of people involved.
I beg to move amendment 72, in clause 10, page 9, line 24, after “characteristic” insert “or characteristics”.
I will first speak to our amendment 85, which, like the Labour amendment, seeks to ensure that the Bill is crystal clear in addressing intersectionality. We need only consider the abuse faced by groups of MPs to understand why that is necessary. Female MPs are attacked online much more regularly than male MPs, and the situation is compounded if they have another minority characteristic. For instance, if they are gay or black, they are even more likely to be attacked. In fact, the MP who is most likely to be attacked is black and female. There are very few black female MPs, so it is not because of sheer numbers that they are at such increased risk of attack. Those with a minority characteristic are at higher risk of online harm, but the risk facing those with more than one minority characteristic is substantially higher, and that is what the amendment seeks to address.
I have spoken specifically about people being attacked on Twitter, Facebook and other social media platforms, but people in certain groups face an additional significant risk. If a young gay woman does not have a community around her, or if a young trans person does not know anybody else who is trans, they are much more likely to use the internet to reach out, to try to find people who are like them, to try to understand. If they are not accepted by their family, school or workplace, they are much more likely to go online to find a community and support—to find what is out there in terms of assistance—but using the internet as a vulnerable, at-risk person puts them at much more significant risk. This goes back to my earlier arguments about people requiring anonymity to protect themselves when using the internet to find their way through a difficult situation in which they have no role models.
It should not be difficult for the Government to accept this amendment. They should consider it carefully and understand that all of us on the Opposition Benches are making a really reasonable proposal. This is not about saying that someone with only one protected characteristic is not at risk; it is about recognising the intersectionality of risk and the fact that the risk faced by those who fit into more than one minority group is much higher than that faced by those who fit into just one. This is not about taking anything away from the Bill; it is about strengthening it and ensuring that organisations listen.
We have heard that a number of companies are not providing the protection that Members across the House would like them to provide against child sexual abuse. The governing structures, risk assessments, rules and moderation at those sites are better at ensuring that the providers make money than they are at providing protection. When regulated providers assess risk, it is not too much to ask them to consider not just people with one protected characteristic but those with multiple protected characteristics.
As MPs, we work on that basis every day. Across Scotland and the UK, we support our constituents as individuals and as groups. When protected characteristics intersect, we find ourselves standing in Parliament, shouting strongly on behalf of those affected and giving them our strongest backing, because we know that that intersection of harms is the point at which people are most vulnerable, in both the real and the online world. Will the Minister consider widening the provision so that it takes intersectionality into account and not only covers people with one protected characteristic but includes an over and above duty? I genuinely do not think it is too much for us to ask providers, particularly the biggest ones, to make this change.
Once again, the Government recognise the intent behind these amendments and support the concept that people with multiple intersecting characteristics, or those who are members of multiple groups, may experience—or probably do experience—elevated levels of harm and abuse online compared with others. We completely understand and accept that point, as clearly laid out by the hon. Member for Aberdeen North.
There is a technical legal reason why the use of the singular characteristic and group singular is adopted here. Section 6(c) of the Interpretation Act 1978 sets out how words in Bills and Acts are interpreted, namely that such words in the singular also cover the plural. That means that references in the singular, such as
“individuals with a certain characteristic”
in clause 10(6)(d), also cover characteristics in the plural. A reference to the singular implies a reference to the plural.
Will those compounded risks, where they exist, be taken into account? The answer is yes, because the assessments must assess the risk in front of them. Where there is evidence that multiple protected characteristics or the membership of multiple groups produce compounded risks, as the hon. Lady set out, the risk assessment has to reflect that. That includes the general sectoral risk assessment carried out by Ofcom, which is detailed in clause 83, and Ofcom will then produce guidance under clause 84.
The critical point is that, because there is evidence of high levels of compounded risk when people have more than one characteristic, that must be reflected in the risk assessment, otherwise it is inadequate. I accept the point behind the amendments, but I hope that that explains, with particular reference to the 1978 Act, why the Bill as drafted covers that valid point.
The Government obviously support the objective of these amendments, which is to prevent children from suffering the appalling sexual and physical abuse that the hon. Member for Worsley and Eccles South outlined in her powerful speech. It is shocking that these incidents have risen in the way that she described.
To be clear, that sort of appalling sexual abuse is covered in clause 9—which we have debated already—which covers illegal content. As Members would expect, child sexual abuse is defined as one of the items of priority illegal content, which are listed in more detail in schedule 6, where the offences that relate to sexual abuse are enumerated. As child sexual exploitation is a priority offence, services are already obliged through clause 9 to be “proactive” in preventing it from happening. As such, as Members would expect, the requirements contained in these amendments are already delivered through clause 9.
The hon. Member for Worsley and Eccles South also asked when we are going to hear what the primary priority harms to children might be. To be clear, those will not include the sexual exploitation offences, because, as Members would also expect, those are already in the Bill as priority illegal offences. The primary priority harms might include material promoting eating disorders and that kind of thing, which is not covered by the criminal matters—the illegal matters. I have heard the hon. Lady’s point that if that list were to be published, or at least a draft list, that would assist Parliament in scrutinising the Bill. I will take that point away and see whether there is anything we can do in that area. I am not making a commitment; I am just registering that I have heard the point and will take it away.
I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Question proposed, That the clause stand part of the Bill.
I rise to speak to clause 11, because this is an important part of the Bill that deals with the safety duties protecting children. Many of us here today are spurred on by our horror at the way in which internet providers, platform providers and search engines have acted over recent years, developing their products with no regard for the safety of children, so I applaud the Government for bringing forward this groundbreaking legislation. They are literally writing the book on this, but in doing so, we have to be very careful about the language we use and the way in which we frame our requirements of these organisations. The Minister has rightly characterised these organisations as being entirely driven by finance, not the welfare of their consumers, which must make them quite unique in the world. I can only hope that that will change: presumably, over time, people will not want to use products that have no regard for the safety of those who use them.
In this particular part of the Bill, the thorny issue of age assurance comes up. I would value the Minister’s views on some of the evidence that we received during our evidence sessions about how we ensure that age assurance is effective. Some of us who have been in this place for a while would be forgiven for thinking that we had already passed a law on age assurance. Unfortunately, that law did not seem to come to anything, so let us hope that it is second time lucky. The key question is: who is going to make sure that the age assurance that is in place is good enough? Clause 11(3) sets out
“a duty to operate a service using proportionate systems and processes”
that is designed to protect children, but what is a proportionate system? Who is going to judge that? Presumably it will be Ofcom in the short term, and in the long term, I am sure the courts will get involved.
In our evidence, we heard some people advocating very strongly for these sorts of systems to be provided by third parties. I have to say, in a context where we are hearing how irresponsible the providers of these services are, I can understand why people would think that a third party would be a more responsible way forward. Can the Minister help the Committee understand how Ofcom will ensure that the systems used, particularly the age assurance systems, are proportionate—I do not particularly like that word; I would like those systems to be brilliant, not proportionate—and are actually doing what we need them to do, which is safeguard children? For the record, and for the edification of judges who are looking at this matter in future—and, indeed, Ofcom—will he set out how important this measure is within the Bill?
I thank my right hon. Friend for her remarks, in which she powerfully and eloquently set out how important the clause is to protecting children. She is right to point out that this is a critical area in the Bill, and it has wide support across the House. I am happy to emphasise, for the benefit of those who may study our proceedings in future, that protecting children is probably the single most important thing that the Bill does, which is why it is vital that age-gating, where necessary, is effective.
My right hon. Friend asked how Ofcom will judge whether the systems under clause 11(3) are proportionate to
“prevent children of any age from encountering”
harmful content and so on. Ultimately, the proof of the pudding is in the eating; it has to be effective. When Ofcom decides whether a particular company or service is meeting the duty set out in the clause, the simple test will be one of effectiveness: is it effective and does it work? That is the approach that I would expect Ofcom to take; that is the approach that I would expect a court to take. We have specified that age verification, which is the most hard-edged type of age assurance—people have to provide a passport or something of that nature—is one example of how the duty can be met. If another, less-intrusive means is used, it will still have to be assessed as effective by Ofcom and, if challenged, by the courts.
I think my right hon. Friend was asking the Committee to confirm to people looking at our proceedings our clear intent for the measures to be effective. That is the standard to which we expect Ofcom and the courts to hold those platforms in deciding whether they have met the duties set out in the clause.
For clarification, does the Minister anticipate that Ofcom might be able to insist that a third-party provider be involved if there is significant evidence that the measures put in place by a platform are ineffective?
We have deliberately avoided being too prescriptive about precisely how the duty is met. We have pointed to age verification as an example of how the duty can be met without saying that that is the only way. We would not want to bind Ofcom’s hands, or indeed the hands of platforms. Clearly, using a third party is another way of delivering the outcome. If a platform were unable to demonstrate to Ofcom that it could deliver the required outcome using its own methods, Ofcom may well tell it to use a third party instead. The critical point is that the outcome must be delivered. That is the message that the social media firms, Ofcom and the courts need to hear when they look at our proceedings. That is set out clearly in the clause. Parliament is imposing a duty, and we expect all those to whom the legislation applies to comply with it.
Question put and agreed to.
Clause 11 accordingly ordered to stand part of the Bill.
Clause 12
Adults’ risk assessment duties
I beg to move amendment 12, in clause 12, page 12, line 10, at end insert—
“(4A) A duty to publish the adults’ risk assessment and proactively supply this to OFCOM.”
This amendment creates a duty to publish the adults’ risk assessment and supply it to Ofcom.
Once again, I agree with the point about transparency and the need to have those matters brought into the light of day. We heard from Frances Haugen how Facebook—now Meta—actively resisted doing so. However, I point to two provisions already in the Bill that deliver precisely that objective. I know we are debating clause 12, but clause 13(2)—the next clause we are going to debate—places a duty on platforms to publish in their terms of service, a public document, the findings of the most recent adult risk assessment. That is in addition to the obligations I have referred to twice already in clause 64, where Ofcom compels those firms to publish their transparency reports. I agree with the points that the shadow Minister made, but suggest that through clause 13(2) and clause 64 those objectives are met in the Bill as drafted.
I thank the Minister for his comments, but sadly we do not feel that is appropriate or robust enough, which is why we will be pressing the amendment to a Division.
Question put, That the amendment be made.
The Committee divided.
While I am at risk of parroting my hon. Friend the Member for Worsley and Eccles South on clause 11, it is important that adults and the specific risks they face online are considered in the clause. The Minister knows we have wider concerns about the specific challenges of the current categorisation system. I will come on to that at great length later, but I thought it would be helpful to remind him at this relatively early stage that the commitments to safety and risk assessments for category 1 services will only work if category 1 encapsulates the most harmful platforms out there. That being said, Labour broadly supports this clause and has not sought to amend it.
I am eagerly awaiting the lengthy representations that the shadow Minister just referred to, as are, I am sure, the whole Committee and indeed the millions watching our proceedings on the live broadcast. As the shadow Minister said, clause 13 sets out the safety duties in relation to adults. This is content that is legal but potentially harmful to adults, and for those topics specified in secondary legislation, it will require category 1 services to set out clearly what actions they might be taking—from the actions specified in subsection (4)—in relation to that content.
It is important to specify that the action they may choose to take is a choice for the platform. I know some people have raised issues concerning free speech and these duties, but I want to reiterate and be clear that this is a choice for the platform. They have to be publicly clear about what choices they are making, and they must apply those choices consistently. That is a significant improvement on where we are now, where some of these policies get applied in a manner that is arbitrary.
Question put and agreed to.
Clause 13 accordingly ordered to stand part of the Bill.
Clause 14
User empowerment duties
I beg to move amendment 46, in clause 14, page 14, line 12, after “non-verified users” insert
“and to enable them to see whether another user is verified or non-verified.”
This amendment would make it clear that, as part of the User Empowerment Duty, users should be able to see which other users are verified and which are non-verified.
I would be delighted to speak to the amendment, which would change the existing user empowerment duty in clause 14 to require category 1 services to enable adult users to see whether other users are verified. In effect, however, that objective already follows as a natural consequence of the duty in clause 14(6). When a user decides to filter out non-verified users, by definition that user will see content only from verified users, so they could tell from that who was verified and who was not. The effect intended by the amendment, therefore, is already achieved through clause 14(6).
I am sorry to disagree with the Minister so vigorously, but that is a rubbish argument. It does not make any sense. There is a difference between wanting to filter out everybody who is not verified and wanting to actually see if someone who is threatening someone else online is a verified or a non-verified user. Those are two very different things. I can understand why a politician, for example, might not want to filter out unverified users but would want to check whether a person was verified before going to the police to report a threat.
When it comes to police investigations, if something is illegal and merits a report to the police, users should report it, regardless of whether someone is verified or not—whatever the circumstances. I would encourage any internet user to do that. That effectively applies on Twitter already; some people have blue ticks and some people do not, and people should report others to the police if they do something illegal, whether or not they happen to have a blue tick.
Amendment 47 seeks to create a definition of identity verification in clause 189. In addition, it would compel the person’s real name to be displayed. I understand the spirit of the amendment, but there are two reasons why I would not want to accept it and would ask hon. Members not to press it. First, the words “identity verification” are ordinary English words with a clear meaning and we do not normally define in legislation ordinary English words with a clear meaning. Secondly, the amendment would add the new requirement that, if somebody is verified, their real name has to be displayed, but I do not think that that is the effect of the drafting as it stands. Somebody may be verified, and the company knows who they are—if the police go to the company, they will have the verified information—but there is no obligation, as the amendment is drafted, for that information to be displayed publicly. The effect of that part of the amendment would be to force users to choose between disclosing their identity to everyone or having no control over who they interact with. That may not have been the intention, but I am not sure that this would necessarily make sense.
New clause 8 would place requirements on Ofcom about how to produce guidance on user identity verification and what that guidance must contain. We already have provisions on that in clause 58, which we will no doubt come to, although probably not today—maybe on Thursday. Clause 58 allows Ofcom to include in its regulatory guidance the principles and standards referenced in the new clause, which can then assist service providers in complying with their duties. Of course, if they choose to ignore the guidelines and do not comply with their duties, they will be subject to enforcement action, but we want to ensure that there is flexibility for Ofcom, in writing those guidelines, and for companies, in following those guidelines or taking alternative steps to meet their duty.
This morning, a couple of Members talked about the importance of remaining flexible and being open to future changes in technology and a wide range of user needs. We want to make sure that flexibility is retained. As drafted, new clause 8 potentially undermines that flexibility. We think that the powers set out in clause 58 give Ofcom the ability to set the relevant regulatory guidance.
Clause 14 implements the proposals made by my hon. Friend the Member for Stroud in her ten-minute rule Bill and the proposals made, as the shadow Minister has said, by a number of third-party stakeholders. We should all welcome the fact that these new user empowerment duties have now been included in the Bill in response to such widespread parliamentary lobbying.
I am grateful to the Minister for giving way. I want to recount my own experience on this issue. He mentioned that anybody in receipt of anonymous abuse on social media should report it to the police, especially if it is illegal. On Thursday, I dared to tweet my opinions on the controversial Depp-Heard case in America. As a result of putting my head above the parapet, my Twitter mentions were an absolute sewer of rape threats and death threats, mainly from anonymous accounts. My Twitter profile was mocked up—I had devil horns and a Star of David on my forehead. It was vile. I blocked, deleted and moved on, but I also reported those accounts to Twitter, especially those that sent me rape threats and death threats.
That was on Thursday, and to date no action has been taken and I have not received any response from Twitter about any of the accounts I reported. The Minister said they should be reported to the police. If I reported all those accounts to the police, I would still be there now reporting them. How does he anticipate that this will be resourced so that social media companies can tackle the issue? That was the interaction resulting from just one tweet that I sent on Thursday, and anonymous accounts sent me a barrage of hate and illegal activity.
The shadow Minister raises a very good point. Of course, what she experienced on Twitter was despicable, and I am sure that all members of the Committee would unreservedly condemn the perpetrators who put that content on there. Once the Bill is passed, there will be legal duties on Twitter to remove illegal content. At the moment, they do not exist, and there is no legal obligation for Twitter to remove that content, even though much of it, from the sound of it, would cross one of various legal thresholds. Perhaps some messages qualify as malicious communication, and others might cross other criminal thresholds. That legal duty does not exist at the moment, but when this Bill passes, for the first time there will be that duty to protect not just the shadow Minister but users across the whole country.
Question put, That the amendment be made.
Sometimes we miss out the fact that although MPs face abuse, we have a level of protection as currently elected Members. Even if there were an election coming up, we have a level of security protection and access that is much higher than for anybody else challenging a candidate or standing in a council or a Scottish Parliament election. As sitting MPs, we already have an additional level of protection because of the security services we have in place. We need to remember, and I assume this is why the amendment is drawn in a pretty broad way, that everybody standing for any sort of elected office faces significant risk of harm—again, whether or not that meets the threshold for illegality.
There are specific things that have been mentioned. As has been said, epilepsy is specifically mentioned as an area where particular harm occurs. Given the importance of democracy, which is absolutely vital, we need to have a democratic system where people are able to stand in elections and make their case. That is why we have election addresses and a system where the election address gets delivered through every single person’s door. There is an understanding and acceptance by people involved in designing democratic processes that the message of all candidates needs to get out there. If the message of all candidates cannot get out there because some people are facing significant levels of abuse online, then democracy is not acting in the way that it should be. These amendments are fair and make a huge amount of sense. They protect the most important tenets of democracy and democratic engagement.
I want to say something about my own specific experiences. We have reported people to the police and have had people in court over the messages they have sent, largely by email, which would not be included in the Bill, but there have also been some pretty creepy ones on social media that have not necessarily met the threshold. As has been said, it is my staff who have had to go to court and stand in the witness box to explain the shock and terror they have felt on seeing the email or the communication that has come in, so I think any provision should include that.
Finally, we have seen situations where people working in elections—this is not an airy-fairy notion, but something that genuinely happened—have been photographed and those pictures have been shared on social media, and they have then been abused as a result. They are just doing their job, handing out ballot papers or standing up and announcing the results on the stage, and they have to abide by the processes that are in place now. In order for us to have free and fair elections that are run properly and that people want to work at and support, we need to have that additional level of protection. The hon. Member for Batley and Spen made a very reasonable argument and I hope the Minister listened to it carefully.
I have listened very carefully to both the hon. Member for Batley and Spen and the hon. Member for Aberdeen North. I agree with both of them that abuse and illegal activity directed at anyone, including people running for elected office, is unacceptable. I endorse and echo the comments they made in their very powerful and moving speeches.
In relation to the technicality of these amendments, what they are asking for is in the Bill already but in different places. This clause is about protecting content of “democratic importance” and concerns stopping online social media firms deleting content through over-zealous takedown. What the hon. Members are talking about is different. They are talking about abuse and illegal activities, such as rape threats, that people get on social media, particularly female MPs, as they both pointed out. I can point to two other places in the Bill where what they are asking for is delivered.
First, there are the duties around illegal content that we debated this morning. If there is content online that is illegal—some of the stuff that the shadow Minister referred to earlier sounds as if it would meet that threshold—then in the Bill there is a duty on social media firms to remove that content and to proactively prevent it if it is on the priority list. The route to prosecution will exist in future, as it does now, and the user-verification measures, if a user is verified, make it more likely for the police to identify the person responsible. In the context of identifying people carrying out abuse, I know the Home Office is looking at the Investigatory Powers Act 2016 as a separate piece of work that speaks to that issue.
So illegal content is dealt with in the illegal content provisions in the Bill, but later we will come to clause 150, which updates the Malicious Communications Act 1988 and creates a new harmful communications offence. Some of the communications that have been described may not count as a criminal offence under other parts of criminal law, but if they meet the test of harmful communication in clause 150, they will be criminalised and will therefore have to be taken down, and prosecution will be possible. In meeting the very reasonable requests that the hon. Members for Batley and Spen and for Aberdeen North have made, I would point to those two parts of the Bill.
But clause 150(5) says that if a message
“is, or is intended to be, a contribution to a matter of public interest”,
people are allowed to send it, which basically gives everybody a get-out clause in relation to anything to do with elections.
I know we are not discussing that part of the Bill, and if the Minister wants to come back to this when we get to clause 150, I have no problem with that.
I will answer the point now, as it has been raised. Clause 150 categorically does not give a get-out-of-jail-free card or provide an automatic excuse. Clearly, there is no way that abusing a candidate for elected office with rape threats and so on could possibly be considered a matter of public interest. In fact, even if the abuse somehow could be considered as possibly contributing to public debate, clause 150(5) says explicitly in line 32 on page 127:
“but that does not determine the point”.
Even where there is some potentially tenuous argument about a contribution to a matter of public interest, which most definitely would not be the case for the rape threats that have been described, that is not determinative. It is a balancing exercise that gets performed, and I hope that puts the hon. Lady’s mind at rest.
The Minister makes a really valid point and is right about the impact on the individual. The point I am trying to make with the amendments is that this is about the impact on the democratic process, which is why I think it fits in with clause 15. It is not about how individuals feel; it is about the impact that that has on behaviours, and about putting the emphasis and onus on platforms to decide what is of democratic importance. In the evidence we had two weeks ago, the witnesses certainly did not feel comfortable with putting the onus on platforms. If we were to have a code of practice, we would at least give them something to work with on the issue of what is of democratic importance. It is about the impact on democracy, not just the harm to the individual involved.
Clearly, if a communication is sufficiently offensive that it meets the criminal threshold, it is covered, and that would obviously harm the democratic process as well. If a communication was sufficiently offensive that it breached the harmful communication offence in clause 150, it would also, by definition, harm the democratic process, so communications that are damaging to democracy would axiomatically be caught by one thing or the other. I find it difficult to imagine a communication that might be considered damaging to democracy but that would not meet one of those two criteria, so that it was not illegal and would not meet the definition of a harmful communication.
My main point is that the existing provisions in the Bill address the kinds of behaviours that were described in those two speeches—the illegal content provisions, and the new harmful communication offence in clause 150. On that basis, I hope the hon. Member for Batley and Spen will withdraw the amendment, safe in the knowledge that the Bill addresses the issue that she rightly and reasonably raises.
Question put, That the amendment be made.
I will speak to clauses 15 and 16 and to new clause 7. The duties outlined in the clause, alongside clause 16, require platforms to have special terms and processes for handling journalistic and democratically important content. In respect of journalistic content, platforms are also required to provide an expedited appeals process for removed posts, and terms specifying how they will define journalistic content. There are, however, widespread concerns about both those duties.
As the Bill stands, we feel that there is too much discretion for platforms. They are required to define “journalistic” content, a role for which they are completely unsuited and which, from what I can gather, they do not want. In addition, the current drafting leaves the online space open to abuse. Individuals intent on causing harm are likely to try to take advantage of either of those duties, masquerading as journalists or claiming democratic importance for whatever harm they are causing—and that could apply to almost anything. In the evidence sessions, we also heard the concerns, expressed brilliantly by Kyle Taylor from Fair Vote and Ellen Judson from Demos, that the definitions as they stand in the Bill are broad and vague. However, we will come on to those matters later.
Ultimately, treating “journalistic” and “democratically important” content differently is unworkable, leaving platforms to make impossible judgments over, for example, when and for how long an issue becomes a matter of reasonable public debate, or in what settings a person is acting as a journalist. As the Minister knows, the duties outlined in the clause could entitle a far-right activist who was standing in an election, or potentially even just supporting candidates in elections, to access all social media platforms. That might allow far-right figures to be re-platformed on social media sites, where they would be free to continue spreading hate.
The Bill indicates that content will be protected if created by a political party ahead of a vote in Parliament, an election or a referendum, or when campaigning on a live political issue—basically, anything. Can the Minister confirm whether the clause means that far-right figures who have been de-platformed for hate speech already must be reinstated if they stand in an election? Does that include far-right or even neo-Nazi political parties? Content and accounts that have been de-platformed from mainstream platforms for breaking terms of service should not be allowed to return to those platforms via this potential—dangerous—loophole.
As I have said, however, I know that these matters are complex and, quite rightly, exemptions must be in place to allow for free discussion around matters of the day. What cannot be allowed to perpetuate is hate sparked by bad actors using simple loopholes to avoid any consequences.
On clause 16, the Minister knows about the important work that Hope not Hate is doing in monitoring key far-right figures, and I pay tribute to it for its excellent work. Many of those figures self-define as journalists and could seek to exploit this loophole in the Bill to propagate hate online. Some of the most high-profile and dangerous far-right figures in the UK, including Stephen Yaxley-Lennon, also known as Tommy Robinson, now class themselves as journalists. There are also far-right and conspiracy-theory so-called “news companies” such as Rebel Media and Urban Scoop. Both replicate mainstream news publishers but are used to spread misinformation and discriminatory content. Many of those individuals and organisations have already been de-platformed for consistently breaking the terms of service of major social media platforms, and the exemption could see them demand to return, and have that demand allowed.
New clause 7 would require the Secretary of State to publish a report reviewing the effectiveness of clauses 15 and 16. It is a simple new clause to require parliamentary scrutiny of how the Government’s chosen means of protecting content of democratic importance and journalistic content are working.
Hacked Off provided me with a list of people it found who have claimed to be journalists and who would seek to exploit the journalistic content duty, despite being banned from social media because they are racists or bad actors. First is Charles C. Johnson, a far-right activist who describes himself as an “investigative journalist”. Already banned from Twitter for saying he would “take out” a civil rights activist, he is also alleged to be a Holocaust denier.
Secondly, we have Robert Stacy McCain, who has been banned from Twitter for participating in targeted abuse. He was a journalist for The Washington Post, but is alleged to have also been a member of the League of the South, a far-right group known to include racists. Then there is Richard B. Spencer, a far-right journalist and former editor, who was only temporarily banned for using overlapping accounts. He was pictured making the Nazi salute and has repeated Nazi propaganda. When Trump became President, he encouraged people to “party like it’s 1933”. Sadly, the list goes on and on.
Transparency is at the very heart of the Bill. The Minister knows we have concerns about clauses 15 and 16, as do many of his own Back Benchers. We have heard from my hon. Friend the Member for Batley and Spen how extremist groups and individuals and foreign state actors are having a very real impact on the online space. If the Minister is unwilling to move on tightening up those concepts, the very least he could commit to is a review that Parliament will be able to formally consider.
I thank the shadow Minister for her comments and questions. I would like to pick up on a few points on the clauses. First, there was a question about what content of democratic importance and content of journalistic importance mean in practice. As with many concepts in the Bill, we will look to Ofcom to issue codes of practice specifying precisely how we might expect platforms to implement the various provisions in the Bill. That is set out in clause 37(10)(e) and (f), which appear at the top of page 37, for ease. Clauses 15 and 16, on content of democratic and journalistic importance, are expressly referenced as areas where codes of practice will have to be published by Ofcom, which will do further work on those codes and consult on them. It will not just publish them, but will go through a proper process.
The shadow Minister expressed some understandable concerns a moment ago about various extremely unpleasant people, such as members of the far right who might somehow seek to use the provisions in clauses 15 and 16 as a shield behind which to hide, to enable them to continue propagating hateful, vile content. I want to make it clear that the protections in the Bill are not absolute—it is not that if someone can demonstrate that what they are saying is of democratic importance, they can say whatever they like. That is not how the clauses are drafted.
I draw attention to subsection (2) of both clauses 15 and 16. At the end of the first block of text, just above paragraph (a), the words “taken into account” appear: the duty is to ensure that matters concerning the importance of freedom of expression in relation to content of democratic importance are taken into account when decisions are made. It is not an absolute prohibition on takedown or an absolute protection, but simply something that has to be taken into account.
If someone from the far right, as the shadow Minister described, was spewing out vile hatred, racism or antisemitism, and tried to use those clauses, the fact that they might be standing in an election might well be taken into account. However, in performing that balancing exercise, the social media platforms and Ofcom acting as enforcers—and the court, if a decision were ever judicially reviewed—would weigh those things up and find that taking into account content of democratic importance would not be sufficient to outweigh considerations around vile racism, antisemitism or misogyny.
The Minister mentions that it would be taken into account. How long does he anticipate it would be taken into account for, especially given the nature of an election? A short campaign could be a number of weeks, or something could be posted a day before an election, be deemed democratically important and have very serious and dangerous ramifications.
As I say, if content was racist, antisemitic or flagrantly misogynistic, the balancing exercise is performed and the democratic context may be taken into account. I do not think the scales would tip in favour of leaving the content up. Even during an election period, I think common sense dictates that.
To be clear on the timing point that the hon. Lady asked about, the definition of democratic importance is not set out in hard-edged terms. It does not say, “Well, if you are in a short election period, any candidate’s content counts as of democratic importance.” It is not set out in a manner that is as black and white as that. If, for example, somebody was a candidate but their content was just racist abuse, I am not sure how it could count as being of democratic importance, even during an election period, because it would just be abuse; it would not be contributing to any democratic debate. Equally, somebody might not be a candidate, or might have been a candidate historically, but might be contributing to a legitimate debate after an election. That might be seen as being of democratic importance, even though they were not actually a candidate. As I said, the concept is not as black and white as that. The main point is that it is only to be taken into account; it is not determinative.
I appreciate the Minister’s allowing me to come back on this. During the Committee’s evidence sessions, we heard of examples where bad-faith state actors were interfering in the Scottish referendum, hosting Facebook groups and perpetuating disinformation around the royal family to persuade voters to vote “Yes” to leave the United Kingdom. That disinformation by illegal bad-faith actors could currently come under both the democratic importance and journalistic exemptions, so would be allowed to remain for the duration of that campaign. Given the exemptions in the Bill, it could not be taken down but could have huge, serious ramifications for democracy and the security of the United Kingdom.
I understand the points that the hon. Lady is raising. However, I do not think that it would happen in that way.
No, I don’t. First of all, as I say, it is taken into account; it is not determinative. Secondly, on the point about state-sponsored disinformation, as I think I mentioned yesterday in response to the hon. Member for Liverpool, Walton, there is, as we speak, a new criminal offence of foreign interference being created in the National Security Bill. That will criminalise the kind of foreign interference in elections that she referred to. Because that will create a new category of illegal content, it will flow through into this Bill, and it will not be overridden by the duty to protect content of democratic importance set out here. I think that the combination of the fact that this is a balancing exercise, and not determinative, and the new foreign interference offence being created in the National Security Bill will address the issue that the hon. Lady is raising—reasonably, because it has happened in this country, as she has said.
I will briefly turn to new clause 7, which calls for a review. I understand why the shadow Minister is proposing a review, but there is already a review mechanism in the Bill; it is to be found in clause 149, and it will, of course, include a review of the way that clauses 15 and 16 operate. They are important clauses; we all accept that journalistic content and content of democratic importance are critical to the functioning of our society. Case law relating to article 10 of the European convention on human rights, for example, recognises content of journalistic importance as being especially critical. These two clauses seek to ensure that social media firms, in making their decisions, and Ofcom, in enforcing against those firms, take account of that. However, it is no more than that: it is “take account”; it is not determinative.
Question put and agreed to.
Clause 15 accordingly ordered to stand part of the Bill.
Clause 16 ordered to stand part of the Bill.
Ordered, That further consideration be now adjourned. —(Steve Double.)
(2 years, 5 months ago)
Public Bill Committees
Good morning, Ms Rees. It is a pleasure to serve once again under your chairmanship. I wondered whether the shadow Minister, the hon. Member for Pontypridd, wanted to speak first—I am always happy to follow her, if she would prefer that.
I do my best.
Clauses 17 and 27 have similar effects, the former applying to user-to-user services and the latter to search services. They set out an obligation on the companies to put in place effective and accessible content reporting mechanisms, so that users can report issues. The clauses will ensure that service providers are made aware of illegal and harmful content on their sites. In relation to priority illegal content, the companies must proactively prevent it in the first place, but in the other areas, they may respond reactively as well.
The clause will ensure that anyone who wants to report illegal or harmful content can do so in a quick and reasonable way. We are ensuring that everyone who needs to do that will be able to do so, so the facility will be open to those who are affected by the content but who are not themselves users of the site. For example, that might be non-users who are the subject of the content, such as a victim of revenge pornography, or non-users who are members of a specific group with certain characteristics targeted by the content, such as a member of the Jewish community reporting antisemitic content. There is also a facility for parents and other adults with caring responsibility for children, and adults caring for another adult, to report content. Clause 27 sets out similar duties in relation to search. I commend the clauses to the Committee.
I want to raise an additional point about content reporting and complaints procedures. I met with representatives of Mencap yesterday, who raised the issue of the accessibility of the procedures that are in place. I appreciate that the Bill talks about procedures being accessible, but will the Minister give us some comfort about Ofcom looking at the reporting procedures that are in place, to ensure that adults with learning disabilities in particular can access those content reporting and complaints procedures, understand them and easily find them on sites?
That is a specific concern that Mencap raised on behalf of its members. A number of its members will be users of sites such as Facebook, but may find it more difficult than others to access and understand the procedures that are in place. I appreciate that, through the Bill, the Minister is making an attempt to ensure that those procedures are accessible, but I want to make sure they are accessible not just for the general public but for children, who may need jargon-free access to content reporting and complaints procedures, and for people with learning disabilities, who may similarly need jargon-free, easy-to-understand and easy-to-find access to those procedures.
Let me try to address some of the questions that have been raised in this short debate, starting with the question that the hon. Member for Aberdeen North quite rightly asked at the beginning. She posed the question, “What if somebody who is not an affected person encountered some content and wanted to report it?” For example, she might encounter some racist content on Twitter or elsewhere and would want to be able to report it, even though she is not herself the target of it or necessarily a member of the group affected. I can also offer the reassurance that my hon. Friend the Member for Wolverhampton North East asked for.
The answer is to be found in clause 17(2), which refers to
“A duty to operate a service using systems and processes that allow users and”—
I stress “and”—“affected persons”. As such, the duty to offer content reporting is owed to users and affected persons, so if the hon. Member for Aberdeen North was a user of Twitter but was not herself an affected person, she would still be able to report content in her capacity as a user. I hope that provides clarification.
I appreciate that. That is key, and I am glad that this is wider than just users of the site. However, taking Reddit as an example, I am not signed up to that site, but I could easily stumble across content on it that was racist in nature. This clause would mean that I could not report that content unless I signed up to Reddit, because I would not be an affected person or a user of that site.
I thank the hon. Lady for her clarificatory question. I can confirm that in order to be a user of a service, she would not necessarily have to sign up to it. The simple act of browsing that service, of looking at Reddit—not, I confess, an activity that I participate in regularly—regardless of whether or not the hon. Lady has an account with it, makes her a user of that service, and in that capacity she would be able to make a content report under clause 17(2) even if she were not an affected person. I hope that clears up the question in a definitive manner.
The hon. Lady asked in her second speech about the accessibility of the complaints procedure for children. That is strictly a matter for clause 18, which is the next clause, but I will quickly answer her question. Clause 18 contains provisions that explicitly require the complaints process to be accessible. Subsection (2)(c) states that the complaints procedure has to be
“easy to access, easy to use (including by children) and transparent”,
so the statutory obligation that she requested is there in clause 18.
Can the Minister explain the logic in having that phrasing for the complaints procedure but not for the content-reporting procedure? Surely it would also make sense for the content reporting procedure to use the phrasing
“easy to access, easy to use (including by children) and transparent.”
There is in clause 17(2)
“a duty to operate a service that allows users and affected persons to easily report content which they consider to be content of a…kind specified below”,
which, of course, includes services likely to be accessed by children, under subsection (4). The words “easily report” are present in clause 17(2).
I will move on to the question of children reporting more generally, which the shadow Minister raised as well. Clearly, a parent or anyone with responsibility for a child has the ability to make a report, but it is also worth mentioning the power in clauses 140 to 142 to make super-complaints, which the NSPCC strongly welcomed in its evidence. An organisation that represents a particular group—an obvious example is the NSPCC representing children, but it would apply to many other groups—has the ability to make super-complaints to Ofcom on behalf of those users, if it feels they are not being well treated by a platform. A combination of the parent or carer being able to make individual complaints, and the super-complaint facility, means that the points raised by Members are catered for. I commend the clause to the Committee.
Question put and agreed to.
Clause 17 accordingly ordered to stand part of the Bill.
Clause 18
Duties about complaints procedures
Question proposed, That the clause stand part of the Bill.
With this it will be convenient to discuss the following:
Amendment 78, in clause 28, page 28, line 28, leave out “affected” and replace with “any other”
This amendment allows those who do not fit the definition of “affected person” to make a complaint about search content which they consider to be illegal.
Amendment 79, in clause 28, page 28, line 30, leave out “affected” and replace with “any other”
This amendment allows those who do not fit the definition of “affected person” to make a complaint about search content which they consider not to comply with sections 24, 27 or 29.
Clause 28 stand part.
New clause 1—Report on redress for individual complaints—
“(1) The Secretary of State must publish a report assessing options for dealing with appeals about complaints made under—
(a) section 18; and
(b) section 28
(2) The report must—
(a) provide a general update on the fulfilment of duties about complaints procedures which apply in relation to all regulated user-to-user services and regulated search services;
(b) assess which body should be responsible for a system to deal with appeals in cases where a complainant considers that a complaint has not been satisfactorily dealt with; and
(c) provide options for how the system should be funded, including consideration of whether an annual surcharge could be imposed on user-to-user services and search services.
(3) The report must be laid before Parliament within six months of the commencement of this Act.”
It is a pleasure to see you in the Chair, Ms Rees, and to make my first contribution in Committee—it will be a brief one. It is great to follow the hon. Member for Aberdeen North, and I listened intently to my right hon. Friend the Member for Basingstoke, from whom I have learned so much having sat with her in numerous Committees over the past two years.
I will speak to clause 18 stand part, in particular on the technical specifications that companies will need in their systems to fulfil the duties under the clause. The point, which has been articulated well by numerous Members, is that we can place such a duty on service providers, but we must also ensure that the technical specifications in their systems allow them to follow through and deliver on it.
I sat in horror during the previous sitting as I listened to the hon. Member for Pontypridd talking about the horrendous abuse that she has to experience on Twitter. What that goes to show is that, if the intention of this clause and of the Bill is to be fulfilled, we must ensure that companies have the specifications in their systems on the ground to deliver the requirements of the Bill. That might mean that the secondary legislation is slightly more prescriptive about what those systems look like.
It is all well and good us passing primary legislation in this place to try to control matters, but my fear is that if those companies do not have systems such that they can follow through, there is a real risk that what we want will not materialise. As we proceed through the Bill, there will be mechanisms to ensure that that risk is mitigated, but the point that I am trying to make to my hon. Friend the Minister is that we should ensure that we are on top of this, and that companies have the technical specifications in their complaints procedures to meet the requirements under clause 18.
We must ensure that we do not allow the excuse, “Oh, well, we’re a bit behind the times on this.” I know that later clauses seek to deal with that, but it is important that we do not simply fall back on excuses. We must embed a culture that allows the provisions of the clause to be realised. I appeal to the Minister to ensure that we do so: a culture of striding forward to deal with complaints procedures, with these companies having the technical capabilities on the ground to deal with these things swiftly and in the right way. Ultimately, as my right hon. Friend the Member for Basingstoke said, it is all well and good us making these laws, but it is vital that we ensure that they can be applied.
Let me address some of the issues raised in the debate. First, everyone in the House recognises the enormous problem at the moment with large social media firms receiving reports about harmful and even illegal content that they just flagrantly ignore. The purpose of the clause, and indeed of the whole Bill and its enforcement architecture, is to ensure that those large social media firms no longer ignore illegal and harmful content when they are notified about it. We agree unanimously on the importance of doing that.
The requirement for those firms to take the proper steps is set out in clause 18(2)(b), at the very top of page 18—it is rather depressing that we are on only the 18th of a couple of hundred pages. That paragraph creates a statutory duty for a social media platform to take “appropriate action”—those are the key words. If the platform is notified of a piece of illegal content, of content that is harmful to children, or of content that it should take down under its own terms and conditions because it is harmful to adults, then it must take that action. If it fails to do so, Ofcom will have the enforcement powers available to it to compel compliance—ultimately escalating to a fine of up to 10% of global revenue or even service disconnection.
Let me develop the point before I give way. Our first line of defence is Ofcom enforcing the clause, but we have a couple of layers of additional defence. One of those is the super-complaints mechanism, which I have mentioned before. If a particular group of people, represented by a body such as the NSPCC, feel that a social media platform is systemically ignoring their legitimate complaints and that Ofcom is failing to take the appropriate action, they can raise that as a super-complaint to ensure that the matter is dealt with.
I should give way to the hon. Member for Aberdeen North first, and then I will come to the shadow Minister.
I wanted to ask specifically about the resourcing of Ofcom, given the abilities that it will have under this clause. Will Ofcom have enough resource to be able to be that secondary line of defence?
A later clause gives Ofcom the ability to levy the fees and charges it sees as necessary and appropriate to ensure that it can deliver the duties. Ofcom will have the power to set those fees at a level to enable it to do its job properly, as Parliament would wish it to do.
This is the point about individual redress again: by talking about super-complaints, the Minister seems to be agreeing that it is not there. As I said earlier, for super-complaints to be made to Ofcom, the issue has to be of particular importance or to impact a particularly large number of users, but that does not help the individual. We know how much individuals are damaged; there must be a system of external redress. The point about internal complaints systems is that we know they are not very good, and a big culture change is required to improve them; but unless there is some mechanism thereafter, I cannot see how we are giving the individual any redress—it is certainly not through the super-complaints procedure.
As I said explicitly a few moments ago, the hon. Lady is right to point out the fact that the super-complaints process is to address systemic issues. She is right to say that, and I think I made it clear a moment or two ago.
Whether there should be an external ombudsman to enforce individual complaints, rather than just Ofcom enforcing against systemic complaints, is a question worth addressing. In some parts of our economy, we have ombudsmen who deal with individual complaints, financial services being an obvious example. The Committee has asked the question, why no ombudsman here? The answer, in essence, is a matter of scale and of how we can best fix the issue. The volume of individual complaints generated about social media platforms is just vast. Facebook in the UK alone has tens of millions of users—I might get this number wrong, but I think it is 30 million or 40 million users.
I will in a moment. The volume of complaints that gets generated is vast. The way that we will fix this is not by having an external policeman enforcing individual complaints, but by ensuring that the systems and processes are set up correctly to deal with problems at this large scale. [Interruption.] The shadow Minister, the hon. Member for Pontypridd, laughs, but it is a question of practicality. The way we will make the internet safe is to make sure that the systems and processes are in place and effective. Ofcom will ensure that that happens. That will protect everyone, not just those who raise individual complaints with an ombudsman.
I can see that there is substantial demand to comment, so I shall start by giving way to my right hon. Friend the Member for Basingstoke.
The Minister is doing an excellent job explaining the complex nature of the Bill. Ultimately, however, as he and I know, it is not a good argument to say that this is such an enormous problem that we cannot have a process in place to deal with it. If my hon. Friend looks back at his comments, he will see that that is exactly the point he was making. Although it is possibly not necessary with this clause, I think he needs to give some assurances that later in the Bill he will look at hypothecating some of the money to be generated from fines to address the issues of individual constituents, who on a daily basis are suffering at the hands of the social media companies. I apologise for the length of my intervention.
It is categorically not the Government’s position that this problem is too big to fix. In fact, the whole purpose of this piece of groundbreaking and world-leading legislation is to fix a problem of such magnitude. The point my right hon. Friend was making about the hypothecation of fines to support user advocacy is a somewhat different one, which we will come to in due course, but there is nothing in the Bill to prevent individual groups from assisting individuals with making specific complaints to individual companies, as they are now entitled to do in law under clauses 17 and 18.
The point about an ombudsman is a slightly different one—if an individual complaint is made to a company and the individual complainant is dissatisfied with the outcome of their individual, particular and personal complaint, what should happen? In the case of financial services, if, for example, someone has been mis-sold a mortgage and they have suffered a huge loss, they can go to an ombudsman who will bindingly adjudicate that individual, single, personal case. The point that I am making is that having hundreds of thousands or potentially millions of cases being bindingly adjudicated on a case-by-case basis is not the right way to tackle a problem of this scale. The right way to tackle the problem is to force the social media companies, by law, to systemically deal with all of the problem, not just individual problems that may end up on an ombudsman’s desk.
That is the power in the Bill. It deals at a systems and processes level and on an industry-wide level, and it gives Ofcom incredibly strong enforcement powers to make sure this actually happens. The hon. Member for Pontypridd has repeatedly called for a systems and processes approach. This is the embodiment of such an approach and the only way to fix a problem of such magnitude.
I associate myself with the comments of the right hon. Member for Basingstoke. Surely, if we are saying that this is such a huge problem, that is an argument for greater stringency and having an ombudsman. We cannot say that this is just about systems. Of course it is about systems, but online harms—we have heard some powerful examples of this—are about individuals, and we have to provide redress and support for the damage that online harms do to them. We have to look at systemic issues, as the Minister is rightly doing, but we also have to look at individual cases. The idea of an ombudsman and greater support for charities and those who can support victims of online crime, as mentioned by the hon. Member for Aberdeen North, is really important.
I thank the hon. Lady for her thoughtful intervention. There are two separate questions here. One is about user advocacy groups helping individuals to make complaints to the companies. That is a fair point, and no doubt we will debate it later. The ombudsman question is different; it is about whether to have a right of appeal against decisions by social media companies. Our answer is that, rather than having a third-party body—an ombudsman—effectively acting as a court of appeal against individual decisions by the social media firms, because of the scale of the matter, the solution is to compel the firms, using the force of law, to get this right on a systemic and comprehensive basis.
I give way first to the hon. Member for Aberdeen North—I think she was first on her feet—and then I will come to the hon. Member for Pontypridd.
Does the Minister not think this is going to work? He is creating this systems and processes approach, which he suggests will reduce the thousands of complaints: complaints will be made and complaints procedures will be followed. Surely, if it is going to work, in 10 years’ time we will still need an ombudsman to adjudicate on the individual complaints that go wrong. If this works in the way he suggests, we will not have tens of millions of complaints, as we do now, and an ombudsman would provide individual redress. I get what he is arguing, but I do not know why he is not arguing for both things, because having both would provide the very best level of support.
I will address the review clause now, since it is relevant. If, in due course, as I hope and expect, the Bill has the desired effect, perhaps that would be the moment to consider the case for an ombudsman. The critical step is to take a systemic approach, which the Bill is doing. That engages the question of new clause 1, which would create a mechanism, probably for the reason the hon. Lady just set out, to review how things are going and to see if, in due course, there is a case for an ombudsman, once we see how the Bill unfolds in practice.
Let me finish the point. It is not a bad idea to review it and see how it is working in practice. Clause 149 already requires a review to take place between two and four years after Royal Assent. For the reasons that have been set out, it is pretty clear from this debate that we would expect the review to include precisely that question. If we had an ombudsman on day one, before the systems and processes had had a chance to have their effect, I fear that the ombudsman would be overwhelmed with millions of individual issues. The solution lies in fixing the problem systemically.
I think the shadow Minister wanted to intervene, unless I have answered her point already.
I wanted to reiterate the point that the hon. Member for Aberdeen North made, which the Minister has not answered. If he has such faith that the systems and processes will be changed and controlled by Ofcom as a result of the Bill, why is he so reluctant to put in an ombudsman? It will not be overwhelmed with complaints if the systems and processes work, and therefore protect victims. We have already waited far too long for the Bill, and now he says that we need to wait two to four years for a review, and even longer to implement an ombudsman to protect victims. Why will he not just put this in the Bill now to keep them safe?
Because we need to give the new systems and processes time to take effect. If the hon. Lady felt so strongly that an ombudsman was required, she was entirely at liberty to table an amendment to introduce one, but she has not done so.
I wonder whether Members would be reassured if companies were required to have a mechanism by which users could register their dissatisfaction, to enable an ombudsman, or perhaps Ofcom, to gauge the volume of dissatisfaction and bring some kind of group claim against the company. Is that a possibility?
Yes. My hon. Friend hits the nail on the head. If there is a systemic problem and a platform fails to act appropriately not just in one case, but in a number of them, we have, as she has just described, the super-complaints process in clauses 140 to 142. Even under the Bill as drafted, without any changes, if a platform turns out to be systemically ignoring reasonable complaints made by the public and particular groups of users, the super-complainants will be able to do exactly as she describes. There is a mechanism to catch this—it operates not at individual level, but at the level of groups of users, via the super-complaint mechanism—so I honestly feel that the issue has been addressed.
When the numbers are so large, I think that the super-complaint mechanism is the right way to push Ofcom if it does not notice. Obviously, the first line of defence is that companies comply with the Bill. The second line of defence is that if they fail to do so, Ofcom will jump on them. The third line of defence is that if Ofcom somehow does not notice, a super-complaint group—such as the NSPCC, acting for children—will make a super-complaint to Ofcom. We have three lines of defence, and I submit to the Committee that they are entirely appropriate.
The Minister said that the Opposition had not tabled an amendment to bring in an ombudsman.
On this clause. What we have done, however—we are debating it now—is to table a new clause to require a report on redress for individual complaints. The Minister talks about clause 149 and a process that will kick in between two and five years away, but we have a horrendous problem at the moment. I and various others have described the situation as the wild west, and very many people—thousands, if not millions, of individuals—are being failed very badly. I do not see why he is resisting our proposal for a report within six months of the commencement of the Act, which would enable us to start to see at that stage, not two to five years down the road, how these systems—he is putting a lot of faith in them—were turning out. I think that is a very sound idea, and it would help us to move forward.
The third line of defence—the super-complaint process—is available immediately, as I set out a moment ago. In relation to new clause 1, which the hon. Lady mentioned a moment ago, I think six months is very soon for a Bill of this magnitude. The two-to-five-year timetable under the existing review mechanism in clause 149 is appropriate.
Although we are not debating clause 149, I hope, Ms Rees, that you will forgive me for speaking about it for a moment. If Members turn to pages 125 and 126 and look at the matters covered by the review, they will see that they are extraordinarily comprehensive. In effect, the review covers the implementation of all aspects of the Bill, including the need to minimise the harms to individuals and the enforcement and information-gathering powers. It covers everything that Committee members would want to be reviewed. No doubt as we go through the Bill we will have, as we often do in Bill Committee proceedings, a number of occasions on which somebody tables an amendment to require a review of x, y or z. This is the second such occasion so far, I think, and there may be others. It is much better to have a comprehensive review, as the Bill does via the provisions in clause 149.
Question put and agreed to.
Clause 18 accordingly ordered to stand part of the Bill.
Clause 19
Duties about freedom of expression and privacy
Question proposed, That the clause stand part of the Bill.
Clause 19, on user-to-user services, and its associated clause 29, which relates to search services, specify a number of duties in relation to freedom of expression and privacy. In carrying out their safety duties, in-scope companies will be required by clause 19(2) to have regard to the importance of protecting users’ freedom of expression and privacy.
Let me pause for a moment on this issue. There has been some external commentary about the Bill’s impact on freedom of expression. We have already seen, via our discussion of a previous clause, that there is nothing in the Bill that compels the censorship of speech that is legal and not harmful to children. I put on the record again the fact that nothing in the Bill requires the censorship of legal speech that poses no harm to children.
We are going even further than that. As far as I am aware, for the first time ever there will be a duty on social media companies, via clause 19(2), to have regard to freedom of speech. There is currently no legal duty at all on platforms to have regard to freedom of speech. The clause establishes, for the first time, an obligation to have regard to freedom of speech. It is critical that not only Committee members but others more widely who consider the Bill should bear that carefully in mind. Besides that, the clause speaks to the right to privacy. Existing laws already speak to that, but the clause puts it in this Bill as well. Both duties are extremely important.
In addition, category 1 service providers—the really big ones—will need proactively to assess the impact of their policies on freedom of expression and privacy. I hope all Committee members will strongly welcome the important provisions I have outlined.
As the Minister says, clauses 19 and 29 are designed to provide a set of balancing provisions that will require companies to have regard to freedom of expression and privacy when they implement their safety duties. However, it is important that companies cannot use privacy and free expression as a basis to argue that they can comply with regulation in less substantive ways. That is the fear here.
Category 1 providers will need to undertake an impact assessment to determine the impact of their product and safety decisions on freedom of expression, but it is unclear whether that applies only in respect of content that is harmful to adults. Unlike the risk assessments for the illegal content and child safety duties set out in part 3, chapter 2, these clauses do not set expectations that the assessments be of a suitable and sufficient quality. It is also not clear what powers Ofcom has at its disposal to challenge any assessments that it considers insufficient, or that reach an inappropriate or unreasonable conclusion on how to balance fundamental rights. I would appreciate it if the Minister could touch on that when he responds.
The assumption underlying these clauses is that privacy and free expression may need to act as a constraint on safety measures, but I believe that that is seen quite broadly as simplistic and potentially problematic. To give one example, a company could argue that the importance of end-to-end encryption for free expression and privacy justifies any adverse impact on users’ safety. The subjects of child abuse images, which could more easily be shared because of such a decision, would see their safety and privacy rights weakened. Such an argument fails to take account of the broader nuance of the issues at stake. Impacts on privacy and freedom of expression should therefore be considered across a range of groups, rather than assuming an overarching right that applies equally to all users.
Similarly, it will be important that Ofcom understands and delivers its functions in relation to these clauses in a way that reflects the complexity and nuance of the interplay of fundamental rights. It is important to recognise that any compliance decision may have both positive and negative implications for privacy and freedom of expression. I think the Minister implied that freedom of speech was a constant positive, but it can also have negative implications.
I am pleased that the clause is in the Bill, and I think it is a good one to include. Can the Minister reaffirm what he said on Tuesday about child sexual abuse, and the fact that the right to privacy does not trump the ability—particularly with artificial intelligence—to search for child sexual abuse images?
I confirm what the hon. Lady has just said. In response to the hon. Member for Worsley and Eccles South, it is important to say that the duty in clause 19 is “to have regard”, which simply means that a balancing exercise must be performed. It is not determinative; it is not as if the rights in the clause trump everything else. They simply have to be taken into account when making decisions.
To repeat what we discussed on Tuesday, I can explicitly and absolutely confirm to the hon. Member for Aberdeen North that in my view and the Government’s, concerns about freedom of expression or privacy should not trump platforms’ ability to scan for child sexual exploitation and abuse images or protect children. It is our view that there is nothing more important than protecting children from exploitation and sexual abuse.
We may discuss this further when we come to clause 103, which develops the theme a little. It is also worth saying that Ofcom will be able to look at the risk assessments and, if it feels that they are not of an adequate standard, take that up with the companies concerned. We should recognise that the duty to have regard to freedom of expression is not something that currently exists. It is a significant step forward, in my view, and I commend clauses 19 and 29 to the Committee.
As I have said, at the moment there is nothing at all. Platforms such as Facebook can and do arbitrarily censor content with little if any regard for freedom of speech. Some platforms have effectively cancelled Donald Trump while allowing the Russian state to propagate shocking disinformation about the Russian invasion of Ukraine, so there is real inconsistency and a lack of respect for freedom of speech. This at least establishes something where currently there is nothing. We can debate whether “have regard to” is strong enough. We have heard the other point of view from the other side of the House, which expressed concern that it might be used to allow otherwise harmful content, so there are clearly arguments on both sides of the debate. The obligation to have regard does have some weight, because the issue cannot be completely ignored. I do not think it would be adequate to simply pay lip service to it and not give it any real regard, so I would not dismiss the legislation as drafted.
I would point to the clauses that we have recently discussed, such as clause 15, under which content of democratic importance—which includes debating current issues and not just stuff said by an MP or candidate—gets additional protection. Some of the content that my hon. Friend the Member for Don Valley referred to a second ago would probably also get protection under clause 14, under which content of democratic importance has to be taken into account when making decisions about taking down or removing particular accounts. I hope that provides some reassurance that this is a significant step forward compared with where the internet is today.
I share the Minister’s sentiments about the Bill protecting free speech; we all want to protect that. He mentions some of the clauses we debated on Tuesday regarding democratic importance. Some would say that debating this Bill is of democratic importance. Since we started debating the Bill on Tuesday, and since I have mentioned some of the concerns raised by stakeholders and others about the journalistic exemption and, for example, Tommy Robinson, my Twitter mentions have been a complete sewer—as everyone can imagine. One tweet I received in the last two minutes states:
“I saw your vicious comments on Tommy Robinson…The only reason you want to suppress him is to bury the Pakistani Muslim rape epidemic”
in this country. Does the Minister agree that that is content of democratic importance, given we are debating this Bill, and that it should remain on Twitter?
That sounds like a very offensive tweet. Could the hon. Lady read it again? I did not quite catch it.
Yes:
“I saw your vicious comments on Tommy Robinson…The only reason you want to suppress him is to bury the Pakistani Muslim rape epidemic”
in this country. It goes on:
“this is a toxic combination of bloc vote grubbing and woke”
culture, and there is a lovely GIF to go with it.
I do not want to give an off-the-cuff assessment of an individual piece of content—not least because I am not a lawyer. It does not sound as if it meets the threshold of illegality. It most certainly is offensive, and how such matters should be handled is something that Ofcom will set out in its codes of practice; there is obviously a balance between freedom of speech and content that is harmful, which the codes of practice will delve into. I would be interested if the hon. Lady could report that tweet to Twitter and then report back to the Committee on what action it takes.
At the moment, there is no legal obligation to do anything about it, which is precisely why this Bill is needed, but let us put it to the test.
Question put and agreed to.
Clause 19 accordingly ordered to stand part of the Bill.
Clause 20
Record-keeping and review duties
Question proposed, That the clause stand part of the Bill.
The shadow Minister has eloquently introduced the purpose and effect of the clause, so I shall not repeat what she has said. On her point about publication, I repeat the point that I made on Tuesday, which is that the transparency requirements—they are requirements, not options—set out in clause 64 oblige Ofcom to ensure the publication of appropriate information publicly in exactly the way she requests.
Question put and agreed to.
Clause 20 accordingly ordered to stand part of the Bill.
Clauses 21 to 24 ordered to stand part of the Bill.
Clause 25
Children’s risk assessment duties
Amendment proposed: 16, in clause 25, page 25, line 10, at end insert—
“(3A) A duty for the children’s risk assessment to be approved by either—
(a) the board of the entity; or, if the organisation does not have a board structure,
(b) a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the children’s risk assessment duties, and reports directly into the most senior employee of the entity.” —(Alex Davies-Jones.)
This amendment seeks to ensure that regulated companies’ boards or senior staff have responsibility for children’s risk assessments.
I absolutely agree. In fact, I have tabled an amendment to widen category 1 to include sites with the highest risk of harm. The Minister has not said that he agrees with my amendment specifically, but he seems fairly amenable to increasing and widening some duties to include the sites of highest risk. I have also tabled another new clause on similar issues.
I am glad that these clauses are in the Bill—a specific duty in relation to children is important and should happen—but as the shadow Minister said, clause 31(3) is causing difficulty. It is causing difficulty for me and for organisations such as the NSPCC, which is unsure how the provisions will operate and whether they will do so in the way that the Government would like.
I hope the Minister will answer some of our questions when he responds. If he is not willing to accept the amendment, will he give consideration to how the subsection could be amended in the future—we have more stages, including Report and scrutiny in the other place—to ensure that there is clarity and that the intention behind the provision is followed through, rather than being an intention that is not actually translated into law?
Colleagues have spoken eloquently to the purpose and effect of the various clauses and schedule 3—the stand part component of this group. On schedule 3, the shadow Minister, the hon. Member for Worsley and Eccles South, asked about timing. The Government share her desire to get this done as quickly as possible. In its evidence a couple of weeks ago, Ofcom said it would be publishing its road map before the summer, which would set out the timetable for moving all this forward. We agree that that is extremely important.
I turn to one or two questions that arose on amendment 22. As always, the hon. Member for Aberdeen North asked a number of very good questions. The first was whether the concept of a “significant number” applied to a number in absolute terms or a percentage of the people using a particular service, and which is looked at when assessing what is significant. The answer is that it can be either—either a large number in absolute terms, by reference to the population of the whole United Kingdom, or a percentage of those using the service. That is expressed in clause 31(4)(a). Members will note the “or” there. It can be a number in proportion to the total UK population or the proportion using a service. I hope that answers the hon. Member’s very good question.
My concern is with services that meet neither of those criteria—they do not meet the “significant number” criterion in percentage terms because, say, only 0.05% of their users are children, and they do not meet it in population terms because they are a pretty small platform with, say, only 1,000 child users—but whose child users are at very high risk because of the nature of the platform or the service provided. My concern is for those at highest risk, where neither criterion is met and the service does not have to bother conducting any sort of age verification or access requirements.
I am concerned to ensure that children are appropriately protected, as the hon. Lady sets out. Let me make a couple of points in that area before I address that point.
The hon. Lady asked another question earlier, about video content. She gave the example of TikTok videos being viewed or accessed not directly on TikTok but via some third-party means, such as a WhatsApp message. First, it is worth emphasising again that in order to count as a user, a person does not have to be registered and can simply be viewing the content. Secondly, if someone is viewing something through another service, such as WhatsApp—the hon. Lady used the example of browsing the internet on another site—the duty will bite at the level of WhatsApp, and it will have to consider the content that it is providing access to. As I said, someone does not have to be registered with a service in order to count as a user of that service.
On amendment 22, there is a drafting deficiency, if I may put it politely—this is a point of drafting rather than of principle. The amendment would simply delete subsection (3), but there would still be references to the “child user condition”—for example, the one that appears on the same page of the Bill at line 11. If the amendment were adopted as drafted, it would end up leaving references to “child user condition” in the Bill without defining what it meant, because we would have deleted the definition.
Is the Minister coming on to say that he is accepting what we are saying here?
No, is the short answer. I was just mentioning in passing that there is that drafting issue.
On the principle, it is worth being very clear that, when it comes to content or matters that are illegal, the duties apply to all platforms, regardless of size, wherever children are at risk. In schedule 6, we set out a number of matters—child sexual exploitation and abuse, for example—as priority offences from which all platforms have to protect children proactively, regardless of scale.
The Minister has not addressed the points I raised. I specifically raised—he has not touched on this—harmful pro-anorexia blogs, which we know are dangerous but are not in scope, and games that children access that increase gambling addiction. He says that there is separate legislation for gambling addiction, but families have lost thousands of pounds through children playing games linked to gambling addiction. There are a number of other services that do not affect an appreciable number of children, and the drafting causes them to be out of scope.
There is no hard and fast rule about moving the Adjournment motion. It is up to the Government Whip.
I have a few more things to say, but I am happy to finish here if it is convenient.
Ordered, That the debate be now adjourned.—(Steve Double.)
(2 years, 5 months ago)
Public Bill Committees
Good morning, Ms Rees; it is, as always, a pleasure to serve under your chairship.
Amendment 84 would remove the Secretary of State’s ability to modify Ofcom codes of practice
“for reasons of public policy”.
Labour agrees with the Carnegie UK Trust assessment: the codes are the fulcrum of the regulatory regime, and this power of direction is a significant interference in Ofcom’s independence. Ofcom itself has noted that the “reasons of public policy” power to direct might weaken the regime. If Ofcom has undertaken a logical process, rooted in evidence, to arrive at a draft code, it is hard to see how a direction based on “reasons of public policy” is not irrational. That then creates a vulnerability to legal challenge.
On clause 40 more widely, the Secretary of State should not be able to give Ofcom specific direction on non-strategic matters. Ofcom’s independence in day-to-day decision making is paramount to preserving freedom of expression. Independence of media regulators is the norm in developed democracies. The UK has signed up to many international statements in that vein, including as recently as April 2022 at the Council of Europe. That statement says that
“media and communication governance should be independent and impartial to avoid undue influence on policy making, discriminatory treatment and preferential treatment of powerful groups, including those with significant political or economic power.”
The Bill introduces powers for the Secretary of State to direct Ofcom on internet safety codes. These provisions should immediately be removed. After all, in broadcasting regulation, Ofcom is trusted to make powerful programme codes with no interference from the Secretary of State. Labour further notes that although the draft Bill permitted this
“to ensure that the code of practice reflects government policy”,
clause 40 now specifies that any code may be required to be modified
“for reasons of public policy”.
Although that is more normal language, it is not clear what in practice the difference in meaning is between the two sets of wording. I would be grateful if the Minister could confirm what that is.
The same clause gives the Secretary of State powers to direct Ofcom, on national security or public safety grounds, in the case of terrorism or CSEA—child sexual exploitation and abuse—codes of practice. The Secretary of State might have some special knowledge of those, but the Government have not demonstrated why they need a power to direct. In the broadcasting regime, there are no equivalent powers, and the Secretary of State was able to resolve the case of Russia Today, on national security grounds, with public correspondence between the Secretary of State and Ofcom.
Good morning, Ms Rees; it is a pleasure to serve under your chairmanship again. The SNP spokesman and the shadow Minister have already explained what these provisions do, which is to provide a power for the Secretary of State to direct Ofcom to modify a code of practice. It is important to make clear that the powers about which the two Opposition parties have raised concerns are, as they said, envisaged to be used only in exceptional circumstances. Of course the Government accept that Ofcom, in common with other regulators, is rightly independent and that there should be no interference in its day-to-day regulatory decisions. This clause does not seek to violate that principle.
However, we also recognise that although Ofcom has great expertise as a regulator, there may be situations in which a topic outside its area of expertise needs to be reflected in a code of practice, and in those situations it may be appropriate for a direction to be given to modify the code. A recent and very real example is the need to reflect the latest medical advice during a public health emergency. Over the last couple of years, during covid, we saw some quite dangerous medical disinformation being spread—concerning, for example, the safety of vaccines or the “prudence” of ingesting bleach as a remedy for covid. There was also the purported, and entirely false, connection between 5G phone masts and covid. On public policy grounds—in this case, medical grounds—it might have been appropriate to make sure that a code of practice was appropriately modified.
It was mentioned earlier that some of us served on previous Committees whose broader recommendations would perhaps have been in line with the amendment. Since that time, there has been a lot of discussion on this topic, and I have raised it with the Minister and colleagues. I feel reassured that there is a great need to keep the clause as it is, because exceptional circumstances do arise. However, I would like reassurance that directions would be made only in exceptional circumstances and would not override Ofcom’s policy or remit, as has just been discussed.
I can provide my hon. Friend with that reassurance on the exceptional circumstances point. The Joint Committee report was delivered in December, approximately six months ago. It was a very long report—I think it had more than 100 recommendations. Of course, members of the Committee are perfectly entitled, in relation to one or two of those recommendations, to have further discussions, listen further and adjust their views if they individually see fit.
Let me just finish this point and then I will give way. The SNP spokesman, the hon. Member for Ochil and South Perthshire, asked about the Government listening and responding: we accepted 66 of the Joint Committee’s recommendations—a Committee that he served on. We made very important changes on commercial pornography, for example, and on fraudulent advertising. We accepted 66 recommendations, so it is fair to say that we have listened a lot during the passage of this Bill. On the amendments moved in Committee, we have often agreed with their intent, but the Bill has already dealt with the matter. I wanted to respond to those two points before giving way.
I am intrigued, as I am sure viewers will be. What is the new information that has come forward since December that has resulted in the Minister believing that he must stick with this? He has cited new information and new evidence, and I am dying to know what it is.
I am afraid it was not me that cited new information. It was my hon. Friend the Member for Watford who said he had had further discussions with Ministers. I am delighted to hear that he found those discussions enlightening, as I am sure they—I want to say they always are, but let us say they often are.
Before my hon. Friend moves on, can I ask a point of clarification? The hon. Member for Ochil and South Perthshire is right that this is an important point, so we need to understand it thoroughly. I think he makes a compelling argument about the exceptional circumstances. If Ofcom did not agree that a change that was being requested was in line with what my hon. Friend the Minister has said, how would it be able to discuss or, indeed, challenge that?
My right hon. Friend raises a good question. In fact, I was about to come on to the safeguards that exist to address some of the concerns that have been raised this morning. Let me jump to the fourth of the safeguards, which in many ways is the most powerful and directly addresses my right hon. Friend’s question.
In fact, a change has been made. The hon. Member for Ochil and South Perthshire asked what changes had been made, and one important change—perhaps the change that my hon. Friend the Member for Watford found convincing—was the insertion of a requirement for the codes, following a direction, to go before Parliament and be voted on using the affirmative procedure. That is a change. The Bill previously did not have that in it. We inserted the use of the affirmative procedure to vote on a modified code in order to introduce extra protections that did not exist in the draft of the Bill that the Joint Committee commented on.
I hope my right hon. Friend the Member for Basingstoke will agree that if Ofcom had a concern and made it publicly known, Parliament would be aware of that concern before voting on the revised code using the affirmative procedure. The change to the affirmative procedures gives Parliament extra control. It gives parliamentarians the opportunity to respond if they have concerns, if third parties raise concerns, or if Ofcom itself raises concerns.
Before the Minister moves off the point about exceptional circumstances, it was the case previously that an amendment of the law resolution was always considered with Finance Bills. In recent years, that has stopped on the basis of it being exceptional circumstances because a general election was coming up. Then the Government changed that, and now they never table an amendment of the law resolution because they have decided that that is a minor change. Something has gone from being exceptional to being minor, in the view of this Government.
The Minister said that he envisions that this measure will be used only in exceptional circumstances. Can he commit himself to it being used only in exceptional circumstances? Can he give the commitment that he expects that it will be used only in exceptional circumstances, rather than simply envisioning that it will be used in such circumstances?
I have made clear how we expect the clause to be used. I am slightly hesitant to be more categorical simply because I do not want to make comments that might unduly bind a future Secretary of State—or, indeed, a future Parliament, because the measure is subject to the affirmative procedure—even were that Secretary of State, heaven forbid, to come from a party other than mine. Circumstances might arise, such as the pandemic, in which a power such as this needs to be exercised for good public policy reasons—in that example, public health. I would not want to be too categorical, which the hon. Lady is inviting me to be, lest I inadvertently circumscribe the ability of a future Parliament or a future Secretary of State to act.
The power is also limited in the sense that, in relation to matters not to do with national security, terrorism or CSEA, the power to direct can be exercised only at the point at which the code is submitted to be laid before Parliament—not at any other time. The power cannot be exercised at a time of the Secretary of State’s choosing. There is one moment, and one moment only, when that power can be exercised.
I also want to make it clear that the power will not allow the Secretary of State to direct Ofcom to require a particular regulated service to take a particular measure. The power relates to the codes of practice; it does not give the power to intrude any further, beyond the code of practice, in the arena of regulated activity.
I understand the points that have been made. We have listened to the Joint Committee, and we have made an important change, which is that to the affirmative procedure. I hope my explanation leaves the Committee feeling that, following that change, this is a reasonable place for clauses 40 and 41 to rest. I respectfully resist amendment 84 and new clause 12, and urge the Committee to allow clauses 40 and 41 to stand part of the Bill.
Question put, That the amendment be made.
Given that the clause is clearly uncontentious, I will be extremely brief.
I can see that that is the most popular thing I have said during the entire session—when you say, “And finally,” in a speech and the crowd cheers, you know you are in trouble.
Regulated user-to-user and search services will have duties to keep records of their risk assessments and the measures they take to comply with their safety duties, whether or not those are the ones recommended in the codes of practice. They must also undertake a children’s access assessment to determine whether children are likely to access their service.
Clause 48 places a duty on Ofcom to produce guidance to assist service providers in complying with those duties. It will help to ensure a consistent approach from service providers, which is essential in maintaining a level playing field. Ofcom will have a duty to consult the Information Commissioner prior to preparing this guidance, as set out in clause 48(2), in order to draw on the expertise of the Information Commissioner’s Office and ensure that the guidance is aligned with wider data protection and privacy regulation.
Question put and agreed to.
Clause 48 accordingly ordered to stand part of the Bill.
Clause 49
“Regulated user-generated content”, “user-generated content”, “news publisher content”
I beg to move amendment 89, in clause 49, page 45, line 16, leave out subsection (e).
This amendment would remove the exemption for comments below news articles posted online.
Let me start by addressing the substance of the two amendments and then I will answer one or two of the questions that arose in the course of the debate.
As Opposition Members have suggested, the amendments would bring the comments that appear below the line on news websites such as The Guardian, MailOnline or the BBC into the scope of the Bill’s safety duties. They are right to point out that there are occasions when the comments posted on those sites are extremely offensive.
There are two reasons why comments below BBC, Guardian or Mail articles are excluded from the scope of the Bill. First, the news media publishers—newspapers, broadcasters and their representative industry bodies—have made the case to the Government, by which we are persuaded, that the comments section below news articles is an integral part of the process of publishing news and of what it means to have a free press. The news publishers—both newspapers and broadcasters that have websites—have made that case, and the Government have accepted that reaching into that space through legislation and regulation would represent an intrusion into the operation of the free press.
I am sorry, but I am having real trouble buying that argument. If the Minister is saying that newspaper comments sections are exempt in order to protect the free press because they are an integral part of it, why do we need the Bill in the first place? Social media platforms could argue in the same way that they are protecting free speech. They could ask, “Why should we regulate any comments on our social media platform if we are protecting free speech?” I am sorry; that argument does not wash.
There is a difference between random individuals posting stuff on Facebook and content generated by what we have defined as a “recognised news publisher”. We will debate that in a moment. The Bill recognises that difference. Although the Opposition are looking to make amendments to clause 50, they appear to accept that the press deserve special protection. Article 10 case law deriving from the European convention on human rights also recognises that the press have a special status. In our political discourse we often refer generally to the importance of the freedom of the press. We recognise that the press are different, and the press have made the case—both newspapers and broadcasters, all of which now have websites—that their reader engagement is an integral part of that free speech. There is a difference between that and individuals chucking stuff on Facebook outside the context of a news article.
There is then a question about whether, despite that, those comments are still sufficiently dangerous that they merit regulation by the Bill—a point that the shadow Minister, the hon. Member for Pontypridd, raised. There is a functional difference between comments made on platforms such as Facebook, Twitter, TikTok, Snapchat or Instagram, and comments made below the line on a news website, whether it is The Guardian, the Daily Mail, the BBC—even The National. The difference is that on social media platforms, which are the principal topic of the Bill, there is an in-built concept of virality—things going viral by sharing and propagating content widely. The whole thing can spiral rapidly out of control.
Virality is an inherent design feature of social media sites; it is not an inherent design feature of the comments under a news website article on the BBC, The Guardian or the Daily Mail. There is no way of generating virality there in the way there is on Facebook and Twitter, which are designed to propagate content massively. The reach, and the ability of content to grow exponentially, are orders of magnitude lower in a news website comment section than on Facebook. That is an important difference from a risk point of view.
This issue comes down to a fundamental point—are we looking at volume or risk? There is no difference between an individual—a young person in this instance—seeing something about suicide or self-harm on a Facebook post or in the comments section of a newspaper article. The volume—whether it goes viral or not—does not matter if that individual has seen that content and it has directed them to somewhere that will create serious harm and lead them towards dangerous behaviour. The volume is not the point.
The hon. Lady raises an important philosophical question that underpins much of the Bill’s architecture. All the measures are intended to strike a balance. Where there are things that are at risk of leading to illegal activity, or things that are harmful to children, we are clamping down hard, but in other areas we are being more proportionate. For example, the “legal but harmful to adults” duties apply only to category 1 companies, and we are looking at whether that can be extended to other high-risk companies, as we debated earlier. In the provisions that we debated earlier about having regard to free speech, there is a balancing exercise between the safety duties and free speech. A lot of the provisions in the Bill have a sense of balance and proportionality. In some areas, such as child sexual exploitation and abuse, there is no balance: we just want to stop that—end of story. In other areas, such as matters that are legal but harmful and touch on free speech, there is more of a balancing exercise.
In this area of news publisher content, we are again striking a balance. We are saying that the inherent harmfulness of those sites, owing to their functionality—they do not go viral in the same way—is much lower. There is also an interaction with freedom of the press, as I said earlier. Thus, we draw the balance in a slightly different way. To take the example of suicide promotion or self-harm content, there is a big difference between stumbling across something in comment No. 74 below a BBC article, versus the tragic case of Molly Russell—the 14-year-old girl whose Instagram account was actively flooded, many times a day, with awful content promoting suicide. That led her to take her own life.
I think the hon. Member for Batley and Spen would probably accept that there is a functional difference between a comment that someone has to scroll down a long way to find and probably sees only once, and being actively flooded with awful content. In having regard to those different arguments—the risk and the freedom of the press—we try to strike a balance. I accept that they are not easy balances to strike, and that there is a legitimate debate to be had on them. However, that is the reason that we have adopted this approach.
I have a question on anonymity. On social media there will be a requirement to verify users’ identities, so if somebody posts on Twitter that they want to lynch me, it is possible to find out who that is, provided they do not have an anonymous account. There is no such provision for newspaper comment sections, so I assume that it would be much more difficult for the police to find whoever posted such a comment, or for me to avoid seeing anonymous comments below the line of newspaper articles that threaten my safety—comments that are just as harmful as those that threaten my safety on social media. Can the Minister convince me otherwise?
The hon. Lady is correct in her analysis, I can confirm. Rather similar to the previous point, because of the interaction with freedom of the press—the argument that the newspapers and broadcasters have advanced—and because this is an inherently less viral environment, we have drawn the balance where we have. She is right to highlight a reasonable risk, but we have struck the balance in the way we have for that reason.
The shadow Minister, the hon. Member for Pontypridd, asked whether very harmful or illegal interactions in the metaverse would be covered or whether they have a metaphorical “get out of jail free” card owing to the exemption in clause 49(2)(d) for “one-to-one live aural communications”. In essence, she is asking whether, in the metaverse, if two users went off somewhere and interacted only with each other, that exemption would apply and they would therefore be outwith the scope of the Bill. I am pleased to tell her they would not, because the definition of live one-to-one aural communications goes from clause 49(2)(d) to clause 49(5), which defines “live aural communications”. Clause 49(5)(c) states that the exemption applies only if it
“is not accompanied by user-generated content of any other description”.
The actions of a physical avatar in the metaverse do constitute user-generated content of any other description. Owing to that fact, the exemption in clause 49(2)(d) would not apply to the metaverse.
I am happy to provide clarification on that. It is a good question, and I hope I have provided an example of how, even though the metaverse had not been conceived when the Bill was conceived, the Bill still catches it.
On that point, when it comes to the definition of content, we have tabled an amendment about “any other content”. I am not convinced that the definition of content adequately covers what the Minister stated, because it is limited, does not include every possible scenario in which content is user-generated, and is not future-proofed enough. When we get to that point, I would appreciate it if the Minister would look at the amendment and ensure that what he intends is what happens.
I am grateful to the hon. Lady for thinking about that so carefully. I look forward to her amendment. For my information, which clause does her amendment seek to amend?
During the Joint Committee we were concerned about future-proofing. Although I appreciate that this is a House matter and so is not specifically included in the Bill, I urge the setting up of a separate Online Safety Act committee that runs over time, so that the regime can continue to be improved upon and expanded, which would add value. We do not know what the next metaverse will be in 10 years’ time. However, I feel confident that the metaverse is covered, and I am glad that the Minister has confirmed that.
I thank my hon. Friend for his service on the Joint Committee. I heard the representations of my right hon. Friend the Member for Basingstoke about a Joint Committee, and I have conveyed them to the higher authorities.
The amendment that the Minister is asking about is to clause 189, which states:
“‘content’ means anything communicated by means of an internet service, whether publicly or privately, including written material or messages, oral communications, photographs, videos, visual images, music and data of any description”.
It is amendment 76 that, after “including”, would insert “but not limited to”, in order that the Bill is as future-proofed as it can be.
I thank the hon. Lady for her rapid description of that amendment. We will come to clause 189 in due course. The definition of “content” in that clause is,
“anything communicated by means of an internet service”,
which sounds like it is quite widely drafted. However, we will obviously debate this issue properly when we consider clause 189.
The remaining question—
I intervene rather than making a subsequent substantive contribution because I am making a very simple point. My hon. Friend the Minister is making a really compelling case about the need for freedom of speech and the need to protect it within the context of newspapers online. However, could he help those who might be listening to this debate today to understand who is responsible if illegal comments are made on newspaper websites? I know that my constituents would be concerned about that, not particularly if illegal comments were made about a Member of Parliament or somebody else in the public eye, but about another individual not in the public eye.
What redress would that individual have? Would it be to ask the newspaper to take down that comment, or would it be that they could find out the identity of the individual who made the comment, or would it be that they could take legal action? If he could provide some clarity on that, it might help Committee members to understand even further why he is taking the position that he is taking.
I thank my right hon. Friend for that intervention. First, if something illegal is said online about someone, they would clearly have the normal redress of going to the police, and the police could seek to exercise their powers to investigate the offence, including requesting that the company that hosts the comments—in this case, a newspaper’s or broadcaster’s website—provide any relevant information that might help to identify the person involved; that person might have an account, and if not, there might be a log-on or IP address. So the normal criminal investigatory procedures would apply.
Secondly, if the content was defamatory, there is civil recourse for libel—although I realise that, in practice, only people like Arron Banks can afford to sue. I think there are also powers in the civil procedure rules that allow court orders to be made requiring organisations, such as news media websites, to disclose information that would help to identify somebody who is a respondent in a civil case.
Thirdly, there are the voluntary steps that the news publisher might take to remove content. News publishers say that they do that, although, as we know, their implementation is patchy. Nevertheless, there is that voluntary route.
Regarding any legal obligation that may fall on the shoulders of the news publisher itself, I am not sure that I have sufficient legal expertise to comment on that. However, I hope that those first three areas of redress that I have set out give my right hon. Friend some assurance on this point.
Finally, I turn to a question asked by the hon. Member for Aberdeen North. She asked whether the exemption for “one-to-one live aural communications”, as set out in clause 49(2)(d), could inadvertently allow grooming or child sexual exploitation to occur via voice messages that accompany games, for example. The exemption is designed to cover what are essentially phone calls, such as Skype conversations—one-to-one conversations that are generally low-risk.
We believe that the Bill contains other duties to ensure that services are designed to reduce the risk of grooming and to address risks to children, if those risks exist, such as on gaming sites. I would be happy to come back to the hon. Lady with a better analysis and explanation of where those duties sit in the Bill, but there are very strong duties elsewhere in the Bill that impose those obligations to conduct risk assessments and to keep children safe in general. Indeed, the very strongest provisions in the Bill are around stopping child sexual exploitation and abuse, as set out in schedule 6.
Finally, there is a power in clause 174(1) that allows us, as parliamentarians and the Government, to repeal this exemption using secondary legislation. So, if we found in the future that this exemption caused a problem, we could remove it by passing secondary legislation.
That is helpful for understanding the rationale, but in the light of how people communicate online these days, although exempting telephone conversations makes sense, exempting what I am talking about does not. I would appreciate it if the Minister came back to me on that, and he does not have to give me an answer now. It would also help if he explained the difference between “aural” and “oral”, which are mentioned at different points in the Bill.
I will certainly come back with a more complete analysis of the point about protecting children—as parents, that clearly concerns us both. The literal definitions are that “aural” means “heard” and “oral” means “spoken”. They occur in different places in the Bill.
This is a difficult issue and legitimate questions have been raised, but as I said in response to the hon. Member for Batley and Spen, in this area as in others, there are balances to strike and different considerations at play—freedom of the press on the one hand, and the level of risk on the other. I think that the clause strikes that balance in an appropriate way.
Question put, That the amendment be made.
I thank the hon. Member for Batley and Spen for her speech. There is agreement across the House, in this Committee and in the Joint Committee that the commitment to having a free press in this country is extremely important. That is why recognised news publishers are exempted from the provisions of the Bill, as the hon. Lady said.
The clause, as drafted, has been looked at in some detail over a number of years and debated with news publishers and others. It is the best attempt that we have so far collectively been able to come up with to provide a definition of a news publisher that does not infringe press freedom. The Government are concerned that if the amendment were adopted, it would effectively require news publishers to register with a regulator in order to benefit from the exemption. That would constitute the imposition of a mandatory press regulator by the back door. I put on record that this Government do not support any kind of mandatory or statutory press regulation, in any form, for reasons of freedom of the press. Despite what has been said in previous debates, we think that doing so would unreasonably restrict the freedom of the press in this country.
While I understand its intention, the amendment would drive news media organisations, both print and broadcast, into the arms of a regulator, because they would have to join one in order to get the exemption. We do not think it is right to create that obligation. We have reached the philosophical position that statutory or mandatory regulation of the press is incompatible with press freedom. We have been clear about that general principle and cannot accept the amendment, which would violate that principle.
In relation to hostile states, such as Russia, I do not think anyone in the UK press would have the slightest objection to us finding ways to tighten up on such matters. As I have flagged previously, thought is being given to that issue, but in terms of the freedom of the domestic press, we feel very strongly that pushing people towards a regulator is inappropriate in the context of a free press.
The characterisation of these provisions is a little unfair, because some of the requirements are not trivial. The requirement in clause 50(2)(f) is that there must be a person—I think that includes a legal person as well as a natural person—who has legal responsibility for the material published, which means that, unlike with pretty much everything else that appears on the internet, there is an identified person with legal responsibility. That is a very important requirement. Some of the other requirements, such as having a registered address and a standards code, are relatively easy to meet, but the point about legal responsibility is very important. For that reason, I respectfully resist the amendment.
I will not push the amendment to a vote, but it is important to continue this conversation, and I encourage the Minister to consider the matter as the Bill proceeds. I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
In its current form, the Online Safety Bill states that platforms do not have any duties relating to content from recognised media outlets and news publishers, and the outlets’ websites are also exempt from the scope of the Bill. However, the way the Bill is drafted means that hundreds of independently regulated specialist publishers’ titles will be excluded from the protections afforded to recognised media outlets and news publishers. This will have a long-lasting and damaging effect on an indispensable element of the UK’s media ecosystem.
Specialist publishers provide unparalleled insights into areas that broader news organisations are unlikely to analyse, and it would surely be foolish to dismiss and damage specialist publications in a world where disinformation is becoming ever more prevalent. The former Secretary of State, the right hon. Member for Maldon (Mr Whittingdale), also raised this issue on Second Reading, where he stated that specialist publishers
“deserve the same level of protection.”—[Official Report, 19 April 2022; Vol. 712, c. 109.]
Part of the rationale for having the news publishers exemption in the Bill is that it means that the press will not be double-regulated. Special interest material is already regulated, so it should benefit from the same exemptions.
For the sake of clarity, and for the benefit of the Committee and those who are watching, could the hon. Gentleman say a bit more about what he means by specialist publications and perhaps give one or two examples to better illustrate his point?
I would be delighted to do so. I am talking about specific and occasionally niche publications. Let us take an example. Gardeners’ World is not exactly a hotbed of online harm, and nor is it a purveyor of disinformation. It explains freely which weeds to pull up and which not to, without seeking to confuse people in any way. Under the Bill, however, such publications will be needlessly subjected to rules, creating a regulatory headache for the sector. This is a minor amendment that will help many businesses, and I would be interested to hear from the Minister why the Government will not listen to the industry on this issue.
I thank the hon. Member for Ochil and South Perthshire for his amendment and his speech. I have a couple of points to make in reply. The first is that the exemption is about freedom of the press and freedom of speech. Clearly, that is most pertinent and relevant in the context of news, information and current affairs, which is the principal topic of the exemption. Were we to expand it to cover specialist magazines—he mentioned Gardeners’ World—I do not think that free speech would have the same currency when it comes to gardening as it would when people are discussing news, current affairs or public figures. The free speech argument that applies to newspapers, and to other people commenting on current affairs or public figures, does not apply in the same way to gardening and the like.
That brings me on to a second point. Only a few minutes ago, the hon. Member for Batley and Spen drew the Committee’s attention to the risks inherent in the clause that a bad actor could seek to exploit. It was reasonable of her to do so. Clearly, however, the more widely we draft the clause—if we include specialist publications such as Gardeners’ World, whose circulation will no doubt soar on the back of this debate—the greater the risk of bad actors exploiting the exemption.
My third point is about undue burdens being placed on publications. To the extent that such entities count as social media platforms—in-scope services—the most onerous duties under the Bill apply only to category 1 companies, or the very biggest firms such as Facebook and so on. The “legal but harmful” duties and many of the risk assessment duties would not apply to many organisations. In fact, I think I am right to say that if the only functionality on their websites is user comments, they would in any case be outside the scope of the Bill. I have to confess that I am not intimately familiar with the functionality of the Gardeners’ World website, but there is a good chance that if all it does is to provide the opportunity to post comments and similar things, it would be outside the scope of the Bill anyway, because it does not have the requisite functionality.
Although I understand the point made by the hon. Member for Ochil and South Perthshire, we will, respectfully, resist the amendment for the reasons I have given.
I made general comments about clause 50 during the debate on amendment 107; I will not try the Committee’s patience by repeating them, but I believe that in them, I addressed some of the issues that the shadow Minister, the hon. Member for Pontypridd, has raised.
On the hon. Member for Aberdeen North’s question about where the Bill states that sites with limited functionality—for example, functionality limited to comments alone—are out of scope, paragraph 4(1) of schedule 1 states that
“A user-to-user service is exempt if the functionalities of the service are limited, such that users are able to communicate by means of the service only in the following ways—
(a) posting comments or reviews relating to provider content;
(b) sharing such comments or reviews on a different internet service”.
Clearly, services where a user can share freely are in scope, but if they cannot share directly—if they can only share via another service, such as Facebook—that service is out of scope. This speaks to the point that I made to the hon. Member for Batley and Spen in a previous debate about the level of virality, because the ability of content to spread, proliferate, and be forced down people’s throats is one of the main risks that we are seeking to address through the Bill. I hope that paragraph 4(1) of schedule 1 is of assistance, but I am happy to discuss the matter further if that would be helpful.
Question put and agreed to.
Clause 50 accordingly ordered to stand part of the Bill.
Clause 51
“Search content”, “search results” etc
Question proposed, That the clause stand part of the Bill.
Labour does not oppose the intention of the clause. It is important to define “search content” in order to understand the responsibilities that fall within search services’ remits.
However, we have issues with the way that the Bill treats user-to-user services and search services differently when it comes to risk-assessing and addressing legal harm—an issue that we will come on to when we debate schedule 10. Although search services rightly highlight that the content returned by a search is not created or published by them, the algorithmic indexing, promotion and search prompts provided in search bars are fundamentally their responsibility. We do, however, accept that over the past 20 years, Google, for example, has developed mechanisms to provide a safer search experience for users while not curtailing access to lawful information. We also agree that search engines are critical to the proper functioning of the world wide web; they play a uniquely important role in facilitating access to the internet, and enable people to access, impart, and disseminate information.
Question put and agreed to.
Clause 51 accordingly ordered to stand part of the Bill.
Clause 52
“Illegal content” etc
I thank right hon. and hon. Members who have participated in the debate on this extremely important clause. It is extremely important because the Bill’s strongest provisions relate to illegal content, and the definition of illegal content set out in the clause is the starting point for those duties.
A number of important questions have been asked, and I would like to reply to them in turn. First, I want to speak directly to amendment 61, which was moved by the shadow Minister and which very reasonably and quite rightly asks where in the world a criminal offence physically takes place. She rightly said that violence against children, for example, may happen somewhere else in the world but be transmitted on the internet here in the United Kingdom. On that, I can point to an existing provision in the Bill that does exactly what she wants. Clause 52(9), which appears about two thirds of the way down page 49 of the Bill, states:
“For the purposes of determining whether content amounts to an offence, no account is to be taken of whether or not anything done in relation to the content takes place in any part of the United Kingdom.”
What that is saying is that it does not matter whether the act of concern takes place physically in the United Kingdom or somewhere else, on the other side of the world. That does not matter in looking at whether something amounts to an offence. If it is criminal under UK law but it happens on the other side of the world, it is still in scope. Clause 52(9) makes that very clear, so I think that that provision is already doing what the shadow Minister’s amendment 61 seeks to do.
The shadow Minister asked a second question about the definition of illegal content, whether it involves a specific act and how it interacts with the “systems and processes” approach that the Bill takes. She is right to say that the definition of illegal content applies item by item. However, the legally binding duties in the Bill, which we have already debated in relation to previous clauses, apply to categories of content and to putting in place “proportionate systems and processes”—I think that that is the phrase used. Therefore, although the definition is particular, the duty is more general, and has to be met by putting in place systems and processes. I hope that my explanation provides clarification on that point.
The shadow Minister asked another question about the precise definitions of how the platforms are supposed to decide whether content meets the definition set out. She asked, in particular, questions about how to determine intent—the mens rea element of the offence. She mentioned that Ofcom had had some comments in that regard. Of course, the Government are discussing all this closely with Ofcom, as people would expect. I will say to the Committee that we are listening very carefully to the points that are being made. I hope that that gives the shadow Minister some assurance that the Government’s ears are open on this point.
The next and final point that I would like to come to was raised by all speakers in the debate, but particularly by my right hon. Friend the Member for Basingstoke, and is about violence against women and girls—an important point that we have quite rightly debated previously and come to again now. The first general point to make is that clause 52(4)(d) makes it clear that relevant offences include offences where the intended victim is an individual, so any violence towards and abuse of women and girls is obviously included in that.
As my right hon. Friend the Member for Basingstoke and others have pointed out, women suffer disproportionate abuse and are disproportionately the victims of criminal offences online. The hon. Member for Aberdeen North pointed out how a combination of protected characteristics can make the abuse particularly impactful—for example, if someone is a woman and a member of a minority. Those are important and valid points. I can reconfirm, as I did in our previous debate, that when Ofcom drafts the codes of practice on how platforms can meet their duties, it is at liberty to include such considerations. I echo the words spoken a few minutes ago by my right hon. Friend the Member for Basingstoke: the strong expectation across the House—among all parties here—is that those issues will be addressed in the codes of practice to ensure that those particular vulnerabilities and those compounded vulnerabilities are properly looked at by social media firms in discharging those duties.
My right hon. Friend also made points about intimate image abuse when the intimate images are made without the consent of the subject—the victim, I should say. I would make two points about that. The first relates to the Bill and the second looks to the future and the work of the Law Commission. On the Bill, we will come in due course to clause 150, which relates to the new harmful communications offence, and which will criminalise a communication—the sending of a message—when there is a real and substantial risk of it causing harm to the likely audience and there is intention to cause harm. The definition of “harm” in this case is psychological harm amounting to at least serious distress.
Clearly, if somebody is sending an intimate image without the consent of the subject, it is likely that that will cause harm to the likely audience. Obviously, if someone sends a naked image of somebody without their consent, that is very likely to cause serious distress, and I can think of few reasons why somebody would do that unless it was their intention, meaning that the offence would be made out under clause 150.
My right hon. Friend has strong feelings, which I entirely understand, that to make the measure even stronger, the test should not involve intent at all but should simply be a question of consent: was there consent or not? If there was no consent, an offence would have been committed, without any need to go on to establish intention, as clause 150 requires. As my right hon. Friend has said, Law Commission proposals are being developed. My understanding is that the Ministry of Justice, which is the Department responsible for this offence, expects to receive the final report over the summer. It would then clearly be open to Parliament to legislate to put the offence into law, I hope as quickly as possible.
Once that happens, through whichever legislative vehicle, it will have two implications. First, the offence will automatically and immediately be picked up by clause 52(4)(d) and brought within the scope of the Bill because it is an offence where the intended victim is an individual. Secondly, there will be a power for the Secretary of State and for Parliament, through clause 176, I think—I am speaking from memory; yes, it is clause 176, not that I have memorised every clause in the Bill—via statutory instrument not only to bring the offence into the regular illegal safety duties, but to add it to schedule 7, which contains the priority offences.
Once that intimate image abuse offence is in law, via whichever legislative vehicle, that will have that immediate effect with respect to the Bill, and by statutory instrument it could be made a priority offence. I hope that gives my right hon. Friend a clear sense of the process by which this is moving forward.
I thank the Minister for such a clear explanation of his plan. Can he confirm that the Bill is a suitable legislative vehicle? I cannot see why it would not be. I welcome his agreement about the need for additional legislation over and above the communications offence. In the light of the way that nudification software and deepfakes are advancing, and the challenges that our law enforcement agencies have in interpreting those quite complex notions, a straightforward law making it clear that publishing such images is a criminal offence would not only help law enforcement agencies, but would help the perpetrators to understand that what they are doing is a crime and that they should stop.
As always, the right hon. Lady makes an incredibly powerful point. She asked specifically about whether the Bill is a suitable legislative vehicle in which to implement any Law Commission recommendations—we do not yet have the final version of that report—and I believe that that would be in scope. A decision about legislative vehicles depends on the final form of the Law Commission report and the Ministry of Justice response to it, and on cross-Government agreement about which vehicle to use.
I hope that addresses all the questions that have been raised by the Committee. Although the shadow Minister is right to raise the question, I respectfully ask her to withdraw amendment 61 on the basis that those matters are clearly covered in clause 52(9). I commend the clause to the Committee.
I am grateful to the Minister for his comments. The Labour party has concerns that clause 52(9) does not adequately get rid of the ambiguity around potential illegal online content. We feel that amendment 61 sets that out very clearly, which is why we will press it to a vote.
Just to help the Committee, what is it in clause 52(9) that is unclear or ambiguous?
We just feel that amendment 61 outlines matters much more explicitly and leaves no ambiguity, because it clearly brings
“offences committed overseas within the scope of relevant offences for the purposes of defining illegal content.”
I think they say the same thing, but we obviously disagree.
Question put, That the amendment be made.
(2 years, 5 months ago)
Public Bill Committees
I beg to move amendment 90, in schedule 7, page 185, line 39, at end insert—
“Human trafficking
22A An offence under section 2 of the Modern Slavery Act 2015.”
This amendment would designate human trafficking as a priority offence.
Our amendment seeks to deal explicitly with what Meta and other companies refer to as “domestic servitude”, which we know better as human trafficking. This abhorrent practice has sadly been part of our society for hundreds if not thousands of years, and today, human traffickers are aided by various apps and platforms. The same platforms that connect us with old friends and family across the globe have been hijacked by the very worst people in our world, who are using them to create networks of criminal enterprise, none more cruel than human trafficking.
Investigations by the BBC and The Wall Street Journal have uncovered how traffickers use Instagram, Facebook and WhatsApp to advertise, sell, and co-ordinate the trafficking of young women. One would think that this issue would be of the utmost importance to Meta—Facebook, as it was at the time—yet, as the BBC reported,
“the social media giant only took ‘limited action’ until ‘Apple Inc. threatened to remove Facebook’s products from the App Store, unless it cracked down on the practice’.”
Those of us who have sat on the DCMS Committee and the Joint Committee on the draft Bill—I and my friends across the aisle, the hon. Members for Wolverhampton North East and for Watford—know exactly what it is like to have Facebook’s high heid yins before you. They will do absolutely nothing to respond to legitimate pressure. They understand only one thing: the force of law and of financial penalty. Only when its profits were in danger did Meta take the issue seriously.
The omission of human trafficking from schedule 7 is especially worrying because if it is not directly addressed as priority illegal content, we can be certain that it will not be prioritised by the platforms. We know that from their previous behaviour.
As I have indicated already, I do not propose that we have a clause stand part debate. It has been exhaustively debated, if I may say so.
Clause 54 ordered to stand part of the Bill.
Clause 55
Regulations under sections 53 and 54
(2 years, 5 months ago)
Public Bill Committees
I beg to move amendment 127, in clause 69, page 60, line 26, after “must” insert—
“within six months of this Act being passed”.
As ever, it is a pleasure to serve under your chairship, Sir Roger. The thoughts and prayers of us all are with my hon. Friend the Member for Batley and Spen and all her friends and family.
Labour welcomes the clause, which sets out Ofcom’s duties to provide guidance to providers of internet services. It is apparent, however, that we cannot afford to kick the can down the road and delay implementation of the Bill any further than necessary. With that in mind, I urge the Minister to support the amendment, which would give Ofcom an appropriate amount of time to produce this important guidance.
It is a pleasure, once again, to serve under your august chairmanship, Sir Roger. I associate the Government with the remarks that you and the shadow Minister made, marking the anniversary of Jo Cox’s appalling murder, which shook the entire House when it happened. She will never be forgotten.
The Government are sympathetic to the intent of the amendment, which seeks to ensure that guidance for providers on protecting children from online pornography is put in place as quickly as possible. We of course sympathise with that objective, but we feel that the Secretary of State must retain the power to determine when to bring in the provisions of part 5, including the requirement under the clause for Ofcom to produce guidance, to ensure that implementation of the framework comprehensively and effectively regulates all forms of pornography online. That is the intention of the whole House and of this Committee.
Ofcom needs appropriate time and flexibility to get the guidance exactly right. We do not want to rush it and consequently see loopholes, which pornography providers or others might seek to exploit. As discussed, we will be taking a phased approach to bringing duties under the Bill into effect. We expect prioritisation for the most serious harms as quickly as possible, and we expect the duties on illegal content to be focused on most urgently. We have already accelerated the timescales for the most serious harms by putting priority illegal content in the various schedules to the Bill.
Ofcom is working hard to prepare for implementation. We are all looking forward to the implementation road map, which it has committed to produce before the summer. For those reasons, I respectfully resist the amendment.
Question put, That the amendment be made.
I have just a short comment on these clauses. I very much applaud the Government’s approach to the funding of Ofcom through this mechanism. Clause 75 sets out clearly that the fees payable to Ofcom under section 71 should only be
“sufficient to meet, but…not exceed the annual cost to OFCOM”.
That is important when we start to think about victim support. While clearly Ofcom will have a duty to monitor the efficacy of the mechanisms in place on social media platforms, it is not entirely clear to me from the evidence or conversations with Ofcom whether it will see it as part of its duty to ensure that other areas of victim support are financed through those fees.
It may well be that the Minister thinks it more applicable to look at this issue when we consider the clauses on fines, and I plan to come to it at that point, but it would be helpful to understand whether he sees any role for Ofcom in ensuring that there is third-party specialist support for victims of all sorts of crime, including fraud or sexual abuse.
Let me start by associating myself with the remarks by the hon. Member for Worsley and Eccles South. We are in complete concurrence with the concept that the polluter should pay. Where there are regulatory costs caused by the behaviour of the social media firms that necessitates the Bill, it is absolutely right that those costs should fall on them and not on the general taxpayer. I absolutely agree with the principles that she outlined.
The hon. Lady raised a question about clause 70(6) and the potential exemption from the obligation to pay fees. That is a broadly drawn power, and the phrasing used is where
“OFCOM consider that an exemption…is appropriate”
and where the Secretary of State agrees. The Bill is not being prescriptive; it is intentionally providing flexibility in case there are circumstances where levying the fees might be inappropriate or, indeed, unjust. It is possible to conceive of an organisation that somehow exceeds the size threshold, but so manifestly does not need regulation that it would be unfair or unjust to levy the fees. For example, if a charity were, by some accident of chance, to fall into scope, it might qualify. But we expect social media firms to pay these bills, and I would not by any means expect the exemption to be applied routinely or regularly.
On the £88 million and the £110 million that have been referenced, the latter amount covers the three-year spending review period: the current financial year, 2022-23, plus 2023-24 and 2024-25. Of that £110 million, £88 million is allocated to Ofcom in the first two financial years; the remainder is allocated to DCMS for its work over the three-year period of the spending review. The £88 million for Ofcom runs out at the end of 2023-24.
The hon. Lady then asked whether the statutory fees in these clauses will kick in when the £88 million runs out—whether they will be available in time. The answer is yes. We expect and intend that the fees we are debating will become effective in 2024-25, so they will pick up where the £88 million finishes.
Ofcom will set the fees at a level that recoups its costs, so if the Bill becomes larger in scope, for example through amendments in the Commons or the Lords—not that I wish to encourage amendments—and the duties on Ofcom expand, we would expect the fees to be increased commensurately to cover any increased cost that our legislation imposes.
Before the Minister gets past this point—I think he has reached the point of my question—the fees do not kick in for two years. The figure is £88 million, but the point I was making is that the scope of the Bill has already increased. I asked about this during the evidence session with Ofcom. Fraudulent advertising was not included before, so there are already additional powers for Ofcom that need to be funded. I was questioning whether the original estimate will be enough for those two years.
That covers the preparatory work rather than the actual enforcement work that will follow. For the time being, we believe that it is enough, but of course we always maintain an active dialogue with Ofcom.
Finally, there was a question from my right hon. Friend the Member for Basingstoke, who asked how victims will be supported and compensated. As she said, Ofcom will always pay attention to victims in its work, but we should make it clear that the fees we are debating in these clauses are designed to cover only Ofcom’s costs and not those of third parties. I think the costs of victim support and measures to support victims are funded separately via the Ministry of Justice, which leads in this area. I believe that a victims Bill is being prepared that will significantly enhance the protections and rights that victims have—something that I am sure all of us will support.
Question put and agreed to.
Clause 70 accordingly ordered to stand part of the Bill.
Clauses 71 to 76 ordered to stand part of the Bill.
Clause 77
General duties of OFCOM under section 3 of the Communications Act
Question proposed, That the clause stand part of the Bill.
We welcome clause 77, which is an important clause that seeks to amend Ofcom’s existing general duties in the Communications Act 2003. Given the prevalence of illegal harms online, as we discussed earlier in proceedings, it is essential that the Communications Act is amended to reflect the important role that Ofcom will have as a new regulator.
As the Minister knows, and as we will discuss shortly when we reach amendments to clause 80, we have significant concerns about the Government’s approach to size versus harm when categorising service providers. Clause 77(4) amends section 3 of the Communications Act by inserting new subsection (4A). New paragraph (4A)(d) outlines measures that are proportionate to
“the size or capacity of the provider”,
and to
“the level of risk of harm presented by the service in question, and the severity of the potential harm”.
We know that harm, and the potential to access harmful content, is what is most important in the Bill—it says it in the name—so I am keen for my thoughts on the entire categorisation process to be known early on, although I will continue to press this issue with the Minister when we debate the appropriate clause.
Labour also supports clause 78. It is vital that Ofcom will have a duty to publish its proposals on strategic priorities within a set time period, and ensuring that that statement is published is a positive step towards transparency, which has been so crucially missing for far too long.
Similarly, Labour supports clause 79, which contains a duty to carry out impact assessments. That is vital, and it must be conveyed in the all-important Communications Act.
As the shadow Minister has set out, these clauses ensure that Ofcom’s duties under the Communications Act 2003 are updated to reflect the new duties that we are asking it to undertake—I think that is fairly clear from the clauses. On the shadow Minister’s comment about size and risk, I note her views and look forward to debating that more fully in a moment.
Question put and agreed to.
Clause 77 accordingly ordered to stand part of the Bill.
Clauses 78 and 79 ordered to stand part of the Bill.
Clause 80
Meaning of threshold conditions etc
Question proposed, That the clause stand part of the Bill.
With this it will be convenient to discuss the following:
Amendment 80, in schedule 10, page 192, line, at end insert—
“(c) the assessed risk of harm arising from that part of the service.”
This amendment, together with Amendments 81 and 82, widens Category 1 to include those services which pose a very high risk of harm, regardless of the number of users.
Amendment 81, in schedule 10, page 192, line 39, after “functionality” insert—
“and at least one specified condition about the assessed risk of harm”
This amendment is linked to Amendment 80.
Amendment 82, in schedule 10, page 192, line 41, at end insert—
‘(4A) At least one specified condition about the assessed risk of harm must provide for a service assessed as posing a very high risk of harm to its users to meet the Category 1 threshold.”
This amendment is linked to Amendment 80; it widens Category 1 to include those services which pose a very high risk of harm, regardless of the number of users.
That schedule 10 be the Tenth schedule to the Bill.
Clause 81 stand part.
Clause 82 stand part.
I completely agree with my hon. Friend. The evidence we heard from Danny Stone from the Antisemitism Policy Trust clearly outlined the real-world harm that legal but harmful content causes. Such content may be legal, but it causes mass casualties and harm in the real world.
There are ways that we can rectify that in the Bill. Danny Stone set them out in his evidence and the SNP amendments, which the Labour Front Bench supports wholeheartedly, outline them too. I know the Minister wants to go further; he has said as much himself to this Committee and on the Floor of the House. I urge him to support some of the amendments, because it is clear that such changes can save lives.
Schedule 10 outlines the regulations specifying threshold conditions for categories of part 3 services. Put simply, as the Minister knows, Labour has concerns about the Government’s plans to allow thresholds for each category to be set out in secondary legislation. As we have said before, the Bill has already faced significant delays at the hands of the Government and we have real concerns that a reliance on secondary legislation further kicks the can down the road.
We also have concerns that the current system of categorisation is inflexible in so far as we have no understanding of how it will work if a service is required to shift from one category to another, and how long that would take. How exactly will that work in practice? Moreover, how long would Ofcom have to preside over such decisions?
We all know how quickly the online space moves, with new technologies and ways of functioning appearing all the time. Will the Minister clarify how he expects the re-categorisation process to occur in practice? The Minister must accept that his Department has been tone deaf on this point. Rather than an arbitrary size cut-off, the regulator must use risk levels to determine which category a platform should fall into so that harmful and dangerous content does not slip through the net.
Labour welcomes clause 81, which sets out Ofcom’s duties in establishing a register of categories of certain part 3 services. As I have repeated throughout the passage of the Bill, having a level of accountability and transparency is central to its success. However, we have slight concerns that the wording in subsection (1), which stipulates that the register be established
“as soon as reasonably practicable”,
could be ambiguous and does not give us the certainty we require. Given the huge amount of responsibility the Bill places on Ofcom, will the Minister confirm exactly what he believes the stipulation means in practice?
Finally, we welcome clause 82. It clarifies that Ofcom has a duty to maintain the all-important register. However, we share the same concerns I previously outlined about the timeframe in which Ofcom will be compelled to make such changes. We urge the Minister to move as quickly as he can and to press Ofcom to do all it can to make these vital changes.
As we have heard, the clauses set out how different platforms will be categorised with the purpose of ensuring duties are applied in a reasonable and proportionate way that avoids over-burdening smaller businesses. However, it is worth being clear that the Online Safety Bill, as drafted, requires all in-scope services, regardless of their user size, to take action against content that is illegal and where it is necessary to protect children. It is important to re-emphasise the fact that there is no size qualification for the illegal content duties and the duties on the protection of children.
It is also important to stress that under schedule 10 as drafted there is flexibility, as the shadow Minister said, for the Secretary of State to change the various thresholds, including the size threshold, so there is an ability, if it is considered appropriate, to lower the size thresholds in such a way that more companies come into scope, if that is considered necessary.
It is worth saying in passing that we want these processes to happen quickly. Clearly, it is a matter for Ofcom to work through the operations of that, but our intention is that this will work quickly. In that spirit, in order to limit any delays to the process, Ofcom can rely on existing research, if that research is fit for purpose under schedule 10 requirements, rather than having to do new research. That will greatly assist moving quickly, because the existing research is available off the shelf immediately, whereas commissioning new research may take some time. For the benefit of Hansard and people who look at this debate for the application of the Bill, it is important to understand that that is Parliament’s intention.
I will turn to the points raised by the hon. Member for Aberdeen North and the shadow Minister about platforms that may be small and fall below the category 1 size threshold but that are none the less extremely toxic, owing to the way that they are set up, their rules and their user base. The shadow Minister mentioned several such platforms. I have had meetings with the stakeholders that she mentioned, and we heard their evidence. Other Members raised this point on Second Reading, including the right hon. Member for Barking (Dame Margaret Hodge) and my hon. Friend the Member for Brigg and Goole (Andrew Percy). As the hon. Member for Aberdeen North said, I signalled on Second Reading that the Government are listening carefully, and our further work in that area continues at pace.
I am not sure that amendment 80 as drafted would necessarily have the intended effect. Proposed new sub-paragraph (c) to schedule 10(1) would add a risk condition, but the conditions in paragraph (1) are applied with “and”, so they must all be met. My concern is that the size threshold would still apply, and that this specific drafting of the amendment would not have the intended effect.
We will not accept the amendments as drafted, but as I said on Second Reading, we have heard the representations—the shadow Minister and the hon. Member for Aberdeen North have made theirs powerfully and eloquently—and we are looking carefully at those matters. I hope that provides some indication of the Government’s thinking. I thank the stakeholders who engaged and provided extremely valuable insight on those issues. I commend the clause to the Committee.
I thank the Minister for his comments. I still think that such platforms are too dangerous not to be subject to more stringent legislation than similar-sized platforms. For the Chair’s information, I would like to press amendment 80 to a vote. If it falls, I will move straight to pressing amendment 82 to a vote, missing out amendment 81. Does that make sense, Chair, and is it possible?
As I indicated, that means that amendments 81 and 82 now fall. Just for the hon. Lady’s information, ordinarily, where an amendment has been moved in Committee, it would not be selected to be moved on the Floor of the House on Report. However, the Minister has indicated that he is minded to look at this again. If, of course, the Government choose to move an amendment on Report, that then would be put to the House.
I did not indicate at the start of the debate that I would take the clause stand part and clause 84 stand part together, but I am perfectly relaxed about it and very happy to do so, as the hon. Lady has spoken to them. If any other colleague wishes to speak to them, that is fine by me.
Perhaps I might start with amendment 34, which the shadow Minister just spoke to. We agree that it is very important to consider the risks posed to victims who are outside of the territory of the United Kingdom. However, for the reasons I will elaborate on, we believe that the Bill as drafted achieves that objective already.
First, just to remind the Committee, the Bill already requires companies to put in place proportionate systems and processes to prevent UK users from encountering illegal content. Critically, that includes where a UK user creates illegal content via an in-scope platform, but where the victim is overseas. Let me go further and remind the Committee that clause 9 requires platforms to prevent UK users from encountering illegal content no matter where that content is produced or published. The word “encounter” is very broadly defined in clause 189 as meaning
“read, view, hear or otherwise experience content”.
As such, it will cover a user’s contact with any content that they themselves generate or upload to a service.
Critically, there is another clause, which we have discussed previously, that is very important in the context of overseas victims, which the shadow Minister quite rightly raises. The Committee will recall that subsection (9) of clause 52, which is the important clause that defines illegal content, makes it clear that that content does not have to be generated, uploaded or accessed in the UK, or indeed to have anything to do with the UK, in order to count as illegal content towards which the company has duties, including risk assessment duties. Even if the illegal act—for example, sexually abusing a child—happens in some other country, not the UK, it still counts as illegal content under the definitions in the Bill because of clause 52(9). It is very important that those duties will apply to that circumstance. To be completely clear, if an offender in the UK uses an in-scope platform to produce content where the victim is overseas, or to share abuse produced overseas with other UK users, the platform must tackle that, both through its risk assessment duties and its other duties.
As such, the entirely proper intent behind amendment 34 is already covered by the Bill as drafted. The shadow Minister, the hon. Member for Pontypridd, has already referred to the underlying purpose of clauses 83 and 84. As we discussed before, the risk assessments are central to the duties in the Bill. It is essential that Ofcom has a proper picture of the risks that will inform its various regulatory activities, which is why these clauses are so important. Clause 84 requires Ofcom to produce guidance to services to make sure they are carrying out those risk assessments properly, because it is no good having a token risk assessment or one that does not properly deal with the risks. The guidance published under clause 84 will ensure that happens. As such, I will respectfully resist amendment 34, on the grounds that its contents are already covered by the Bill.
I am grateful for the Minister’s clarification. Given his assurances that its contents are already covered by the Bill, I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Clause 83 ordered to stand part of the Bill.
Clause 84 ordered to stand part of the Bill.
Clause 85
Power to require information
I want to make a brief comment echoing the shadow Minister’s welcome for the inclusion of senior managers and named people in the Bill. I agree that that level of personal liability and responsibility is the only way that we will be able to hold some of these incredibly large, unwieldy organisations to account. If they could wriggle out of this by saying, “It’s somebody else’s responsibility,” and if everyone then disagreed about whose responsibility it was, we would be in a much worse place, so I also support the inclusion of these clauses and schedule 11.
I am delighted by the strong support that these clauses have received from across the aisle. I hope that proves to be a habit-forming development.
On the shadow Minister’s point about publishing the risk assessments, to repeat the point I made a few days ago, under clause 64, which we have already debated, Ofcom has the power—indeed, the obligation—to compel publication of transparency reports that will make sure that the relevant information sees the light of day. I accept that publication is important, but we believe that objective is achieved via the transparency measures in clause 64.
On the point about senior management liability, which again we debated near the beginning of the Bill, we believe—I think we all agree—that this is particularly important for information disclosure. We had the example, as I mentioned at the time, of one of the very large companies refusing to disclose information to the Competition and Markets Authority in relation to a competition matter and simply paying a £50 million fine rather than complying with the duties. That is why criminal liability is so important here in relation to information disclosure.
To reassure the shadow Minister, on the point about when that kicks in, it was in the old version of the Bill, but potentially did not commence for two years. In this new version, updated following our extensive and very responsive listening exercise—I am going to get that in every time—the commencement of this particular liability is automatic and takes place very shortly after Royal Assent. The delay and review have been removed, for the reason the hon. Lady mentioned, so I am pleased to confirm that to the Committee.
The shadow Minister described many of the provisions. Clause 85 gives Ofcom powers to require information, clause 86 gives the power to issue notices and clause 87 the important power to require an entity to name that relevant senior manager, so they cannot wriggle out of their duty by not providing the name. Clause 88 gives the power to require companies to undergo a report from a so-called skilled person. Clause 89 requires full co-operation with Ofcom when it opens an investigation, where co-operation has been sadly lacking in many cases to date. Clause 90 requires people to attend an interview, and the introduction to schedule 11 allows Ofcom to enter premises to inspect or audit the provider. These are very powerful clauses and will mean that social media companies can no longer hide in the shadows from the scrutiny they so richly deserve.
Question put and agreed to.
Clause 85 accordingly ordered to stand part of the Bill.
Clauses 86 to 91 ordered to stand part of the Bill.
Schedule 11
OFCOM’s powers of entry, inspection and audit
Amendment made: 4, in schedule 11, page 202, line 17, leave out
“maximum summary term for either-way offences”
and insert
“general limit in a magistrates’ court”.—(Chris Philp.)
Schedule 11, as amended, agreed to.
Clause 92
Offences in connection with information notices
Question proposed, That the clause stand part of the Bill.
The Minister will be pleased to hear that we, again, support these clauses. We absolutely support the Bill’s aims to ensure that information offences and penalties are strong enough to dissuade non-compliance. However, as we have said repeatedly, we feel that the current provisions are lacking.
As it stands, senior managers can be held criminally liable only for technical offences, such as failing to supply information to the regulator. I am grateful that the Minister has confirmed that the measures will come into force with immediate effect following Royal Assent, rather than waiting two years. That is welcome news. The Government should require that top bosses at social media companies be criminally liable for systemic and repeated failures on online safety, and I am grateful for the Minister’s confirmation on that point.
As these harms are allowed to perpetuate, tech companies cannot continue to get away without penalty. Will the Minister confirm why the Bill does not include further penalties, in the form of criminal offences, should a case of systemic and repeated failures arise? Labour has concerns that, without stronger powers, Ofcom may not feel compelled or equipped to sanction those companies who are treading the fine line of doing just enough to satisfy the requirements outlined in the Bill as it stands.
Labour also welcomes clause 93, which sets out the criminal offences that can be committed by named senior managers in relation to their entity’s information obligations. It establishes that senior managers who are named in a response to an information notice can be held criminally liable for failing to prevent the relevant service provider from committing an information offence. Senior managers can only be prosecuted under the clause where the regulated provider has already been found liable for failing to comply with Ofcom’s information request. As I have already stated, we feel that this power needs to go further if we are truly to tackle online harm. For far too long, those at the very top have known about the harm that exists on their platforms, but they have failed to take action.
Labour supports clause 94 and we have not sought to amend it at this stage. It is vital that provisions are laid in the Bill, such as those in subsection (3), which specify actions that a person may take to commit an offence of this nature. We all want to see the Bill keep people safe online, and at the heart of doing so is demanding a more transparent approach from those in Silicon Valley. My hon. Friend the Member for Worsley and Eccles South made an excellent case for the importance of transparency earlier in the debate but, as the Minister knows, and as I have said time and again, the offences must go further than just applying to simple failures to provide information. We must consider a systemic approach to harm more widely, and that goes far beyond simple information offences.
There is no need to repeat myself. Labour supports the need for clause 95 as it stands and we support clause 96, which is in line with penalties for other information offences that already exist.
I am delighted to discover that agreement with the Government’s clauses continues to provoke a tsunami of unanimity across the Committee. I sense a gathering momentum behind these clauses.
As the shadow Minister mentioned, the criminal offences here are limited to information provision and disclosure. We have debated the point before. The Government’s feeling is that going beyond the information provision into other duties for criminal liability would potentially go a little far and have a chilling effect on the companies concerned.
Also, the fines that can be levied—10% of global revenue—run into billions of pounds, and there are the denial of service provisions, where a company can essentially be disconnected from the internet in extreme cases; these do provide more than adequate enforcement powers for the other duties in the Bill. The information duties are so fundamental—that is why personal criminal liability is needed. Without the information, we cannot really make any further assessment of whether the duties are being met.
The shadow Minister has set out what the other clauses do: clause 92 creates offences; clause 93 introduces senior managers’ liability; clause 94 sets out the offences that can be committed in relation to audit notices issued by Ofcom; clause 95 creates offences for intentionally obstructing or delaying a person exercising Ofcom’s power; and clause 96 sets out the penalties for the information offences set out in the Bill, which of course include a term of imprisonment of up to two years. Those are significant criminal offences, which I hope will make sure that executives working for social media firms properly discharge those important duties.
Question put and agreed to.
Clause 92 accordingly ordered to stand part of the Bill.
Clauses 93 to 95 ordered to stand part of the Bill.
Clause 96
Penalties for information offences
Amendment made: 2, in clause 96, page 83, line 15, leave out
“maximum summary term for either-way offences”
and insert
“general limit in a magistrates’ court”—(Chris Philp.)
Clause 96, as amended, ordered to stand part of the Bill.
Clause 97
Co-operation and disclosure of information: overseas regulators
Question proposed, That the clause stand part of the Bill.
I am delighted that support for the Government’s position on the clauses continues and that cross-party unanimity is taking an ever stronger hold. I am sure the Whips Office will find that particularly reassuring.
The shadow Minister asked a question about clause 100. Clause 100 amends section 24B of the Communications Act 2003, which allows Ofcom to provide information to the Secretary of State to assist with the formulation of policy. She asked me to clarify what that means, which I am happy to do. In most circumstances, Ofcom will be required to obtain the consent of providers in order to share information relating to their business. This clause sets out two exceptions to that principle. If the information required by the Secretary of State was obtained by Ofcom to determine the proposed fees threshold, or in response to potential threats to national security or to the health or safety of the public, the consent of the business is not required. In those instances, it would obviously not be appropriate to require the provider’s consent.
It is important that users of regulated services are kept informed of developments around online safety and the operation of the regulatory framework.
This specifically relates to the Secretary of State, but would the Minister expect both Ofcom and his Department to be working with the Scottish Government and the Northern Ireland Executive? I am not necessarily talking about sharing all the information, but where there are concerns that it is very important for those jurisdictions to be aware of, will he try to ensure that he has a productive relationship with both devolved Administrations?
I thank the hon. Member for her question. Where the matter being raised or disclosed touches on matters of devolved competence—devolved authority—then yes, I would expect that consultation to take place. Matters concerning the health and safety of the public are entirely devolved, I think, so I can confirm that in those circumstances it would be appropriate for the Secretary of State to share information with devolved Administration colleagues.
The shadow Minister has eloquently, as always, touched on the purpose of the various other clauses in this group. I do not wish to try the patience of the Committee, particularly as lunchtime approaches, by repeating what she has ably said already, so I will rest here and simply urge that these clauses stand part of the Bill.
Question put and agreed to.
Clause 97 accordingly ordered to stand part of the Bill.
Clauses 98 to 102 ordered to stand part of the Bill.
Ordered, That further consideration be now adjourned. —(Steve Double.)
(2 years, 5 months ago)
Public Bill Committees
I have a few questions, concerns and suggestions relating to these clauses. I think it was the hon. Member for Don Valley who asked me last week about the reports to the National Crime Agency and how that would work—about how, if a human was not checking those things, there would be an assurance that proper reports were being made, and that scanning was not happening and reports were not being made when images were totally legal and there was no problem with them. [Interruption.] I thought it was the hon. Member for Don Valley, although it may not have been. Apologies—it was a Conservative Member. I am sorry for misnaming the hon. Member.
The hon. Member for Pontypridd made a point about the high level of accuracy of the technologies. That should give everybody a level of reassurance that the reports that are and should be made to the National Crime Agency on child sexual abuse images will be made on a highly accurate basis, rather than a potentially inaccurate one. Actually, some computer technology—particularly for scanning for images, rather than text—is more accurate than human beings. I am pleased to hear those particular statistics.
Queries have been raised on this matter by external organisations—I am particularly thinking about the NSPCC, which we spoke about earlier. The Minister has thankfully given a number of significant reassurances about the ability to proactively scan. External organisations such as the NSPCC are still concerned that there is not enough on the face of the Bill about proactive scanning and ensuring that the current level of proactive scanning is able—or required—to be replicated when the Bill comes into force.
During an exchange in an earlier Committee sitting, the Minister gave a commitment—I am afraid I do not have the quote—to being open to looking at amending clause 103. I am slightly disappointed that there are no Government amendments, but I understand that there has been only a fairly short period; I am far less disappointed than I was previously, when the Minister had much more time to consider the actions he might have been willing to take.
The suggestion I received from the NSPCC is about the gap in the Bill regarding the ability of Ofcom to take action. These clauses allow Ofcom to take action against individual providers about which it has concerns; those providers will have to undertake duties set out by Ofcom. The NSPCC suggests that there could be a risk register, or that a notice could be served on a number of companies at one time, rather than Ofcom simply having to pick one company, or to repeatedly pick single companies and serve notices on them. Clause 83 outlines a register of risk profiles that must be created by Ofcom. It could therefore serve notice on all the companies that fall within a certain risk profile or all the providers that have common functionalities.
If there were a new, emerging concern, that would make sense. Rather than Ofcom having to go through the individual process with all the individual providers when it knows that there is common functionality—because of the risk assessments that have been done and Ofcom’s oversight of the different providers—it could serve notice on all of them in one go. It could not then accidentally miss one out and allow people to move to a different platform that had not been mentioned. I appreciate the conversation we had around this issue earlier, and the opportunity to provide context in relation to the NSPCC’s suggestions, but it would be great if the Minister would be willing to consider them.
I have another question, to which I think the Minister will be able to reply in the affirmative, which is on the uses of the technology as it evolves. We spoke about that in an earlier meeting. The technology that we have may not be what we use in the future to scan for terrorist-related activity or child sexual abuse material. It is important that the Bill adequately covers future conditions. I think that it does, but will the Minister confirm that, as technology advances and changes, these clauses will adequately capture the scanning technologies that are required, and any updates in the way in which platforms work and we interact with each other on the internet?
I have fewer concerns about future-proofing with regard to these provisions, because I genuinely think they cover future conditions, but it would be incredibly helpful and provide me with a bit of reassurance if the Minister could confirm that. I very much look forward to hearing his comments on clause 103.
Let me start by addressing some questions raised by hon. Members, beginning with the last point made by the hon. Member for Aberdeen North. She sought reconfirmation that the Bill will keep up with future developments in accredited technology that are not currently contemplated. The answer to her question can be found in clause 105(9), in which the definition of accredited technology is clearly set out, as technology that is
“accredited (by OFCOM or another person appointed by OFCOM) as meeting minimum standards of accuracy”.
That is not a one-off determination; it is a determination, or an accreditation, that can happen from time to time, periodically or at any point in the future. As and when new technologies emerge that meet the minimum standards of accuracy, they can be accredited, and the power in clause 103 can be used to compel platforms to use those technologies. I hope that provides the reassurance that the hon. Member was quite rightly asking for.
The shadow Minister, the hon. Member for Pontypridd, asked a related question about the process for publishing those minimum standards. The process is set out in clause 105(10), which says that Ofcom will give advice to the Secretary of State on the appropriate minimum standards, and the minimum standards will then be
“approved…by the Secretary of State, following advice from OFCOM.”
We are currently working with Ofcom to finalise the process for setting those standards, which of course will need to take a wide range of factors into account.
Let me turn to the substantive clauses. Clause 103 is extremely important, because as we heard in the evidence sessions and as Members of the Committee have said, scanning messages using technology such as hash matching, to which the shadow Minister referred, is an extremely powerful way of detecting CSEA content and providing information for law enforcement agencies to arrest suspected paedophiles. I think it was in the European Union that Meta—particularly Facebook and Facebook Messenger—stopped this scanning for a short period of time due to misplaced concerns about privacy laws, and the number of referrals of CSEA images and the number of potential paedophiles who were referred to law enforcement dropped dramatically.
A point that the hon. Member for Aberdeen North and I have discussed previously is that it would be completely unacceptable if a situation arose whereby these messages—I am thinking particularly about Facebook Messenger—did not get scanned for CSEA content in a way that they do get scanned today. When it comes to preventing child sexual exploitation and abuse, in my view there is no scope for compromise or ambiguity. That scanning is happening at the moment; it is protecting children on a very large scale and detecting paedophiles on quite a large scale. In my view, under no circumstances should that scanning be allowed to stop. That is the motivation behind clause 103, which provides Ofcom with the power to make directions to require the use of accredited technology.
As the hon. Member for Aberdeen North signalled in her remarks, given the importance of this issue the Government are of course open to thinking about ways in which the Bill can be strengthened if necessary, because we do not want to leave any loopholes. I urge any social media firms watching our proceedings never to take any steps that degrade or reduce the ability to scan for CSEA content. I thank the hon. Member for sending through the note from the NSPCC, which I have received and will look at internally.
I echo the sentiments that have been expressed by the shadow Minister, and thank her and her colleagues for tabling this amendment and giving voice to the numerous organisations that have been in touch with us about this matter. The Scottish National party is more than happy to support the amendment, which would make the Bill stronger and better, and would better enable Ofcom to take action when necessary.
I understand the spirit behind these amendments, focusing on the word “presence” rather than “prevalence” in various places. It is worth keeping in mind that throughout the Bill we are requiring companies to implement proportionate systems and processes to protect their users from harm. Even in the case of the most harmful illegal content, we are not placing the duty on companies to remove every single piece of illegal content that has ever appeared online, because that would be requesting the impossible. We are asking them to take reasonable and proportionate steps to create systems and processes to do so. It is important to frame the legally binding duties in a way that makes them realistically achievable.
As the shadow Minister said, amendments 35, 36, 39 and 40 would replace the word “prevalence” with “presence”. That would change Ofcom’s duty to enforce not just against content that was present in significant numbers—prevalent—but against a single instance, which would be enough to engage the clause.
We mutually understand the intention behind these amendments, but we think the significant powers to compel companies to adopt certain technology contained in clause 103 should be engaged only where there is a reasonable level of risk. For example, if a single piece of content was present on a platform, it may not be reasonable or proportionate to force the company to adopt certain new technologies, where indeed they do not do so at the moment. The use of “prevalence” ensures that the powers are used where necessary.
It is clear—there is no debate—that in the circumstances where scanning technology is currently used, which includes on Facebook Messenger, there is enormous prevalence of material. To elaborate on a point I made in a previous discussion, anything that stops that detection happening would be unacceptable and, in the Government’s view, it would not be reasonable to lose the ability to detect huge numbers of images in the service of implementing encryption, because there is nothing more important than scanning against child sexual exploitation images.
However, we think adopting the amendment and replacing the word “prevalence” with “presence” would create an extremely sensitive trigger that would be engaged on almost every site, even tiny ones or where there was no significant risk, because a single example would be enough to trigger the amendment, as drafted. Although I understand the spirit of the amendment, it moves away from the concepts of proportionality and reasonableness in the systems and processes that the Bill seeks to deliver.
Amendment 37 seeks to widen the criteria that Ofcom must consider when deciding to use clause 103 powers. It is important to ensure that Ofcom considers a wide range of factors, taking into account the harm occurring, but clause 104(2)(f) already requires Ofcom to consider
“the level of risk of harm to individuals in the United Kingdom presented by relevant content, and the severity of that harm”.
Therefore, the Bill already contains provision requiring Ofcom to take those matters into account, as it should, but the shadow Minister is right to draw attention to the issue.
Finally, amendment 38 seeks to amend clause 116 to require Ofcom to consider the risk of harm posed by individuals in the United Kingdom, in relation to adults and children in the UK or elsewhere, through the production, publication and dissemination of illegal content. In deciding whether to make a confirmation decision requiring the use of technology, it is important that Ofcom considers a wide range of factors. However, clause 116(6)(e) already proposes to require Ofcom to consider, in particular, the risk and severity of harm to individuals in the UK. That is clearly already in the Bill.
I hope that this analysis provides a basis for the shadow Minister to accept that the Bill, in this area, functions as required. I gently request that she withdraw her amendment.
I welcome the Minister’s comments, but if we truly want the Bill to be world-leading, as the Government and the Minister insist it will be, and if it is truly to keep children safe, surely one image of child sexual exploitation and abuse on a platform is one too many. We do not need to consider prevalence over presence. I do not buy that argument. I believe we need to do all we can to make this Bill as strong as possible. I believe the amendments would do that.
Question put, That the amendment be made.
I beg to move amendment 6, in clause 104, page 89, line 14, after “(2)(f)” insert “, (g)”
This amendment ensures that subsection (3) of this clause (which clarifies what “relevant content” in particular paragraphs of subsection (2) refers to in relation to different kinds of services) applies to the reference to “relevant content” in subsection (2)(g) of this clause.
This technical amendment will ensure that the same definition of “relevant content” used in subsection (2) is used in subsection (3).
Amendment 6 agreed to.
Clause 104, as amended, ordered to stand part of the Bill.
Clauses 105 and 106 ordered to stand part of the Bill.
Clause 107
OFCOM’s guidance about functions under this Chapter
Question proposed, That the clause stand part of the Bill.
I have a quick question for the Minister about the timelines in relation to the guidance and the commitment that Ofcom gave to producing a road map before this coming summer. When is that guidance likely to be produced? Does that road map relate to the guidance in this clause, as well as the guidance in other clauses? If the Minister does not know the answer, I have no problem with receiving an answer at a later time. Does the road map include this guidance as well as other guidance that Ofcom may or may not be publishing at some point in the future?
I welcome the cross-party support for the provisions set out in these important clauses. Clause 107 sets out the requirement for Ofcom to publish guidance, which is extremely important. Clause 108 makes sure that it publishes an annual report. Clause 109 covers the interpretations.
The hon. Member for Aberdeen North asked the only question, about the contents of the Ofcom road map, which in evidence it committed to publishing before the summer. I cannot entirely speak for Ofcom, which is of course an independent body. In order to avoid me giving the Committee misleading information, the best thing is for officials at the Department for Digital, Culture, Media and Sport to liaise with Ofcom and ascertain what the exact contents of the road map will be, and we can report that back to the Committee by letter.
It will be fair to say that the Committee’s feeling—I invite hon. Members to intervene if I have got this wrong—is that the road map should be as comprehensive as possible. Ideally, it would lay out the intended plan to cover all the activities that Ofcom would have to undertake in order to make the Bill operational, and the more detail there is, and the more comprehensive the road map can be, the happier the Committee will be.
Officials will take that away, discuss it with Ofcom and we can revert with fuller information. Given that the timetable was to publish the road map prior to the summer, I hope that we are not going to have to wait very long before we see it. If Ofcom is not preparing it now, it will hopefully hear this discussion and, if necessary, expand the scope of the road map a little bit accordingly.
Question put and agreed to.
Clause 107 accordingly ordered to stand part of the Bill.
Clauses 108 and 109 ordered to stand part of the Bill.
Clause 110
Provisional notice of contravention
Question proposed, That the clause stand part of the Bill.
I will be brief. Labour welcomes clause 110, which addresses the process of starting enforcement. We support the process, particularly the point that ensures that Ofcom must first issue a “provisional notice of contravention” to an entity before it reaches its final decision.
The clause ultimately ensures that the process for Ofcom issuing a provisional notice of contravention can take place only after a full explanation and deadline has been provided for those involved. Thankfully, this process means that Ofcom can reach a decision only after allowing the recipient a fair opportunity to make relevant representations too. The process must be fair for all involved and that is why we welcome the provisions outlined in the clause.
I hope that I am speaking at the right stage of the Bill, and I promise not to intervene at any further stages where this argument could be put forward.
Much of the meat of the Bill is within chapter 6. It establishes what many have called the “polluter pays” principle, where an organisation that contravenes can then be fined—a very important part of the Bill. We are talking about how Ofcom is going to be able to make the provisions that we have set out work in practice. A regulated organisation that fails to stop harm contravenes and will be fined, and fined heavily.
I speak at this point in the debate with slight trepidation, because these issues are also covered in clause 117 and schedule 12, but it is just as relevant to debate the point at this stage. It is difficult to understand where in the Bill the Government set out how the penalties that they can levy as a result of the powers under this clause will be used. Yes, they will be a huge deterrent, and that is good in its own right and important, but surely the real opportunity is to make the person who does the harm pay for righting the wrong that they have created.
That is not a new concept. Indeed, it is one of the objectives that the Government set out in the intentions behind their approach to the draft victims Bill. It is a concept used in the Investigatory Powers Act 2016. It is the concept behind the victims surcharge. So how does this Bill make those who cause harm take greater responsibility for the cost of supporting victims to recover from what they have suffered? That is exactly what the Justice Ministers set out as being so important in their approach to victims. In the Bill, that is not clear to me.
At clause 70, the Minister helpfully set out that there was absolutely no intention for Ofcom to have a role in supporting victims individually. In reply to the point that I made at that stage, he said that the victims Bill would address some of the issues—I am sure that he did not say all the issues, but some of them at least. I do not believe that it will. The victims Bill establishes a code and a duty to provide victim support, but it makes absolutely no reference to how financial penalties on those who cause harm—as set out so clearly in this Bill—will be used to support victims. How will they support victims’ organisations, which do so much to help in particular those who do not end up in court, before a judge, because what they have suffered does not warrant that sort of intervention?
I believe that there is a gap. We heard that in our evidence session, including from Ofcom itself, which identified the need for law enforcement, victim-support organisations and platforms themselves to find what the witnesses described as an effective way for the new “ecosystem” to work. Victim-support organisations went further and argued strongly for the need for victims’ voices to be heard independently. The NSPCC in particular made a very powerful argument for children’s voices needing to be heard and for having independent advocacy. There would be a significant issue with trust levels if we were to rely solely on the platforms themselves to provide such victim support.
There are a couple of other reasons why we need the Government to tease the issue out. We are talking about the most significant culture change imaginable for the online platforms to go through. There will be a lot of good will, I am sure, to achieve that culture change, but there will also be problems along the way. Again referring back to our evidence sessions, the charity Refuge said that reporting systems are “not up to scratch” currently. There is a lot of room for change. We know that Revenge Porn Helpline has seen a continual increase in demand for its services in support of victims, in particular following the pandemic. It also finds revenue and funding a little hand to mouth.
Victim support organisations will have a crucial role in assisting Ofcom with the elements outlined in chapter 6, of which clause 110 is the start, in terms of monitoring the reality for users of how the platforms are performing. The “polluter pays” principle is not working quite as the Government might want it to in the Bill. My solution is for the Minister to consider talking to his colleagues in the Treasury about whether this circle could be squared—whether we could complete the circle—by having some sort of hypothecation of the financial penalties, so that some of the huge amount that will be levied in penalties can be put into a fund that can be used directly to support victims’ organisations. I know that that requires the Department for Digital, Culture, Media and Sport and the Ministry of Justice to work together, but my hon. Friend is incredibly good at collaborative working, and I am sure he will be able to achieve that.
This is not an easy thing. I know that the Treasury would not welcome Committees such as this deciding how financial penalties are to be used, but this is not typical legislation. We are talking about enormous amounts of money and enormous numbers of victims, as the Minister himself has set out when we have tried to debate some of these issues. He could perhaps undertake to raise this issue directly with the Treasury, and perhaps get it to look at how much money is currently going to organisations to support victims of online abuse and online fraud—the list goes on—and to see whether we will have to take a different approach to ensure that the victims we are now recognising get the support he and his ministerial colleagues want to see.
First, on the substance of the clause, as the shadow Minister said, the process of providing a provisional notice of contravention gives the subject company a fair chance to respond and put its case, before the full enforcement powers are brought down on its head, and that is of course only reasonable, given how strong and severe these powers are. I am glad there is once again agreement between the two parties.
I would like to turn now to the points raised by my right hon. Friend the Member for Basingstoke, who, as ever, has made a very thoughtful contribution to our proceedings. Let me start by answering her question as to what the Bill says about where fines that are levied will go. We can discover the answer to that question in paragraph 8 of schedule 12, which appears at the bottom of page 206 and the top of page 207—in the unlikely event that Members had not memorised that. If they look at that provision, they will see that the Bill as drafted provides that fines that are levied under the powers provided in it and that are paid to Ofcom get paid over to the Consolidated Fund, which is essentially general Treasury resources. That is where the money goes under the Bill as drafted.
My right hon. Friend asks whether some of the funds could be, essentially, hypothecated and diverted directly to pay victims. At the moment, the Government pay for services supporting victims not just via legislation—the victims Bill—but via expenditure that, I think, is managed by the Ministry of Justice to support victims and organisations working with victims in a number of ways. I believe that the amount earmarked for this financial year is in excess of £300 million, which is funded just via the general spending review. That is the situation as it is today.
I am happy to ask colleagues in Government the question that my right hon. Friend raises. It is really a matter for the Treasury, so I am happy to pass her idea on to it. But I anticipate a couple of responses coming from the Treasury in return. I would anticipate it first saying that allocating money to a particular purpose, including victims, is something that it likes to do via spending reviews, where it can balance all the demands on Government revenue, viewed in the round.
Secondly, it might say that the fine income is very uncertain; we do not know what it will be. One year it could be nothing; the next year it could be billions and billions of pounds. It depends on the behaviour of these social media firms. In fact, if the Bill does its job and they comply with the duties as we want and expect them to, the fines could be zero, because the firms do what they are supposed to. Conversely, if they misbehave, as they have been doing until now, the fines could be enormous. If we rely on hypothecation of these fines as a source for funding victim services, it might be that, in a particular year, we discover that there is no income, because no fines have been levied.
I agree 100%. The testimony of Frances Haugen, the Facebook whistleblower, highlighted the fact that expert researchers and academics will need to examine the data and look at what is happening behind social media platforms if we are to ensure that the Bill is truly fit for purpose and world leading. That process should be carried out as quickly as possible, and Ofcom must also be encouraged to publish guidance on how access to data will work.
Ultimately, the amendments make a simple point: civil society and researchers should be able to access data, so why will the Minister not let them? The Bill should empower independently verified researchers and civil society to request tech companies’ data. Ofcom should be required to publish guidance as soon as possible—within months, not years—on how data may be accessed. That safety check would hold companies to account and make the internet a safer and less divisive space for everyone.
The process would not be hard or commercially ruinous, as the platforms claim. The EU has already implemented it through its Digital Services Act, which opens up the secrets of tech companies’ data to Governments, academia and civil society in order to protect internet users. If we do not have that data, researchers based in the EU will be ahead of those in the UK. Without more insight to enable policymaking, quality research and harm analysis, regulatory intervention in the UK will stagnate. What is more, without such data, we will not know Instagram’s true impact on teen mental health, nor the reality of violence against women and girls online or the risks to our national security.
We propose amending the Bill to accelerate data sharing provisions while mandating Ofcom to produce guidance on how civil society and researchers can access data, not just on whether they should. As I said, that should happen within months, not years. The provisions should be followed by a code of practice, as outlined in the amendment, to ensure that platforms do not duck and dive in their adherence to transparency requirements. A code of practice would help to standardise data sharing in a way that serves platforms and researchers.
The changes would mean that tech companies can no longer hide in the shadows. As Frances Haugen said of the platforms in her evidence a few weeks ago:
“The idea that they have worked in close co-operation with researchers is a farce. The only way that they are going to give us even the most basic data that we need to keep ourselves safe is if it is mandated in the Bill. We need to not wait two years after the Bill passes”.––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 188, Q320.]
I understand the shadow Minister’s point. We all heard from Frances Haugen about the social media firms’ well-documented reluctance—to put it politely—to open themselves up to external scrutiny. Making that happen is a shared objective. We have already discussed several times the transparency obligations enshrined in clause 64. Those will have a huge impact in ensuring that the social media firms open up a lot more and become more transparent. That will not be an option; they will be compelled to do that. Ofcom is obliged under clause 64 to publish the guidance around those transparency reports. That is all set in train already, and it will be extremely welcome.
Researchers’ access to information is covered in clause 136, which the amendments seek to change. As the shadow Minister said, our approach is first to get Ofcom to prepare a report into how that can best be done. There are some non-trivial considerations to do with personal privacy and protecting people’s personal information, and there are questions about who counts as a valid researcher. In casual conversation it might appear obvious who is or is not a valid researcher, but we will need to come up with a proper definition of “valid researcher” and decide what confidentiality obligations should apply to them.
This is all sorted in the health environment because of the personal data involved—there is no data more personal than health data—and a trusted and safe environment has been created for researchers to access personal data.
This data is a little different—the two domains do not directly correspond. In the health area, there has been litigation—an artificial intelligence company is currently engaged in litigation with an NHS hospital trust about a purported breach of patient data rules—so even in that long-established area, there is uncertainty and recent, or perhaps even current, litigation.
We are asking for the report to be done to ensure that those important issues are properly thought through. Once they are, Ofcom has the power under clause 136 to lay down guidance on providing access for independent researchers to do their work.
The Minister has committed to Ofcom being fully resourced to do what it needs to do under the Bill, but he has spoken about time constraints. If Ofcom were to receive 25,000 risk assessments, for example, there simply would not be enough people to go through them. Does he agree that, in cases in which Ofcom is struggling to manage the volume of data and to do the level of assessment required, it may be helpful to augment that work with the use of independent researchers? I am not asking him to commit to that, but to consider the benefits.
Yes, I agree that bona fide independent academic researchers have something to offer and to add in this area. The more we have highly intelligent, experienced and creative people looking at a particular problem or issue, the more likely we are to get a good and well-informed result. They may have perspectives that Ofcom does not. I agree that, in principle, independent researchers can add a great deal, but we need to ensure that we get that set up in a thoughtful and proper way. I understand the desire to get it done quickly, but it is important to take the time to do it not just quickly, but right. No framework for this exists at the moment—there is no concept of independent researchers getting access to the innards of social media companies’ data vaults—so we need to make sure that it is done in the right way, which is why the clause is structured as it is. I ask the Committee to stick with the drafting, whereby there will be a report and then Ofcom will have the power. I hope we end up in the same place—indeed, a better one. The process may be slightly slower, but the consideration and thought given along the way should leave us better placed.
I appreciate where the Minister is coming from. It seems that he wants to back the amendment, so I am struggling to see why he will not, especially given that the DSA—the EU’s new legislation—is already doing this. We know that the current wording in the Bill is far too woolly. If providers can get away with it, they will, which is why we need to compel them, so that we are able to access this data. We need to put that on the face of the Bill. I wish that we did not have to do so, but we all wish that we did not have to have this legislation in the first place. Unless we put it in the Bill, however, the social media platforms will carry on regardless, and the internet will not be a safe place for children and adults in the UK. That is why I will push amendment 53 to a vote.
Question put, That the amendment be made.
I beg to move amendment 56, in clause 111, page 94, line 24, at end insert— “Section [Supply chain risk assessment duties] Supply chain risk assessments”
This amendment is linked to NC11.
As my hon. Friend the Member for Pontypridd has pointed out, there is little or no transparency about one of the most critical ways in which platforms tackle harms. Human moderators are on the frontline of protecting children and adults from harmful content. They must be well resourced, trained and supported in order to fulfil that function, or the success of the Bill’s aims will be severely undermined.
I find it shocking that platforms offer so little data on human moderation, either because they refuse to publish it or because they do not know it. For example, in evidence to the Home Affairs Committee, William McCants from YouTube could not give precise statistics for its moderator team after being given six days’ notice to find the figure, because many moderators were employed or operated under third-party auspices. For YouTube’s global counter-terrorism lead to be unaware of the detail of how the platform is protecting its users from illegal content is shocking, but it is not uncommon.
In evidence to this Committee, Meta’s Richard Earley was asked how many of Meta’s 40,000 human moderators were outsourced to remove illegal content and disinformation from the platform. My hon. Friend the Member for Pontypridd said:
“You do not have the figures, so you cannot tell me.”
Richard Earley replied:
“I haven’t, no, but I will be happy to let you know afterwards in our written submission.”
Today, Meta submitted its written evidence to the Committee. It included no reference to human content moderators, despite its promise.
The account that my hon. Friend gave just now shows why new clause 11 is so necessary. Meta’s representative told this Committee in evidence:
“Everyone who is involved in reviewing content at Meta goes through an extremely lengthy training process that lasts multiple weeks, covering not just our community standards in total but also the specific area they are focusing on, such as violence and incitement.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 45, Q76.]
But now we know from whistleblowers such as Daniel, whose case my hon. Friend described, that that is untrue. What is happening to Daniel and the other human moderators is deeply concerning. There are powerful examples of the devastating emotional impact that can occur because human moderators are not monitored, trained and supported.
There are risks of platforms shirking responsibility when they outsource moderation to third parties. Stakeholders have raised concerns that a regulated company could argue that an element of its service is not in the scope of the regulator because it is part of a supply chain. We will return to that issue when we debate new clause 13, which seeks to ensure enforcement of liability for supply chain failures that amount to a breach of one of the specified duties.
Platforms, in particular those hosting user-generated content, procure such services from third parties. Yesterday, I met Danny Stone, the chief executive of the Antisemitism Policy Trust, who described the problem of antisemitic GIFs. Twitter would say, “We don’t supply GIFs. The responsibility is with GIPHY.” GIPHY, as part of the supply chain, would say, “We are not a user-to-user platform.” If someone searched Google for antisemitic GIFs, the results would contain multiple entries saying, “Antisemitic GIFs—get the best GIFs on GIPHY. Explore and share the best antisemitic GIFs.”
One can well imagine a scenario in which a company captured by the regulatory regime established by the Bill argues that an element of its service is not within the ambit of the regulator because it is part of a supply chain presented by, but not necessarily the responsibility of, the regulated service. The contracted element, which I have just described by reference to Twitter and GIPHY, supported by an entirely separate company, would argue that it was providing a business-to-business service that is not user-generated content but content designed and delivered at arm’s length and provided to the user-to-user service to deploy for its users.
I suggest that dealing with this issue would involve a time-consuming, costly and unhelpful legal process during which systems were not being effectively regulated—the same may apply in relation to moderators and what my hon. Friend the Member for Pontypridd described; there are a number of lawsuits involved in Daniel’s case—and complex contract law was invoked.
We recognise in UK legislation that there are concerns and issues surrounding supply chains. Under the Bribery Act 2010, for example, a company is liable if anyone performing services for or on the company’s behalf is found culpable for specific actions. These issues on supply chain liability must be resolved if the Bill is to fulfil its aim of protecting adults and children from harm.
May I first say a brief word about clause stand part, Sir Roger?
Thank you. Clause 111 sets out and defines the “enforceable requirements” in this chapter—the duties that Ofcom is able to enforce against. Those are set out clearly in the table at subsection (2) and the requirements listed in subsection (3).
The amendment speaks to a different topic. It seeks to impose or police standards for people employed as subcontractors of the various companies that are in scope of the Bill, for example people that Facebook contracts; the shadow Minister, the hon. Member for Pontypridd, gave the example of the gentleman from Kenya she met yesterday. I understand the point she makes and I accept that there are people in those supply chains who are not well treated, who suffer PTSD and who have to do extraordinarily difficult tasks. I do not dispute at all the problems she has referenced. However, the Government do not feel that the Bill is the right place to address those issues, for a couple of reasons.
First, in relation to people who are employed in the UK, we have existing UK employment and health and safety laws. We do not want to duplicate or cut across those. I realise that they relate only to people employed in the UK, but if we passed the amendment as drafted, it would apply to people in the UK as much as it would apply to people in Kenya.
Secondly, the amendment would effectively require Ofcom to start paying regard to employment conditions in Kenya, among other places—indeed, potentially any country in the world—and it is fair to say that that sits substantially outside Ofcom’s area of expertise as a telecoms and communications regulator. That is the second reason why the amendment is problematic.
The third reason is more one of principle. The purpose of the Bill is to keep users safe online. While I understand the reasonable premise for the amendment, it seeks essentially to regulate working conditions in potentially any country in the world. I am just not sure that it is appropriate for an online safety Bill to seek to regulate global working conditions. Facebook, a US company, was referenced, but only 10% of its activity—very roughly speaking—is in the UK. The shadow Minister gave the example of Kenyan subcontractors. Compelling though her case was, I am not sure it is appropriate that UK legislation on online safety should seek to regulate the Kenyan subcontractor of a United States company.
The Government of Kenya can set their own employment regulations and President Biden’s Government can impose obligations on American companies. For us, via a UK online safety Bill, to seek to regulate working conditions in Kenya goes a long way beyond the bounds of what we are trying to do, particularly when we take into account that Ofcom is a telecommunications and communications regulator. To expect it to regulate working conditions anywhere in the world is asking quite a lot.
I accept that a real issue is being raised. There is definitely a problem, and the shadow Minister and the hon. Member for Aberdeen North are right to raise it, but for the three principal reasons that I set out, I suggest that the Bill is not the place to address these important issues.
The Minister mentions workers in the UK. I am a proud member of the Labour party and a proud trade unionist; we have strong protections for workers in the UK. There is a reason why Facebook and some of these other platforms, which are incredibly exploitative, will not have human moderators in the UK looking at this content: because they know they would be compelled to treat them a hell of a lot better than they do the workers around the world that they are exploiting, as they do in Kenya, Dublin and the US.
To me, the amendment speaks to the heart of the Bill. This is an online safety Bill that aims to keep the most vulnerable users safe online. People around the world are looking at content that is created here in the UK and having to moderate it; we are effectively shipping our trash to other countries and other people to deal with it. That is not acceptable. We have the opportunity here to keep everybody safe from looking at this incredibly harmful content. We have a duty to protect those who are looking at content created in the UK in order to keep us safe. We cannot let those people down. The amendment and new clause 11 give us the opportunity to do that. We want to make the Bill world leading. We want the UK to stand up for those people. I urge the Minister to do the right thing and back the amendment.
The Minister has not commented on the problem I raised of the contracted firm in the supply chain not being covered by the regulations under the Bill—the problem of Twitter and the GIFs, whereby the GIFs exist and are used on Twitter, but Twitter says, “We’re not responsible for them; it’s that firm over there.” That is the same thing, and new clause 11 would cover both.
I am answering slightly off the cuff, but I think the point the hon. Lady is raising—about where some potentially offensive or illegal content is produced on one service and then propagated or made available by another—is one we debated a few days ago. I think the hon. Member for Aberdeen North raised that question, last week or possibly the week before. I cannot immediately point to the relevant clause—the debate will be found in Hansard, in our early discussions on the opening clauses of the Bill—but I think the Bill makes it clear that where content is accessed through another platform, which is the example that the hon. Member for Worsley and Eccles South just gave, the platform through which the content is made available is within the scope of the Bill.
Question put, That the amendment be made.
We support clause 112, which gives Ofcom the power to issue a confirmation decision if, having followed the required process—for example, in clause 110—its final decision is that a regulated service has breached an enforceable requirement. As we know, this will set out Ofcom’s final decision and explain whether Ofcom requires the recipient of the notice to take any specific steps and/or pay a financial penalty. Labour believes that this level of scrutiny and accountability is vital to an Online Safety Bill that is truly fit for purpose, and we support clause 112 in its entirety.
We also support the principles of clause 113, which outlines the steps that a person may be required to take either to come into compliance or to remedy the breach that has been committed. Subsection (5) in particular is vital, as it outlines how Ofcom can require immediate action when the breach has involved an information duty. We hope this will be a positive step forward in ensuring true accountability of big tech companies, so we are happy to support the clause unamended.
It is right and proper that Ofcom has powers when a regulated provider has failed to carry out an illegal content or children’s risk assessment properly or at all, and when it has identified a risk of serious harm that the regulated provider is not effectively mitigating or managing. As we have repeatedly heard, risk assessments are the very backbone of the Bill, so it is right and proper that Ofcom is able to force a company to take measures to comply in the event of previously failing to act.
Children’s access assessments, which are covered by clause 115, are a crucial component of the Bill. Where Ofcom finds that a regulated provider has failed to properly carry out an assessment, it is vital that it has the power and legislative standing to force the company to do more. We also appreciate the inclusion of a three-month timeframe, which would ensure that, in the event of a provider re-doing the assessment, it would at least be completed within a specific—and short—timeframe.
While we recognise that the use of proactive technologies may raise some concerns, Labour ultimately feels that clause 116 is balanced and fair, as it establishes that Ofcom may require the use of proactive technology only on content that is communicated publicly. It is fair that content in the public domain is subject to those important safety checks. It is also right that under subsection (7), Ofcom may set a requirement forcing services to review the kind of technology being used. That is a welcome step that will ensure that platforms face a level of scrutiny that has certainly been missing so far.
Labour welcomes and is pleased to support clause 117, which allows Ofcom to impose financial penalties in its confirmation decision. That is something that Labour has long called for, as we believe that financial penalties of this nature will go some way towards improving best practice in the online space and deterring bad actors more widely.
The shadow Minister has set out the provisions in the clauses, and I am grateful for her support. In essence, clauses 112 to 117 set out the processes around confirmation decisions and make provisions to ensure that those are effective and can be operated in a reasonable and fair way. The clauses speak largely for themselves, so I am not sure that I have anything substantive to add.
Question put and agreed to.
Clause 112 accordingly ordered to stand part of the Bill.
Clauses 113 to 117 ordered to stand part of the Bill.
Ordered, That further consideration be now adjourned. —(Dean Russell.)
(2 years, 5 months ago)
Public Bill Committees
Bore da, Ms Rees. It is, as ever, a pleasure to serve under your chairship. I rise to speak to clauses 118 to 121 and Government amendments 154 to 157.
As we all know, clause 118 is important and allows Ofcom to impose a financial penalty on a person who fails to complete steps that have been required by Ofcom in a confirmation decision. This is absolutely vital if we are to guarantee that regulated platforms take seriously their responsibilities in keeping us all safe online. We support the use of fines. They are key to overall behavioural change, particularly in the context of personal liability. We welcome clause 118, which outlines the steps Ofcom can take in what we hope will become a powerful deterrent.
Labour also welcomes clause 119. It is vital that Ofcom has these important powers to impose a financial penalty on a person who fails to comply with a notice that requires technology to be implemented to identify and deal with content relating to terrorism and child sexual exploitation and abuse on their service. These are priority harms and the more that can be done to protect us on these two points the better.
Government amendments 155 and 157 ensure that Ofcom has the power to impose a monetary penalty on a provider of a service who fails to pay a fee that it is required to pay under new schedule 2. We see these amendments as crucial in giving Ofcom the important powers it needs to be an effective regulator, which is something we all require. We have some specific observations around new schedule 2, but I will save those until we consider that schedule. For now, we support these amendments and I look forward to outlining our thoughts shortly.
We support clause 120, which allows Ofcom to give a penalty notice to a provider of a regulated service who does not pay the fee due to Ofcom in full. This is a vital provision that also ensures that Ofcom’s process to impose a penalty can progress only when it has given due notice to the provider and once the provider has had a fair opportunity to make representations to Ofcom. This is a fair approach and is central to the Bill, which is why we have not sought to amend the clause.
Finally, we support clause 121, which ensures that Ofcom must state the reasons why it is imposing a penalty, the amount of the penalty and any aggravating or mitigating factors, as well as when the penalty must be paid. It is imperative that, when issuing a notice, Ofcom publishes that information. We support this important clause and have not sought to amend it.
It is a pleasure to serve under your chairmanship once again, Ms Rees, and I congratulate Committee members on evading this morning’s strike action.
I am delighted that the shadow Minister supports the intent behind these clauses, and I will not speak at great length given the unanimity on this topic. As she said, clause 118 allows Ofcom to impose a financial penalty for failure to take specified steps by a deadline set by Ofcom. The maximum penalty that can be imposed is the greater of £18 million or 10% of qualifying worldwide revenue. In the case of large companies, it is likely to be a much larger amount than £18 million.
Clause 119 enables Ofcom to impose financial penalties if the recipient of a section 103 notice does not comply by the deadline. It is very important to ensure that section 103 has proper teeth. Government amendments 154 to 157 make changes that allow Ofcom to recover not only the ongoing cost of running the regime once the Bill comes into force but also the preparatory cost of setting up for the Bill to come into force.
As previously discussed, £88 million of funding is being provided to Ofcom in this financial year and next. We believe that something like £20 million of costs that predate these financial years have been funded as well. That adds up to around £108 million. However, the amount that Ofcom recovers will be the actual cost incurred. The figure I provided is simply an indicative estimate. The actual figure would be based on the real costs, which Ofcom would be able to recoup under these measures. That means that the taxpayer—our constituents—will not bear any of the costs, including the set-up and preparatory cost. This is an equitable and fair change to the Bill.
Clause 120 sets out that some regulated providers will be required to pay a regulatory fee to Ofcom, as set out in clause 71. Clause 120 allows Ofcom to impose a financial penalty if a regulated provider does not pay its fee by the deadline it sets. Finally, clause 121 sets out the information that needs to be included in these penalty notices issued by Ofcom.
I have questions about the management of the fees and the recovery of the preparatory cost. Does the Minister expect that the initial fees will be higher as a result of having to recoup the preparatory cost and will then reduce? How quickly will the preparatory cost be recovered? Will Ofcom recover it quickly or over a longer period of time?
The Bill provides a power for Ofcom to recover those costs. It does not specify over what time period. I do not think they will be recouped over a period of years. Ofcom can simply recoup the costs in a single hit. I would imagine that Ofcom would seek to recover these costs pretty quickly after receiving these powers. The £108 million is an estimate. The actual figure may be different once the reconciliation and accounting is done. It sounds like a lot of money, but it is spread among a number of very large social media firms. It is not a large amount of money for them in the context of their income, so I would expect that recouping to be done on an expeditious basis—not spread over a number of years. That is my expectation.
Question put and agreed to.
Clause 118 accordingly ordered to stand part of the Bill.
Clause 119 ordered to stand part of the Bill.
Clause 120
Non-payment of fee
Amendments made: 154, in clause 120, page 102, line 20, after “71” insert—
“or Schedule (Recovery of OFCOM’s initial costs)”.
This amendment, and Amendments 155 to 157, ensure that Ofcom have the power to impose a monetary penalty on a provider of a service who fails to pay a fee that they are required to pay under NS2.
Amendment 155, in clause 120, page 102, line 21, leave out “that section” and insert “Part 6”.
Amendment 156, in clause 120, page 102, line 26, after “71” insert—
“or Schedule (Recovery of OFCOM’s initial costs)”
Amendment 157, in clause 120, page 103, line 12, at end insert—
“or Schedule (Recovery of OFCOM’s initial costs)”.—(Chris Philp.)
Clause 120, as amended, ordered to stand part of the Bill.
Clause 121 ordered to stand part of the Bill.
Clause 122
Amount of penalties etc
Question proposed, That the clause stand part of the Bill.
With this it will be convenient to discuss:
Government amendment 158.
That schedule 12 be the Twelfth schedule to the Bill.
Labour supports clause 122 and schedule 12, which set out in detail the financial penalties that Ofcom may impose, including the maximum penalty that can be imposed. Labour has long supported financial penalties for those failing to comply with the duties in the Bill. We firmly believe that tough action is needed on online safety, but we feel the sanctions should go further and that there should be criminal liability for offences beyond just information-related failures. We welcome clause 122 and schedule 12. It is vital that Ofcom is also required to produce guidelines around how it will determine penalty amounts. Consistency across the board is vital, so we feel this is a positive step forward and have not sought to amend the clause.
Paragraph 8 of schedule 12 requires monetary penalties to be paid into the Consolidated Fund. There is no change to that requirement, but it now appears in new clause 43, together with the requirement to pay fees charged under new schedule 2 into the Consolidated Fund. We therefore support the amendments.
I have nothing further to add on these amendments. The shadow Minister has covered them, so I will not detain the Committee further.
Question put and agreed to.
Clause 122 accordingly ordered to stand part of the Bill.
Schedule 12
Penalties imposed by OFCOM under Chapter 6 of Part 7
Amendment made: 158, in schedule 12, page 206, line 43, leave out paragraph 8.—(Chris Philp.)
Paragraph 8 of Schedule 12 requires monetary penalties to be paid into the Consolidated Fund. There is no change to that requirement, but it now appears in NC43 together with the requirement to pay fees charged under NS2 into the Consolidated Fund.
Schedule 12, as amended, agreed to.
Clause 123
Service restriction orders
I beg to move amendment 50, in clause 123, page 106, line 36, at end insert—
“(9A) OFCOM may apply to the court for service restriction orders against multiple regulated services with one application, through the use of a schedule of relevant services which includes all the information required by subsection (5).”
This amendment would give Ofcom the ability to take action against a schedule of non-compliant sites, while still preserving the right of those sites to oppose the application for, and/or appeal through the courts against any, orders to block access or support services.
If no other Members wish to speak to amendments 50 and 51 and clauses 123 to 127, I will call the Minister to respond.
Let me start with amendments 50 and 51, which were introduced by the shadow Minister and supported by the SNP spokesperson. The Government recognise the valid intent behind the amendments, namely to make sure that applications can be streamlined and done quickly, and that Ofcom can make bulk applications if large numbers of service providers violate the new duties to the extent that interim service restriction orders or access restriction orders become necessary.
We want a streamlined process, and we want Ofcom to deal efficiently with it, including, if necessary, by making bulk applications to the court. Thankfully, however, procedures under the existing civil procedure rules already allow so-called multi-party claims to be made. Those claims permit any number of claimants, any number of defendants or respondents and any number of claims to be covered in a single form. The overriding objective of the CPR is that cases are dealt with justly and proportionately. Under the existing civil procedure rules, Ofcom can already make bulk applications to deal with very large numbers of non-compliant websites and service providers in one go. We completely agree with the intent behind the amendments, but their content is already covered by the CPR.
It is worth saying that the business disruption measures—the access restriction orders and the service restriction orders—are intended to be a last resort. They effectively amount to unplugging the websites from the internet so that people in the United Kingdom cannot access them and so that supporting services, such as payment services, do not support them. The measures are quite drastic, although necessary and important, because we do not want companies and social media firms ignoring our legislation. It is important that we have strong measures, but they are last resorts. We would expect Ofcom to use them only when it has taken reasonable steps to enforce compliance using other means.
If a provider outside the UK ignores letters and fines, these measures are the only option available. As the shadow Minister, the hon. Member for Pontypridd, mentioned, some pornography providers probably have no intention of even attempting to comply with our regulations; they are probably not based in the UK, they are never going to pay the fine and they are probably incorporated in some obscure, offshore jurisdiction. Ofcom will need to use these powers in such circumstances, possibly on a bulk scale—I am interested in her comment that that is what the German authorities had to do—but the powers already exist in the CPR.
It is also worth saying that in its application to the courts, Ofcom must set out the information required in clauses 123(5) and 125(3), so evidence that backs up the claim can be submitted, but that does not stop Ofcom doing this on a bulk basis and hitting multiple different companies in one go. Because the matter is already covered in the CPR, I ask the shadow Minister to withdraw the amendment.
I am interested to know whether the Minister has anything to add about the other clauses. I am happy to give way to him.
I thank the shadow Minister for giving way. I do not have too much to say on the other clauses, because she has introduced them, but in my enthusiasm for explaining the civil procedure rules I neglected to respond to her question about the interim orders in clauses 124 and 126.
The hon. Lady asked what criteria have to be met for these interim orders to be made. The conditions for clause 124 are set out in subsections (3) and (4) of that clause, which states, first, that it has to be
“likely that the…service is failing to comply with an enforceable requirement”—
so it is likely that there has been a breach—and, secondly, that
“the level of risk of harm to individuals in the United Kingdom…and the nature and severity of that harm, are such that it would not be appropriate to wait to establish the failure before applying for the order.”
Similar language in clause 124(4) applies to breaches of section 103.
Essentially, if it is likely that there has been a breach, and if the resulting harm is urgent and severe—for example, if children are at risk—we would expect these interim orders to be used as emergency measures to prevent very severe harm. I hope that answers the shadow Minister’s question. She is very kind, as is the Chair, to allow such a long intervention.
In a Bill Committee, a Member can speak more than once. However, your intervention resolved the situation amicably, Minister.
The Minister and his Back Benchers will, I am sure, be tired of our calls for more transparency, but I will be kind to him and confirm that Labour welcomes the provisions in clause 128.
We believe it is vital that, once Ofcom has followed the process outlined in clause 110 and issued a confirmation decision setting out its final position, that decision is made public. We particularly welcome provisions to ensure that when a confirmation decision is issued, Ofcom will be obliged to publish the identity of the person to whom the decision was sent, details of the failure to which the decision relates, and details relating to Ofcom’s response.
Indeed, the transparency goes further, as Ofcom will be obliged to publish details of when a penalty notice has been issued in many more areas: when a person fails to comply with a confirmation decision; when a person fails to comply with a notice to deal with terrorism content or child sexual exploitation and abuse content, or both; and when there has been a failure to pay a fee in full. That is welcome indeed. Labour just wishes that the Minister had committed to the same level of transparency on the duties in the Bill to keep us safe in the first place. That said, transparency on enforcement is a positive step forward, so we have not sought to amend the clause at this stage.
I am grateful for the shadow Minister’s support. I have nothing substantive to add, other than to point to the transparency reporting obligation in clause 64, which we have debated.
Question put and agreed to.
Clause 128 accordingly ordered to stand part of the Bill.
Clause 129
OFCOM’s guidance about enforcement action
I beg to move amendment 7, in clause 129, page 114, line 3, at end insert—
“(aa) the Information Commissioner, and”.
This amendment ensures that before Ofcom produce guidance about their exercise of their enforcement powers, they must consult the Information Commissioner.
If I may, in the interest of speed and convenience, I will speak to clause stand part as well.
The clause requires Ofcom to issue guidance setting out how it will use its enforcement powers in the round. That guidance will ensure that the enforcement process is transparent, it will cover the general principles and processes of the enforcement regime, and it is intended to help regulated providers and other stakeholders to understand how Ofcom will exercise its powers.
Clause 129(4) states that the Secretary of State will be consulted in the process. What would be the Secretary of State’s powers in relation to that? Would she be able to overrule Ofcom in the writing of its guidance?
The hon. Member asks for my assistance in interpreting legislative language. Generally speaking, “consult” means what it suggests. Ofcom will consult the Secretary of State, as it will consult the ICO, to ascertain the Secretary of State’s opinion, but Ofcom is not bound by that opinion. Unlike the power in a previous clause—I believe it was clause 40—where the Secretary of State could issue a direct instruction to Ofcom on certain matters, here we are talking simply about consulting. When the Secretary of State expresses an opinion in response to the consultation, it is just that—an opinion. I would not expect it to be binding on Ofcom, but I would expect Ofcom to pay proper attention to the views of important stakeholders, which in this case include both the Secretary of State and the ICO. I hope that gives the hon. Member the clarification he was seeking.
As we know, clause 129 requires Ofcom to publish guidance about how it will use its enforcement powers. It is right that regulated providers and other stakeholders have a full understanding of how, and in what circumstances, Ofcom will have the legislative power to exercise this suite of enforcement powers. We also welcome Government amendment 7, which will ensure that the Information Commissioner—a key and, importantly, independent authority—is included in the consultation before guidance is produced.
As we have just heard, however, the clause sets out that Secretary of State must be consulted before Ofcom produces guidance, including revised or replacement guidance, about how it will use its enforcement powers. We feel that that involves the Secretary of State far too closely in the enforcement of the regime. The Government should be several steps away from being involved, and the clause seriously undermines Ofcom’s independence—the importance of which we have been keen to stress as the Bill progresses, and on which Conservative Back Benchers have shared our view—so we cannot support the clause.
I repeat the point I made to the hon. Member for Liverpool, Walton a moment ago. This is simply an obligation to consult. The clause gives the Secretary of State an opportunity to offer an opinion, but it is just that—an opinion. It is not binding on Ofcom, which may take that opinion into account or not at its discretion. This provision sits alongside the requirement to consult the Information Commissioner’s Office. I respectfully disagree with the suggestion that it represents unwarranted and inappropriate interference in the operation of a regulator. Consultation between organs of state is appropriate and sensible, but in this case it does not fetter Ofcom’s ability to act at its own discretion. I respectfully do not agree with the shadow Minister’s analysis.
Apologies, Ms Rees, for coming in a bit late on this, but I was not aware of the intention to vote against the clause. I want to make clear what the Scottish National party intends to do, and the logic behind it. The inclusion of Government amendment 7 is sensible, and I am glad that the Minister has tabled it. Clause 129 is incredibly important, and the requirement to publish guidance will ensure that there is a level of transparency, which we and the Labour Front Benchers have been asking for.
The Minister has been clear about the requirement for Ofcom to consult the Secretary of State, rather than to be directed by them. As a whole, this Bill gives the Secretary of State far too much power, and far too much ability to intervene in the workings of Ofcom. In this case, however, I do not have an issue with the Secretary of State being consulted, so I intend to support the inclusion of this clause, as amended by Government amendment 7.
Question put, That the amendment be made.
When I spoke at the very beginning of the Committee’s proceedings, I said that the legislation was necessary, that it was a starting point and that it would no doubt change and develop over time. However, I have been surprised at how little, considering all of the rhetoric we have heard from the Secretary of State and other Ministers, the Bill actually deals with the general societal harm that comes from the internet. This is perhaps the only place in the Bill where it is covered.
I am thinking of the echo chambers that are created around disinformation and the algorithms that companies use. I really want to hear from the Minister where he sees this developing and why it is so weak and wishy-washy. While I welcome that much of the Bill seeks to deal with the criminality of individuals and the harm and abuse that can be carried out over the internet, overall it misses a great opportunity to deal with the harmful impact the internet can have on society.
Let me start by speaking on the issue of disinformation more widely, which clearly is the target of the two amendments and the topic of clause 130. First, it is worth reminding the Committee that non-legislatively—operationally—the Government are taking action on the disinformation problem via the counter-disinformation unit of the Department for Digital, Culture, Media and Sport, which we have discussed previously.
The unit has been established to monitor social media firms and sites for disinformation and then to take action and work with social media firms to take it down. For the first couple of years of its operation, it understandably focused on disinformation connected to covid. In the last two or three months, it has focused on disinformation relating to the Russia-Ukraine conflict—in particular propaganda being spread by the Russian Government, which, disgracefully, has included denying responsibility for various atrocities, including those committed at Bucha. In fact, in cases in which the counter-disinformation unit has not got an appropriate response from social media firms, those issues have been escalated to me, and I have raised them directly with those firms, including Twitter, which has tolerated all kinds of disinformation from overt Russian state outlets and channels, including from Russian embassy Twitter accounts, which are of particular concern to me. Non-legislative action is being taken via the CDU.
It is fantastic to hear that those other things are happening—that is all well and good—but surely we should explicitly call out disinformation and misinformation in the Online Safety Bill. The package of other measures that the Minister mentions is fantastic, but I think they have to be in the Bill.
The hon. Lady says that those measures should be in the Bill—more than they already are—but as I have pointed out, the way in which the legal architecture of the Bill works means that the mechanisms to do that would be adding a criminal offence to schedule 7 as a priority offence, for example, or using a statutory instrument to designate the relevant kind of harm as a priority harm, which we plan to do in due course for a number of harms. The Bill can cover disinformation with the use of those mechanisms.
We have not put the content that is harmful to adults in the Bill itself; it will be set out in statutory instruments. The National Security Bill is still progressing through Parliament, and we cannot have in schedule 7 of this Bill an offence that has not yet been passed by Parliament. I hope that that explains the legal architecture and mechanisms that could be used under the Bill to give force to those matters.
On amendment 57, the Government feel that six months is a very short time within which to reach clear conclusions, and that 18 months is a more appropriate timeframe in which to understand how the Bill is bedding in and operating. Amendment 58 would require Ofcom to produce a code of practice on system-level disinformation. To be clear, the Bill already requires Ofcom to produce codes of practice that set out the steps that providers will take to tackle illegal content—I mentioned the new National Security Bill, which is going through Parliament—and harmful content, which may, in some circumstances, include disinformation.
Disinformation that is illegal or harmful to individuals is in scope of the duties set out in the Bill. Ofcom’s codes of practice will, as part of those duties, have to set out the steps that providers should take to reduce harm to users that arises from such disinformation. Those steps could include content-neutral design choices or interventions of other kinds. We would like Ofcom to have a certain amount of flexibility in how it develops those codes of practice, including by being able to combine or disaggregate those codes in ways that are most helpful to the general public and the services that have to pay regard to them. That is why we have constructed them in the way we have. I hope that provides clarity about the way that disinformation can be brought into the scope of the Bill and how that measure then flows through to the codes of practice. I gently resist amendments 57 and 58 while supporting the clause standing part of the Bill.
Question put, That the amendment be made.
The clause allows Ofcom to confer functions on the content board in relation to content-related functions under the Bill, but does not require it to do so. We take the view that how Ofcom manages its responsibilities internally is a matter for Ofcom. That may change over time. The clause simply provides that Ofcom may, if Ofcom wishes, ask its content board to consider online safety matters alongside its existing responsibilities. I trust that the Committee considers that a reasonable measure.
Labour welcomes the clause, which, as the Minister has said, sets out some important clarifications with respect to the Communications Act 2003. We welcome the clarification that the content board will have delegated and advisory responsibilities, and look forward to the Minister’s confirmation of exactly what those are and how this will work in practice. It is important that the content board and the advisory committee on disinformation and misinformation are compelled to communicate, too, so we look forward to an update from the Minister on what provisions in the Bill will ensure that that happens.
The shadow Minister has asked how this will work in practice, but as I said, the internal operation of Ofcom obviously is a matter for Ofcom. As Members have said in the recent past—indeed, in the last hour—they do not welcome undue Government interference in the operation of Ofcom, so it is right that we leave this as a matter for Ofcom. We are providing Ofcom with the power, but we are not compelling it to use that power. We are respecting Ofcom’s operational independence—a point that shadow Ministers and Opposition Members have made very recently.
Question put and agreed to.
Clause 131 accordingly ordered to stand part of the Bill.
Clause 132
Research about users’ experiences of regulated services
Question proposed, That the clause stand part of the Bill.
I agree with the right hon. Member for Basingstoke that these are important clauses. I want to put them into the context of what we heard from Frances Haugen, who, when she spoke to Congress, said that Facebook consistently chose to maximise its growth rather than implement safeguards on its platforms. She said:
“During my time at Facebook, I came to realise a devastating truth: Almost no one outside of Facebook knows what happens inside Facebook. The company intentionally hides vital information from the public, from the U.S. government, and from governments around the world.”
When we consider users’ experiences, I do not think it is good enough just to look at how the user engages with information. We need far more transparency about how the companies themselves are run. I would like to hear the Minister’s views on how this clause, which looks at users’ experiences, can go further in dealing with the harms at source, with the companies, and making sure a light is shone on their practices.
I welcome the support of the hon. Member for Pontypridd for these clauses. I will turn to the questions raised by my right hon. Friend the Member for Basingstoke. First, she asked whether Ofcom has to publish these reports so that the public, media and Parliament can see what they say. I am pleased to confirm that Ofcom does have to publish the reports; section 15 of the Communications Act 2003 imposes a duty on Ofcom to publish reports of this kind.
Secondly, my right hon. Friend asked about educating the public on issues pertinent to these reports, which is what we would call a media literacy duty. Again, I confirm that, under the Communications Act, Ofcom has a statutory duty to promote media literacy, which would include matters that flow from these reports. In fact, Ofcom published an expanded and updated set of policies in that area at the end of last year, which is why the old clause 103 in the original version of this Bill was removed—Ofcom had already gone further than that clause required.
Thirdly, my right hon. Friend asked about the changes that might happen in response to the findings of these reports. Of course, it is open to Ofcom—indeed, I think this Committee would expect it—to update its codes of practice, which it can do from time to time, in response to the findings of these reports. That is a good example of why it is important for those codes of practice to be written by Ofcom, rather than being set out in primary legislation. It means that when some new fact or circumstance arises or some new bit of research, such as the information required in this clause, comes out, those codes of practice can be changed. I hope that addresses the questions my right hon. Friend asked.
The hon. Member for Liverpool, Walton asked about transparency, referring to Frances Haugen’s testimony to the US Senate and her disclosures to The Wall Street Journal, as well as the evidence she gave this House, both to the Joint Committee and to this Committee just before the Whitsun recess. I have also met her bilaterally to discuss these issues. The hon. Gentleman is quite right to point out that these social media firms—he cited Facebook as an example, but there are others—are extremely secretive about what they say in public, to the media and even to representative bodies such as the United States Congress. That is why, as he says, it is extremely important that they are compelled to be a lot more transparent.
The Bill contains a large number of provisions compelling or requiring social media firms to make disclosures to Ofcom as the regulator. However, it is important to have public disclosure as well. It is possible that the hon. Member for Liverpool, Walton was not in his place when we came to the clause in question, but if he turns to clause 64 on page 56, he will see that it includes a requirement for Ofcom to give every provider of a relevant service a notice compelling them to publish a transparency report. I hope he will see that the transparency obligation that he quite rightly refers to—it is necessary—is set out in clause 64(1). I hope that answers the points that Committee members have raised.
Question put and agreed to.
Clause 132 accordingly ordered to stand part of the Bill.
Clause 133 ordered to stand part of the Bill.
Clause 134
OFCOM’s statement about freedom of expression and privacy
Question proposed, That the clause stand part of the Bill.
As we all know, the clause requires Ofcom to publish annual reports on the steps it has taken, when carrying out online safety functions, to uphold users’ rights under articles 8 and 10 of the convention, as required by section 6 of the Human Rights Act 1998. It will come as no surprise to the Minister that Labour entirely supports this clause.
Upholding users’ rights is a central part of this Bill, and it is a topic we have debated repeatedly in our proceedings. I know that the Minister faces challenges of his own, as the Opposition do, regarding the complicated balance between freedom of speech and safety online. It is only right and proper, therefore, for Ofcom to have a specific duty to publish reports about what steps it is taking to ensure that the online space is fair and equal for all.
That being said, we know that we can and should go further. My hon. Friend the Member for Batley and Spen will shortly address an important new clause tabled in her name—I believe it is new clause 25—so I will do my best not to repeat her comments, but it is important to say that Ofcom must be compelled to publish reports on how its overall regulatory operating function is working. Although Labour welcomes clause 134 and especially its commitment to upholding users’ rights, we believe that when many feel excluded in the existing online space, Ofcom can do more in its annual reporting. For now, however, we support clause 134.
I welcome the shadow Minister’s continuing support for these clauses. Clause 134 sets out the requirement on Ofcom to publish reports setting out how it has complied with articles 8 and 10 of the European convention on human rights.
I will pause for a second, because my hon. Friend the Member for Don Valley and others have raised concerns about the implications of the Bill for freedom of speech. In response to a question he asked last week, I set out in some detail the reasons why I think the Bill improves the position for free speech online compared with the very unsatisfactory status quo. This clause further strengthens that case, because it requires this report and reminds us that Ofcom must discharge its duties in a manner compatible with articles 8 and 10 of the ECHR.
From memory, article 8 enshrines the right to respect for private and family life, and article 10 enshrines the right to free speech, backed up by quite an extensive body of case law. The clause reminds us that the powers that the Bill confers on Ofcom must be exercised—indeed, can only be exercised—in conformity with the article 10 duties on free speech. I hope that that gives my hon. Friend additional assurance about the strength of free speech protection inherent in the Bill. I apologise for speaking at a little length on a short clause, but I think that was an important point to make.
Question put and agreed to.
Clause 134 accordingly ordered to stand part of the Bill.
Clause 135
OFCOM’s transparency reports
Question proposed, That the clause stand part of the Bill.
Again, Labour welcomes clause 135, which places a duty on Ofcom to produce its own reports based on information from the transparency reports that providers are required to publish. However, the Minister will know that Labour feels the Bill has much more work to do on transparency more widely, as we have repeatedly outlined through our debates. The Minister rejected our calls for increased transparency when we were addressing, I believe, clause 61. We are not alone in feeling that transparency reports should go further. The sector and his own Back Benchers are calling for it, yet so far his Department has failed to act.
It is a welcome step that Ofcom must produce its own reports based on information from the providers’ transparency reports, but the ultimate guarantee that those reports give a truly accurate depiction of the situation online is for them to be made public. I know the Minister has concerns around security, and of course no one wants to see users exposed to harm unnecessarily—that is not what we are asking for here. I will refrain from repeating debates we have already had at length, but I wish to again put on the record our concerns about the transparency reporting process as it stands.
That being said, we support clause 135. It is right that Ofcom is compelled to produce its own reports; we just wish they were made public. With the transparency reports coming from the providers, we only wish they would go further.
I have spoken to these points previously, so I do not want to tax the Committee’s patience by repeating what I have said.
Question put and agreed to.
Clause 135 accordingly ordered to stand part of the Bill.
Clause 136
OFCOM’s report about researchers’ access to information
Question proposed, That the clause stand part of the Bill.
Again, Labour welcomes clause 136, which is a positive step towards a transparent approach to online safety, given that it requires Ofcom to publish a report about the access that independent researchers have, or could have, to matters relating to the online safety of regulated services. As my hon. Friend the Member for Worsley and Eccles South rightly outlined in an earlier sitting, Labour strongly believes that the transparency measures in the Bill do not go far enough.
Independent researchers already play a vital role in regulating online safety. Indeed, there are far too many to list, but many have supported me, and I am sure the Minister, in our research on the Bill. That is why we have tabled a number of amendments on this point, as we sincerely feel there is more work to be done. I know the Minister says he understands and is taking on board our comments, but thus far we have seen little movement on transparency.
In this clause we are specifically talking about access to information for researchers. Obviously, the transparency matters were covered in clauses 64 and 135. There is consensus across both parties that access to information for bona fide academic researchers is important. The clause lays out a path to take us in the direction of providing that access by requiring Ofcom to produce a report. We debated the matter earlier. The hon. Member for Worsley and Eccles South—I hope I got the pronunciation right this time—
The hon. Lady made some points about the matter in an earlier sitting, as the shadow Minister just said. It is an area we are giving some careful thought to, because it is important that it is properly academically researched. Although Ofcom is being well resourced, as we have discussed, with lots of money and the ability to levy fees, we understand that it does not have a monopoly on wisdom—as good a regulator as it is. It may well be that a number of academics could add a great deal to the debate by looking at some of the material held inside social media firms. The Government recognise the importance of the matter, and some thought is being given to these questions, but at least we can agree that clause 136 as drafted sets out a path that leads us in this important direction.
Question put and agreed to.
Clause 136 accordingly ordered to stand part of the Bill.
Clause 137
OFCOM’s reports
Briefly, before I hand over to my hon. Friend the Member for Worsley and Eccles South, I should say that Labour welcomes clause 137, which gives Ofcom a discretionary power to publish reports about certain online safety measures and matters. Clearly, it is important to give Ofcom the power to redact or exclude confidential matters where need be, and I hope that there will be a certain level of common sense and public awareness should information of this nature be excluded. As I have previously mentioned—I sound a bit like a broken record—Labour echoes the calls for more transparency, which my hon. Friend the Member for Batley and Spen will come on to in her new clause. However, broadly, we support this important clause.
I would like to press the Minister briefly on how exactly the exclusion of material from Ofcom reports will work in practice. Can he outline any specific contexts or examples, beyond commercial sensitivity and perhaps matters of national security, where he can envision this power being used?
I welcome the shadow Minister’s support for the clause, once again. The clause provides Ofcom with the power to publish relevant reports about online safety matters to keep users, the public and Parliament well informed. Again, clearly, it is up to Ofcom to decide how it publishes those reports; we will not compel it.
On the question about confidential material that might be withheld, the relevant language in clause 137 looks, to me, to precisely echo the language we saw previously in clause—where was it? Anyway, we have come across this in a previous clause. When it comes to publishing material that can be excluded, the language is just the same.
I would like to make it clear that, while this decision is obviously a matter for Ofcom, I would expect the exclusion to be used on a pretty rare basis. One would expect acutely commercially sensitive matters to be excluded or redacted. If there were very sensitive intellectual property, where exposing all of it would prejudice a company’s commercial interests, I would expect Ofcom to exercise the exclusion or at least redact what it publishes.
However, because transparency is so important—it is a point that the Committee has made repeatedly—I would expect these exclusions to be used sparingly, and only where absolutely necessary to protect matters such as commercial confidentiality or intellectual property. Even then, they should be used to the minimum extent necessary. I think this Committee, and Parliament, take the view that the power to withhold material from these reports, and from the reports about breaches—mentioned in the clause I was trying to reach for previously, which was clause 128(4)(b) and (5)(b); perhaps Hansard would be kind enough to clarify that point to make me look slightly more articulate than I in fact am—should be used only very carefully and very rarely. The Committee should be clear that the bias, as it were—the assumption—should be on the side of disclosure rather than withholding information.
Question put and agreed to.
Clause 137 accordingly ordered to stand part of the Bill.
Clause 138
Appeals against OFCOM decisions relating to the register under section 81
Question proposed, That the clause stand part of the Bill.
Good morning, Ms Rees. It is a pleasure to serve on the Committee with you in the Chair. Clause 138 allows companies to make appeals against Ofcom’s decisions regarding the categorisation of services within categories 1, 2A or 2B.
We have argued, many times, that we believe the Government’s size-based approach to categorisation is flawed. Our preference for an approach based on risk is backed up by the views of multiple stakeholders and the Joint Committee. It was encouraging to hear last week of the Minister’s intention to look again at the issues of categorisation, and I hope we will see movement on that on Report.
Clause 138 sets out that where a regulated provider has filed an appeal, they are exempt from carrying out the duties in the Bill that normally apply to services designated as category 1, 2A or 2B. That is concerning, given that there is no timeframe in which the appeals process must be concluded.
While the right to appeal is important, it is feasible that many platforms will raise appeals about their categorisation to delay the start of their duties under the Bill. I understand that the platforms will still have to comply with the duties that apply to all regulated services, but for a service that has been classified by Ofcom as high risk, it is potentially dangerous that none of the risk assessments or measures to address harm will be completed while the appeal is taking place. Does the Minister agree that the appeals process must be concluded as quickly as possible to minimise the risk? Will he consider putting a timeframe on that?
Clause 139 allows for appeals against decisions by Ofcom to issue notices about dealing with terrorism and child sexual abuse material, as well as against a confirmation decision or a penalty notice. As I have said, in general the right to appeal is important. However, how would an appeals system work if, for example, a company were appealing against a notice under clause 103? In what circumstances does the Minister imagine that a platform would appeal a notice from Ofcom requiring it to use accredited technology to identify child sexual abuse content and swiftly take that content down? It is vital that appeals processes are concluded as rapidly as possible, so that we do not risk people being exposed to harmful or dangerous content.
The shadow Minister has set out the purpose of the clauses, which provide for appeal rights: in clause 138, against decisions relating to registration under clause 81, and in clause 139, against Ofcom notices.
I agree that it is important that judicial decisions in this area are made quickly. I note that appeals go directly to the relevant upper tribunal, which is a higher tier of the tribunal system and tends to be a little less congested than the first-tier tribunal, which is often used for first-instance matters. I hope that the fact that appeals go directly to that more senior level provides some comfort.
On putting in a time limit, the general principle is that matters concerning listing are reserved to the judiciary. I recall from my time as a Minister in the Ministry of Justice that the judiciary guards its independence fiercely. Whether it is the Senior President of Tribunals or the Lord Chief Justice, they consider listing matters to be the preserve of the judiciary, not the Executive or the legislature. Compelling the judiciary to hear a case within a certain time might well be considered to infringe on those principles.
We can agree, however—I hope the people making those listing decisions hear this—that we believe, and Parliament believes, that it is important to do this quickly, in particular where there is a risk of harm to individuals. Where there is risk to individuals, especially children, but more widely as well, those cases should be heard very expeditiously indeed.
The hon. Member for Worsley and Eccles South also asked about the basis on which appeals might be made and decided. I think that is made fairly clear. For example, clause 139(3) makes it clear that, in deciding an appeal, the upper tribunal will apply the same principles as the High Court would apply to an application for judicial review—standard JR terms—which in the context of notices served or decisions made under clause 103 might include whether the power had been exercised in conformity with statute. If the power were exercised, or purported to be exercised, in a manner not authorised by statute, that would be one ground for appeal; if a decision were considered so grossly unreasonable that no reasonable decision maker could make it, that might be grounds for appeal as well.
I caution the Committee, however: I am not a lawyer and my interpretation of judicial review principles should not be taken as definitive. Lawyers will advise their clients when they come to apply the clause in practice and they will not take my words in Committee as definitive when it comes to determining “standard judicial review principles”—those are well established in law, regardless of my words just now.
There is a concern that platforms might raise appeals about their categorisation in order to delay the start of their duties under the Bill. How would the Minister act if that happened—if a large number of appeals were pending and the duties under the Bill therefore did not commence?
Clearly, resourcing of the upper tribunal is a matter decided by the Lord Chancellor and Secretary of State for Justice, in consultation with the Lord Chief Justice and, in this case, the Senior President of Tribunals. Parliament would expect the resourcing of that part of the upper tribunal to be such that cases could be heard in an expedited manner. Particularly where cases concern the safety of the public—especially of children—we expect that to be done as quickly as it can be.
Question put and agreed to.
Clause 138 accordingly ordered to stand part of the Bill.
Clause 139 ordered to stand part of the Bill.
Clause 140
Power to make super-complaints
I beg to move amendment 143, in clause 140, page 121, line 1, after “services” insert “, consumers”.
The Bill currently specifies that super-complaints can be made to Ofcom by bodies representing users or members of the public. The addition of consumer representatives through the amendments is important: consumer representatives are a key source of information about the widespread harms experienced by users of the online services that this legislation will regulate. We support the amendments, which would add consumers to the list of those eligible to make super-complaints.
Clearly, we want the super-complaint function to be as effective as possible and for groups of relevant people, users or members of the public to be able to be represented by an eligible entity to raise super-complaints. I believe we are all on the same page in wanting to do that. If I am honest, I am a little confused as to what the addition of the term “consumers” will add. The term “users” is defined quite widely, via clause 140(6), which then refers to clause 181, where, as debated previously, a “user” is defined widely to include anyone using a service, whether registered or not. So if somebody stumbles across a website, they count as a user, but the definition being used in clause 140 about bringing super-complaints also includes “members of the public”—that is, regular citizens. Even if they are not a user of that particular service, they could still be represented in bringing a complaint.
Given that, by definition, “users” and “members of the public” already cover everybody in the United Kingdom, I am not quite sure what the addition of the term “consumers” adds. By definition, consumers are a subset of the group “users” or “members of the public”. It follows that in seeking to become an eligible entity, no eligible entity will purport to act for everybody in the United Kingdom; they will always be seeking to define some kind of subset of people. That might be children, people with a particular vulnerability or, indeed, consumers, who are one such subset of “members of the public” or “users”. I do not honestly understand what the addition of the word “consumers” adds here when everything is covered already.
Will the Minister explicitly say that he thinks that an eligible entity, acting on behalf of consumers, could, if it fulfils the other criteria, bring a super-complaint?
Yes, definitely. That is the idea of an eligible entity, which could seek to represent a particular demographic, such as children or people from a particular marginalised group, or it could represent people who have a particular interest, which would potentially include consumers. So I can confirm that that is the intention behind the drafting of the Bill. Having offered that clarification and made clear that the definition is already as wide as it conceivably can be—we cannot get wider than “members of the public”—I ask the hon. Member for Aberdeen North to consider withdrawing the amendments, particularly as there are so many. It will take a long time to vote on them.
I thank the Minister for the clarification. Given that he has explicitly said that he expects that groups acting on behalf of consumers could, if they fulfil the other criteria, be considered as eligible entities for making super-complaints, I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Amendment proposed: 66, in clause 140, page 121, line 8, at end insert—
“(d) causing harm to any human or animal.”
This amendment ensures groups are able to make complaints regarding animal abuse videos.—(Alex Davies-Jones.)
I beg to move amendment 77, in clause 140, page 121, line 9, leave out subsection (2).
This amendment removes the tests that complaints have to be of particular importance in order to be admissible.
When I first read clause 140, subsection (2) raised a significant number of red flags for me. The subsection might be reasonable if we did not have giant companies—social media platforms particularly—that significant numbers of people across the UK use regularly. Facebook might be counted as a “single regulated service”, but 85% of UK residents—57.1 million people—had a Facebook account earlier this year. Twitter is used by 28% of people living in the UK, which is 19 million users. TikTok is at 19%, which is significantly less, but still a very high number of people—13 million users. I can understand the decision that a super-complaint picking on one certain company might be a bit extreme, but it does not make sense when we are considering the Facebooks of this world.
If someone is making a complaint about a single regulated service and that service is Facebook, Twitter, TikTok or another large platform—or a new, yet-to-be-created platform—that significant numbers of people use, there is no justification for treating that complaint differently just because it is against a single entity. When a complaint is made against Facebook—I am picking on Facebook because 85% of the UK public are members of it; it is an absolute behemoth—I would like there to be no delay in its being taken to Ofcom. I would like Ofcom not to have to check and justify that the complaint is “of particular importance”.
Subsection (2)(a) states that one of the tests of the complaint should be that it “is of particular importance” or, as subsection (2)(b) notes, that it
“relates to the impacts on a particularly large number of users of the service or members of the public.”
I do not understand what
“large number of users of the service”
would mean. Does a large number of the users of Facebook mean 50% of its users? Does it mean 10%? What is a large number? Is that in percentage terms, or is it something that is likely to impact 1 million people? Is that a large number? The second part—
“large number…of members of the public”—
is again difficult to define. I do not think there is justification for this additional hoop just because the complaint relates to a single regulated service.
Where a complaint relates to a very small platform that is not causing significant illegal harm, I understand that Ofcom may want to consider whether it will accept, investigate and give primacy and precedence to that. If the reality is that the effect is non-illegal, fairly minor and impacts a fairly small number of people, in the order of hundreds instead of millions, I can understand why Ofcom might not want to give that super-complaint status and might not want to carry out the level of investigation and response necessary for a super-complaint. But I do not see any circumstances in which Ofcom could justify rejecting a complaint against Facebook simply because it is a complaint against a single entity. The reality is that if something affects one person on Facebook, it will affect significantly more than one person on Facebook because of Facebook’s absolutely massive user base. Therefore this additional hoop is unrealistic.
Paragraph (a), about the complaint being “of particular importance”, is too woolly. Does it relate only to complaints about things that are illegal? Does it relate only to things that are particularly urgent—something that is happening now and that is having an impact today? Or is there some other criterion that we do not yet know about?
I would very much appreciate it if the Minister could give some consideration to amendment 77, which would simply remove subsection (2). If he is unwilling to remove that subsection, I wonder whether we could meet halfway: category 1 providers, for instance, could all be exempted from the additional “single provider” test, because Ofcom has already assessed them as carrying particular risks on their platforms. That group is wider than the three names I have mentioned, and I think that would be a reasonable and realistic decision for the Government—and direction for Ofcom—to take. It would be sensible.
If the Government believe that there is more information—more direction—that they could add to the clause, it would be great if the Minister could lay some of that out here and let us know how he intends subsection (2) to operate in practice and how he expects Ofcom to use it. I get that people might want it there as an additional layer of protection, but I genuinely do not imagine that it can be justified in the case of the particularly large providers, where there is significant risk of harm happening.
I will illustrate that with one last point. The Minister referred earlier to when Facebook—Meta—stopped proactively scanning for child sexual abuse images because of an issue in Europe. He mentioned the significant amount of harm and the issues that were caused in a very short period. And that was one provider—the largest provider that people use and access. A massive amount of harm can be caused in a very short period. I do not support allowing Meta or any other significantly large platform to have a “get out of jail” card. I do not want them to be able to go to Ofcom and say, “Hey, Ofcom, we’re challenging you on the basis that we don’t think this complaint is of particular importance,” or, “We don’t think the complaint relates to the impacts on a particularly large number of users of the service or members of the public.” I do not want them to be able to wriggle out of things because this subsection is in the Bill, so any consideration the Minister could give to improving clause 140 and subsection (2) would be very much appreciated.
I think the Committee, and the House, are pretty unanimous in agreeing that the power to make super-complaints is important. As we have discussed, all kinds of groups—children, under-represented groups and consumers among them—would benefit from being represented where systemic issues are not being addressed, or where Ofcom may somehow have overlooked or missed them in the discharge of its enforcement powers.
I would observe in passing that one of the bases on which super-complaints can be made—this may be of interest to my hon. Friend the Member for Don Valley—is where there is a material risk under clause 140(1)(b) of
“significantly adversely affecting the right to freedom of expression within the law of users of the services or members of the public”.
That clause is another place in the Bill where freedom of expression is expressly picked out and supported. If freedom of expression is ever threatened in a way that we have not anticipated and that the Bill does not provide for, there is a particular power here for a particular free speech group, such as the Free Speech Union, to make a super-complaint. I hope that my hon. Friend finds the fact that freedom of expression is expressly laid out there reassuring.
Let me now speak to the substance of amendment 77, tabled by the hon. Member for Aberdeen North. It is important first to keep in mind the purpose of super-complaints, which, as I said a moment ago, is to provide a basis for raising issues of widespread and systemic importance. That is the reason for some of the criteria in subsections (1)(a), (b) and (c), and why we have subsection (2): we want to ensure that super-complaints are raised only if they are of a very large scale or have a profound impact on freedom of speech or some other matter of particular importance. That is why the tests, hurdles and thresholds set out in clause 140(2) have to be met.
If we were to remove subsection (2), as amendment 77 seeks to, that would significantly lower the threshold. We would end up having super-complaints that were almost individual in nature. We set out previously why we think an ombudsman-type system or having super-complaints used for near-individual matters would not be appropriate. That is why the clause is there, and I think it is reasonable that it is.
The hon. Lady asked a couple of questions about how this arrangement might operate in practice. She asked whether a company such as Facebook would be caught if it alone were doing something inappropriate. The answer is categorically yes, because the condition in clause 140(2)(b)—
“impacts on a particularly large number of users”,
which would be a large percentage of Facebook’s users,
“or members of the public”—
would be met. Facebook and—I would argue—any category 1 company would, by definition, be affecting large numbers of people. The very definition of category 1 includes the concept of reach—the number of people being affected. That means that, axiomatically, clause 140(2)(b) would be met by any category 1 company.
The hon. Lady also raised the case of Facebook, for a period of time in Europe, unilaterally ceasing to scan for child sexual exploitation and abuse images, which, as mentioned, meant that huge numbers of child sexual abuse images—and, consequently, huge numbers of paedophiles—went undetected. She asks how such a situation would be handled under the clause if somebody wanted to raise a super-complaint about it. Hopefully, Ofcom would stop it happening in the first place, but if it did not, the super-complaint redress mechanism would be the right one. Such a situation would categorically be caught by clause 140(2)(a), because it is clearly of particular importance.
In any reasonable interpretation of the words, the test of “particular importance” is manifestly met when it comes to stopping child sexual exploitation and abuse and the detection of those images. That example would categorically qualify under the clause, and a super-complaint could, if necessary, be brought. I hope it would never be necessary, because that is the kind of thing I would expect Ofcom to catch.
Having talked through the examples from the hon. Lady, I hope I have illustrated how the clause will ensure that either large-scale issues affecting large numbers of people or issues that are particularly serious will still qualify for super-complaint status with subsection (2) left in the Bill. Given those assurances, I urge the hon. Member to consider withdrawing her amendment.
I welcome the Minister’s fairly explicit explanation that he believes every category 1 company would be in scope, even if a complaint were made against one single provider. I would like to push the amendment to a vote on the basis of the comments I made earlier and the fact that each of these platforms is different. We have heard concerns about, for example, Facebook groups set up around celebrating eight-year-olds’ birthdays. We have heard about the amount of porn on Twitter, which Facebook does not have in the same way. We have heard about the algorithms that take people down a certain path on TikTok. We have heard all these concerns, but they are all specific to one provider; they are not generic complaints that could be brought against a group of providers.
Would the hon. Lady not agree that in all those examples—including TikTok leading people down dark paths—the conditions in subsection (2) would be met? The examples she has just referred to are, I would say, certainly matters of particular importance. Because the platforms she mentions are big in scale, they would also meet the test of scale in paragraph (b). In fact, only one of the tests has to be met—it is one or the other. In all the examples she has just given, not just one test—paragraph (a) or (b)—would be met, but both. So all the issues she has raised would make a super-complaint eligible to be made.
I am glad the Minister confirms that he expects that that would be the case. I am clearer now that he has explained it, but on my reading of the clause, the definitions of “particular importance” or
“a particularly large number of users…or members of the public”
are not clear. I wanted to ensure that this was put on the record. While I do welcome the Minister’s clarification, I would like to push amendment 77 to a vote.
Question put, That the amendment be made.
Amendment 113 was tabled by Paul Maynard, who is not on the Committee. Does any Member wish to move the amendment?
Amendment proposed: 113, in clause 150, page 127, line 28, at end insert “; or
(b) physical harm resulting from an epileptic seizure, where the seizure has been triggered by the intentional sending of flashing images to a person with epilepsy.”—(Kim Leadbeater.)
Good morning, ladies and gentlemen. Please ensure your phones are switched to silent.
Clause 168
Publication by OFCOM
Question proposed, That the clause stand part of the Bill.
It is a pleasure to serve under your chairmanship, Sir Roger. Clause 168 is a very short and straightforward clause. Ofcom will be required to publish a variety of documents under the Online Safety Bill. The clause simply requires that this be done in a way that is appropriate and likely to bring it to the attention of any audience who are going to be affected by it. Ofcom is already familiar with this type of statutory obligation through existing legislation, such as the Digital Economy Act 2017, which places similar obligations on Ofcom. Ofcom is well versed in publishing documents in a way that is publicly accessible. Clause 168 puts the obligation on to a clear statutory footing.
As the Minister said, clause 168 rightly sets out that the material the Bill requires Ofcom to publish must be published in a way that will bring it to the attention of any audience likely to be affected by it. It will be important for all the guidance to be published in a way that is easily available and accessible, including for people who are not neurotypical or who experience digital exclusion. I think we would all agree, after the work we have done on the Bill, that the subject matter is complex and the landscape difficult to understand. I hope Ofcom will make its documents as accessible as possible.
Question put and agreed to.
Clause 168 accordingly ordered to stand part of the Bill.
Clause 169
Service of notices
Question proposed, That the clause stand part of the Bill.
Clause 169 sets out the process for the service of any notice under the Bill, including notices to deal with child sexual exploitation and abuse or terrorism content, information notices, enforcement notices, penalty notices and public statement notices to providers of regulated services both within and outside the United Kingdom. The clause sets out that Ofcom may give a notice to a person by handing it to them, leaving it at the person’s last known address, sending it by post to that address or sending it by email to the person’s email address. It provides clarity regarding who Ofcom must give notice to in respect of different structures. For example, notice may be given to an officer of a body corporate.
As the Minister said, clause 169 sets out the process of issuing notices or decisions by Ofcom. It mostly includes provisions about how Ofcom is to contact the company, which seem reasonable. The Opposition do not oppose clause 169.
Question put and agreed to.
Clause 169 accordingly ordered to stand part of the Bill.
Clause 170
Repeal of Part 4B of the Communications Act
Question proposed, That the clause stand part of the Bill.
Clause 170 repeals the video-sharing platform regime. While the VSP and online safety regimes have similar objectives, the new framework in the Bill will be broader and will apply to a wider range of online platforms. It is for that reason that we will repeal the VSP regime and transition the entities regulated as VSPs across to the online safety regime, which is broader and more effective in its provisions. The clause simply sets out the intention to repeal the VSP regime.
Clause 171 repeals part 3 of the Digital Economy Act 2017. As we have discussed previously, the Online Safety Bill now captures all online sites that display pornography, including commercial pornography sites, social media sites, video sharing platforms, forums and search engines. It will provide much greater protection to children than the Digital Economy Act. The Digital Economy Act was criticised for not covering social media platforms, which this Bill does cover. By removing that section from the Digital Economy Act, we are laying the path to regulate properly and more comprehensively.
Finally, in this group, clause 172 amends section 1B of the Protection of Children Act 1978 and creates a defence to the offence of making an indecent photograph of a child for Ofcom, its staff and those assisting Ofcom in exercising its online safety duties. Clearly, we do not want to criminalise Ofcom staff while they are discharging their duties under the Bill that we are imposing on them, so it is reasonable to set out that such a defence exists. I hope that provides clarity to the Committee on the three clauses.
The provisions in clauses 170 to 172, as the Minister has said, repeal or amend existing laws for the purposes of the Bill. As Labour supports the need to legislate on the issue of online safety, we will not oppose the clauses. However, I want to note that the entire process, up until the final abandonment of part 3 of the Digital Economy Act under clause 171, appears shambolic. It has been five years now since that part of the Act could have been implemented—five years during which children could have been better protected from the harms of pornographic content.
When the Government eventually admitted that part 3 was being ditched, the Minister at the time, the hon. Member for Boston and Skegness (Matt Warman), said that the Government would seek to take action on pornography more quickly than on other parts of the online harms regime. Stakeholders and charities have expressed concerns that we could now see a delay to the implementation of the duties on pornographic content providers, which is similar to the postponement and eventual abandonment of part 3 of the Digital Economy Act. I appreciate that the Minister gave some reassurance of his
“desire to get this done as quickly as possible”—[Official Report, Online Safety Bill Committee, 9 June 2022; c. 308.]
in our debate on clauses 31 to 33, but would it not be better to set out timeframes in the Bill?
Under clause 193, it appears that the only clauses in part 5 to be enacted once the Bill receives Royal Assent will be the definitions—clause 66 and clause 67(4)—and not the duties. That is because Ofcom is expected to issue a call for evidence, after which draft proposals for consultation are published, which then need to be agreed by the Secretary of State and laid before Parliament. There are opportunities there for delays and objections at any stage and, typically, enforcement will be implemented only in a staged fashion, from monitoring to supervision. The consultations and safeguarding processes are necessary to make the guidance robust; we understand that. However, children cannot wait another three years for protections, having been promised protection under part 3 of the Digital Economy Act five years ago, which, as I have said, was never implemented.
The provisions on pornography in part 5 of the Bill require no secondary legislation so they should be implemented as quickly as possible to minimise the amount of time children continue to be exposed to harmful content. It would be irresponsible to wait any longer than absolutely necessary, given the harms already caused by this drawn-out process.
Thank you, Sir Roger, for chairing this meeting this morning. I want to agree with the Opposition’s points about the timing issue. If one Act is to repeal another, there must be no gap in the middle: if the repeal takes effect on a given day, the corresponding provisions of this Bill must be in force and working on that same day, rather than leaving a potential set-up gap.
On clause 170 and the repeal of the part of the Communications Act 2003 covering video-sharing platform services, some concerns have been raised that the requirements in the Online Safety Bill do not exactly mirror the provisions in the video-sharing platform rules. I am not saying necessarily or categorically that the Online Safety Bill is less strong than the video-sharing platform rules currently in place. However, once the legislation on video-sharing platform services is repealed, the Online Safety Act, as it will be, will become the main way of regulating video-sharing platforms, and there is a risk of a degradation in the protections provided on those platforms and an increase in some of the issues and concerns we have seen raised. Will the Minister keep that under review and consider how it could be addressed? We do not want to see things get worse simply because one regime has been switched for another that, as the Minister said, is broader and has stronger protections. Will he keep under review whether that turns out to be the case when the Act has bedded in and Ofcom has the ability to take action and properly regulate—particularly, in this case, video-sharing platforms?
I agree with the hon. Member for Worsley and Eccles South, that we want to see these provisions brought into force as quickly as possible, for the reasons that she set out. We are actively thinking about ways of ensuring that these provisions are brought into force as fast as possible. It is something that we have been actively discussing with Ofcom, and that, I hope, will be reflected in the road map that it intends to publish before the summer. That will of course remain an area of close working between the Department for Digital, Culture, Media and Sport and Ofcom, ensuring that these provisions come into force as quickly as possible. Of course, the illegal duties will be brought into force more quickly. That includes the CSEA offences set out in schedule 6.
The hon. Member for Aberdeen North raised questions in relation to the repeal of part 3 of the Digital Economy Act. Although that is on the statute book, it was never commenced. When it is repealed, we will not be removing from force something that is applied at the moment, because the statutory instrument to commence it was never laid. So the point she raised about whether the Bill would come into force the day after the Digital Economy Act is repealed does not apply; but the point she raised about bringing this legislation into force quickly is reasonable and right, and we will work on that.
The hon. Lady asked about the differences in scope between the video-sharing platform and the online safety regime. As I said, the online safety regime does have an increased scope compared with the VSP regime, but I think it is reasonable to keep an eye on that as she suggested, and keep it under review. There is of course a formal review mechanism in clause 149, but I think that more informally, it is reasonable that as the transition is made we keep an eye on it, as a Government and as parliamentarians, to ensure that nothing gets missed out.
I would add that, separately from the Bill, the online advertising programme is taking a holistic look at online advertising in general, and that work will also consider matters that may touch on VSPs and what they regulate.
Question put and agreed to.
Clause 170 accordingly ordered to stand part of the Bill.
Clauses 171 and 172 ordered to stand part of the Bill.
Clause 173
Powers to amend section 36
Question proposed, That the clause stand part of the Bill.
The clause gives the Secretary of State the power to amend the list of fraudulent offences in section 36, which underpins the duties on fraudulent advertising. Those are the new duties introduced following feedback from Parliament, the Joint Committee, Martin Lewis and many other people. The power is there to ensure that we can keep the list of fraudulent offences up to date, and it is subject to some constraints, as we would expect. The clause lists the criteria that any new offence must meet before the Secretary of State can include it in the section 36 list, which relate to the prevalence on category 1 services of paid-for advertisements amounting to the new offence, and to the risk and severity of harm that such content poses to individuals in the UK.
The clause further limits the Secretary of State’s power to include new fraud offences, listing types of offence that may not be added. Offences from the Consumer Protection from Unfair Trading Regulations would be one instance. As I mentioned, the power to update section 36 is necessary to ensure that the legislation is future-proofed against new legislation and changes in criminal behaviour. Hon. Members have often said that it is important to ensure that the Bill is future-proof, and here is an example of exactly that future-proofing.
I have a couple of questions, particularly on clause 176 and the powers to amend schedules 6 and 7. I understand the logic for schedule 5 being different—in that terrorism offences are a wholly reserved matter—and therefore why only the Secretary of State would be making any changes.
My question is on the difference in the ways to amend schedules 6 and 7—I am assuming that Government amendment 126, which asks the Secretary of State to consult Scottish Ministers and the Department of Justice in Northern Ireland, and which we have already discussed, will be voted on and approved before we come to clause 176. I do not understand the logic for having different procedures to amend the child sexual exploitation and abuse offences and the priority offences. Why have the Government chosen two different procedures for amending the two schedules?
I understand why that might not be a terribly easy question to answer today, and I would be happy for the Minister to get in touch afterwards with the rationale. It seems to me that both areas are very important, and I do not quite understand why the difference is there.
Let me start by addressing the questions the shadow Minister raised about these powers. She used the phrase “free rein” in her speech, but I would not exactly describe it as free rein. If we turn to clause 179, which we will come to in a moment or two, and subsection (1)(d), (e), (f) and (g), we see that all the regulations made under clauses 173 to 176, which we are debating, require an SI under the affirmative procedure. Parliament will therefore get a chance to have its say, to object and indeed to vote down a provision if it wishes to. It is not that the Secretary of State can act alone; changes are subject to the affirmative SI procedure.
It is reasonable to have a mechanism to change the lists of priority offences and so on by affirmative SI, because the landscape will change and new offences will emerge, and it is important that we keep up to date. The only alternative is primary legislation, and a slot for a new Act of Parliament does not come along all that often—perhaps once every few years for any given topic. I think that would lead to long delays—potentially years—before the various exemptions, lists of priority offences and so on could be updated. I doubt that it is Parliament’s intention, and it would not be good for the public if we had to wait for primary legislation to change the lists. The proposed mechanism is the only sensible and proportionate way to do it, and it is subject to a parliamentary vote.
A comment was made about Ofcom’s independence. The way the offences are defined has no impact on Ofcom’s operational independence. That is about how Ofcom applies the rules; this is about what the rules themselves are. It is right that we are able to update them relatively nimbly by affirmative SI.
The hon. Member for Aberdeen North asked about the differences in the way schedules 6 and 7 can be updated. I will happily drop her a line with further thoughts if she wants me to, but in essence we are happy to get the Scottish child sexual exploitation and abuse offences, set out in part 2 of schedule 6, adopted as soon as Scottish Ministers want. We do not want to delay any measures on child exploitation and abuse, and that is why it is done automatically. Schedule 7, which sets out the other priority offences, could cover any topic at all—any criminal offence could fall under that schedule—whereas schedule 6 is only about child sexual exploitation and abuse. Given that the scope of schedule 7 takes in any criminal offence, it is important to consult Scottish Ministers if it is a Scottish offence but then use the statutory instrument procedure, which applies it to the entire UK internet. Does the hon. Lady want me to write to her, or does that answer her question?
That is actually incredibly helpful. I do not need a further letter, thanks.
I am grateful to the hon. Lady for saving DCMS officials a little ink, and electricity for an email.
I hope I have addressed the points raised in the debate, and I commend the clause to the Committee.
Question put and agreed to.
Clause 173 accordingly ordered to stand part of the Bill.
Clauses 174 and 175 ordered to stand part of the Bill.
Clause 176
Powers to amend Schedules 5, 6 and 7
Amendment made: 126, in clause 176, page 145, line 4, at end insert—
“(5A) The Secretary of State must consult the Scottish Ministers before making regulations under subsection (3) which—
(a) add an offence that extends only to Scotland, or
(b) amend or remove an entry specifying an offence that extends only to Scotland.
(5B) The Secretary of State must consult the Department of Justice in Northern Ireland before making regulations under subsection (3) which—
(a) add an offence that extends only to Northern Ireland, or
(b) amend or remove an entry specifying an offence that extends only to Northern Ireland.”—(Chris Philp.)
This amendment ensures that the Secretary of State must consult the Scottish Ministers or the Department of Justice in Northern Ireland before making regulations which amend Schedule 7 in connection with an offence which extends to Scotland or Northern Ireland only.
Clause 176, as amended, ordered to stand part of the Bill.
Clause 177
Power to make consequential provision
Question proposed, That the clause stand part of the Bill.
With this it will be convenient to discuss the following:
Clause 178 stand part.
Government amendment 160.
Clause 179 stand part.
As new services and functions emerge and evolve, and platforms and users develop new ways to interact online, the regime will need to adapt. Harms online will also continue to change, and the framework will not function effectively if it cannot respond to these changes. These clauses provide the basis for the exercise of the Secretary of State’s powers under the Bill to make secondary legislation. The Committee has already debated the clauses that confer the relevant powers.
Clause 177 gives the Secretary of State the power to make consequential changes to this legislation or regulations made under it. It further provides that the regulations may amend or repeal relevant provisions made under the Communications Act 2003 or by secondary legislation made under that Act. The power is necessary to give effect to the various regulation-making powers in the Bill, which we have mostly already debated, and to ensure that the provisions of the 2003 Act and regulations that relate to online safety can continue to be updated as appropriate. That is consistent with the principle that the Bill must be flexible and future-proof. The circumstances in which these regulation-making powers may be exercised are specified and constrained by the clauses we have previously debated. Clause 178 ensures that the regulation-making powers in the Bill may make different provisions for different purposes, in particular ensuring that regulations make appropriate provisions for different types of service.
Amendment 160 forms part of a group of amendments that will allow Ofcom to recover costs from the regulated services for work that Ofcom carries out before part 6 of the Bill is commenced. As I said previously, the costs may be recouped over a period of three to five years. Currently, the costs of preparations for the exercise of safety functions include only costs incurred after commencement. The amendment makes sure that initial costs incurred before commencement can be recouped as well.
I rise briefly to support amendment 76, in the name of the hon. Member for Aberdeen North. Labour supports broadening the definition of “content” in this way. I refer the Minister to our earlier contributions about the importance of including newspaper comments, for example, in the scope of the Bill. This is a clear example of a key loophole in the Bill. We believe that a broadened definition of “content” would be a positive step towards future-proofing the Bill and preventing unnecessary harm from new forms of content.
The shadow Minister, in her first contribution to the debate, introduced the broad purpose of the various clauses in this group, so I do not propose to repeat those points.
I would like to touch on one or two issues that came up. One is that clause 187 defines the meaning of “harm” throughout the Bill, although clause 150, as we have discussed, has its own internal definition of harm that is different. The more general definition of harm is made very clear in clause 187(2), which states:
“‘Harm’ means physical or psychological harm.”
That means that harm has a very broad construction in the Bill, as it should, to make sure that people are being protected as they ought to be.
In one of our earlier debates, I asked the Minister about the difference between “oral” and “aural”, and I did not get a very satisfactory answer. I know the difference in their dictionary definition—I understand that they are different, although the words sound the same. I am confused that clause 189 uses “oral” as part of the definition of content, but clause 49 refers to
“one-to-one live aural communications”
in defining things that are excluded.
I do not understand why the Government have chosen to use those two different words in different places in the Bill. It strikes me that, potentially, we mean one or the other. If they do mean two different things, why has one thing been chosen for clause 49 and another thing for clause 189? Why has the choice been made that clause 49 relates to communications that are heard, but clause 189 relates to communications that are said? I do not quite get the Government’s logic in using those two different words.
I know this is a picky point, but in order to have good legislation, we want it to make sense, for there to be a good rationale for everything that is in it and for people to be able to understand it. At the moment, I do not properly understand why the choice has been made to use two different words.
More generally, the definitions in clause 189 seem pretty sensible, notwithstanding what I said in the previous debate in respect of amendment 76, which, with your permission, Sir Roger, I intend to move when we reach the appropriate point.
As the hon. Member for Pontypridd said, clause 189 sets out various points of definition and interpretation necessary for the Bill to be understood and applied.
I turn to the question raised by the hon. Member for Aberdeen North. First, I strongly commend and congratulate her on having noticed the use of the two words. Anyone who thinks that legislation does not get properly scrutinised by Parliament has only to look to the fact that she spotted this difference, 110 pages apart, in two different clauses—clauses 49 and 189. That shows that these things do get properly looked at. I strongly congratulate her on that.
I think the best way of addressing her question is probably to follow up with her after the sitting. Clause 49 relates to regulated user-to-user content. We are in clause 49(2)—is that right?
It is cross-referenced in subsection (5). The use of the term “aural” in that subsection refers to sound only—what might typically be considered telephony services. “Oral” is taken to cover livestreaming, which includes pictures and voice. That is the intention behind the use of the two different words. If that is not sufficient to explain the point—it may not be—I would be happy to expand in writing.
That would be helpful, in the light of the concerns I raised and what the hon. Member for Pontypridd mentioned about gaming, and how those communications work on a one-to-one basis. Having clarity in writing on whether clause 49 relates specifically to telephony-type services would be helpful, because that is not exactly how I read it.
Given that the hon. Lady has raised the point, it is reasonable that she requires more detail. I will follow up in writing on that point.
Amendment proposed: 76, in clause 189, page 154, line 34, after “including” insert “but not limited to”.—(Kirsty Blackman.)
This amendment clarifies the definition of “content” in the bill in order that anything communicated by means of an internet service is considered content, not only those examples listed.
Question put, That the amendment be made.
Labour has not tabled any amendments to clause 190, which lists the provisions that define or explain terms used in the Bill. However, it will come as no surprise that we dispute the Bill’s definition of harm, and I am grateful to my hon. Friend the Member for Batley and Spen for raising those important points in our lively debate about amendment 112 to clause 150. We maintain that the Minister has missed the point, in that the Bill’s definition of harm fails to truly capture physical harm caused as a consequence of being online. I know that the Minister has promised to closely consider that as we head to Report stage, but I urge him to bear in mind the points raised by Labour, as well as his own Back Benchers.
The Minister knows, because we have repeatedly raised them, that we have concerns about the scope of the Bill’s provisions relating to priority content. I will not repeat myself, but he will be unsurprised to learn that this is an area in which we will continue to prod as the Bill progresses through Parliament.
I have made points on those issues previously. I do not propose to repeat now what I have said before.
Question put and agreed to.
Clause 190 accordingly ordered to stand part of the Bill.
Clause 191 ordered to stand part of the Bill.
Clause 192
Extent
I beg to move amendment 141, in clause 192, page 160, line 9, at end insert—
“(aa) section (Offence under the Obscene Publications Act 1959: OFCOM defence);”.
This amendment provides for NC35 to extend only to England and Wales.
With this it will be convenient to discuss Government new clause 35—Offence under the Obscene Publications Act 1959: OFCOM defence—
“(1) Section 2 of the Obscene Publications Act 1959 (prohibition of publication of obscene matter) is amended in accordance with subsections (2) and (3).
(2) After subsection (5) insert—
“(5A) A person shall not be convicted of an offence against this section of the publication of an obscene article if the person proves that—
(a) at the time of the offence charged, the person was a member of OFCOM, employed or engaged by OFCOM, or assisting OFCOM in the exercise of any of their online safety functions (within the meaning of section 188 of the Online Safety Act 2022), and
(b) the person published the article for the purposes of OFCOM’s exercise of any of those functions.”
(3) In subsection (7)—
(a) the words after “In this section” become paragraph (a), and
(b) at the end of that paragraph, insert “;
(b) “OFCOM” means the Office of Communications.””
This new clause (to be inserted after clause 171) amends section 2 of the Obscene Publications Act 1959 to create a defence for OFCOM and their employees etc to the offence of the publication of an obscene article.
New clause 35 amends section 2 of the Obscene Publications Act 1959 to create a defence for Ofcom to the offence of publishing an obscene article where Ofcom is exercising its online safety duties. Ofcom has a range of functions that may result in its staff handling such content, so we want to ensure that that is covered properly. We have debated that already.
Clause 192 covers territorial extent. The regulation of the internet, as a reserved matter, covers all of the United Kingdom, but particular parts of the Bill extend to particular parts of the UK. The Obscene Publications Act 1959 extends only to England and Wales, so in amending it we must ensure that the new defence applies in the right parts of the United Kingdom. The clause and our amendments are important in ensuring that that is done in the right way.
I welcome the hon. Member’s intervention, and I am grateful for her and her party’s support for this important amendment.
It is also worth drawing colleagues’ attention to the history of these issues, which have been raised in this place before. We know there was reluctance on the part of Ministers, when the Digital Economy Act 2017 was on the parliamentary agenda, to commence the all-important part 3, which covered many of the provisions now in part 5. Ultimately, the empty promises made by the Minister’s former colleagues have led to record failures, even though the industry, having had years to prepare to implement the policy, is ready. I want to place on record my thanks to campaigning groups such as the Age Verification Providers Association and others, which have shown fierce commitment in getting us this far.
It might help if I cast colleagues’ minds back to the Digital Economy Act 2017, which received Royal Assent in April of that year. Following that, in November 2018, the then Minister of State for Digital and Creative Industries told the Science and Technology Committee that part 3 of the DEA would be in force “by Easter next year”. Then, in December 2018, both Houses of Parliament approved the necessary secondary legislation, the Online Pornography (Commercial Basis) Regulations 2018, and the required statutory guidance.
But shortly after, in April 2019, the first delay arose when the Government published an online press release stating that part 3 of the DEA would not come into force until 15 July 2019. However, June 2019 came around and still there was nothing. On 20 June, the then Under-Secretary of State told the House of Lords that the Government had failed to notify the European Commission of the statutory guidance, which needed to be done, and that that would result in a delay to the commencement of part 3
“in the region of six months”.—[Official Report, House of Lords, 20 June 2019; Vol. 798, c. 883.]
However, on 16 October 2019, the then Secretary of State announced via a written statement to Parliament that the Government
“will not be commencing part 3 of the Digital Economy Act 2017 concerning age verification for online pornography.”—[Official Report, 16 October 2019; Vol. 666, c. 17WS.]
A mere 13 days later, the Government called a snap general election. I am sure those are pretty staggering realities for the Minister to hear—and defend—but I am willing to listen to his defence. It really is not good enough. The industry is ready, the technology has been there for quite some time and, given this Government’s fondness for a U-turn, there are concerns that part 5 of the Bill, which we have spent weeks deliberating, could be abandoned just as part 3 of the DEA was.
The Minister has failed to concede on any of the issues we have raised in Committee. It seems we are dealing with a Government who are ignoring the wide-ranging gaps and issues in the Bill. He has a last-ditch opportunity to at least bring about some positive change, and to signal that he is willing to admit that the legislation as it stands is far from perfect. The provisions in part 5 are critical—they are probably the most important in the entire Bill—so I urge him to work with Labour to make sure they are put to good use within a more than reasonable timeframe.
On the implementation of part 3 of the Digital Economy Act 2017, all the events that the shadow Minister outlined predated my time in the Department. In fact, apart from the last few weeks of the period she talked about, the events predated my time as a Minister in different Departments, and I cannot speak for the actions and words of Ministers prior to my arrival in DCMS. What I can say, and I have said in Committee, is that we are determined to get the Bill through Parliament and implemented as quickly as we can, particularly the bits to do with child safety and the priority illegal content duties.
The shadow Minister commented at the end of her speech that she thought the Government had been ignoring parliamentary opinion. I take slight issue with that, given that we published a draft Bill in May 2021 and went through a huge process of scrutiny, including by the Joint Committee of the Commons and the Lords. We accepted 66 of the Joint Committee’s recommendations and made other very important changes to the Bill, such as addressing fraudulent advertising, which was previously omitted, and bringing commercial pornography into scope, which is critical to protecting children.
The Government have made a huge number of changes to the Bill since it was first drafted. Indeed, we have made further changes while the Bill has been before the Committee, including amending clause 35 to strengthen the fraudulent advertising duties on large search companies. Members of Parliament, such as the right hon. Member for East Ham (Sir Stephen Timms), raised that issue on Second Reading. We listened to what was said at that stage and we made the changes.
There have also been quite a few occasions during these Committee proceedings when I have signalled—sometimes subtly, sometimes less so—that there are areas where further changes might be forthcoming as the Bill proceeds through both Houses of Parliament. I do not think the hon. Member for Pontypridd, or any member of the Committee, should be in any doubt that the Government are very open to making changes to the Bill where we are able to and where they are right. We have done so already and we might do so again in the future.
On the specifics of the amendment, we share the intention to protect children from accessing pornography online as quickly as possible. The amendment seeks to set a three-month timeframe within which part 5 must come into force. However, an important consideration for the commencement of part 5 will be the need to ensure that all kinds of providers of online pornography are treated the same, including those hosting user-generated content, which are subject to the duties of part 3. If we take a piecemeal approach, bringing into force part 5, on commercial pornography, before part 3, on user-to-user pornography, that may enable some of the services, which are quite devious, to simply reconfigure their services to circumvent regulation or cease to be categorised as part 5 services and try to be categorised as part 3 services. We want to do this in a comprehensive way to ensure that no one will be able to wriggle out of the provisions in the Bill.
Parliament has also placed a requirement on Ofcom to produce, consult on and publish guidance for in-scope providers on meeting the duties in part 5. The three-month timescale set out in the amendment would be too quick to enable Ofcom to properly consult on that guidance. It is important that the guidance is right; if it is not, it may be legally challenged or turn out to be ineffective.
I understand the need to get this legislation implemented quickly. I understand the scepticism that flows from the long delays and eventual cancellation of part 3 of the Digital Economy Act 2017. I acknowledge that, and I understand where the sentiment comes from. However, I think we are in a different place today. The provisions in the Bill have been crafted to address some of the concerns that Members had about the previous DEA measures—not least the fact that they are more comprehensive, as they cover user-to-user services, which the DEA did not. There is therefore a clear commitment to getting this done, and getting it done fast. However, we also have to get it done right, and I think the process we have set out does that.
The Ofcom road map is expected before the summer. I hope that will give further reassurance to the Committee and to Parliament about the speed with which these things can get implemented. I share Members’ sentiments about needing to get this done quickly, but I do not think it is practical or right to do it in the way set out in amendment 49.
I am grateful for the Minister’s comments. However, I respectfully disagree, given the delays already since 2017. The industry is ready for this. The providers of the age verification services are ready for this. We believe that three months is an adequate timeframe, and it is vital that we get this done as quickly as possible. With that in mind, I will be pushing amendment 49 to a vote.
Question put, That the amendment be made.
This very important and concise clause sets out that the Bill, when passed, will be cited as the Online Safety Act 2022—which I hope is prophetic when it comes to the lightning speed of its passage through the House of Lords.
Question put and agreed to.
Clause 194 accordingly ordered to stand part of the Bill.
New Clause 35
Offence under the Obscene Publications Act 1959: OFCOM defence
“(1) Section 2 of the Obscene Publications Act 1959 (prohibition of publication of obscene matter) is amended in accordance with subsections (2) and (3).
(2) After subsection (5) insert—
‘(5A) A person shall not be convicted of an offence against this section of the publication of an obscene article if the person proves that—
(a) at the time of the offence charged, the person was a member of OFCOM, employed or engaged by OFCOM, or assisting OFCOM in the exercise of any of their online safety functions (within the meaning of section 188 of the Online Safety Act 2022), and
(b) the person published the article for the purposes of OFCOM’s exercise of any of those functions.’
(3) In subsection (7)—
(a) the words after ‘In this section’ become paragraph (a), and
(b) at the end of that paragraph, insert ‘;
(b) “OFCOM” means the Office of Communications.’”—(Chris Philp.)
This new clause (to be inserted after clause 171) amends section 2 of the Obscene Publications Act 1959 to create a defence for OFCOM and their employees etc to the offence of the publication of an obscene article.
Brought up, read the First and Second time, and added to the Bill.
New Clause 42
Recovery of OFCOM’s initial costs
“Schedule (Recovery of OFCOM’s initial costs) makes provision about fees chargeable to providers of regulated services in connection with OFCOM’s recovery of costs incurred on preparations for the exercise of their online safety functions.”—(Chris Philp.)
This new clause introduces NS2.
Brought up, and read the First time.
With this it will be convenient to discuss Government new clause 43 and Government new schedule 2.
New clause 42 introduces new schedule 2, and new clause 43 provides that the additional fees charged to providers under new schedule 2 must be paid into the Consolidated Fund. We discussed that a few days ago; that is where the fees are currently destined, and I owe my right hon. Friend the Member for Basingstoke some commentary on this topic in due course. The Bill already provided that monetary penalties must be paid into the Consolidated Fund; those provisions are now placed in this clause.
New schedule 2, which is quite detailed, makes provision in connection with Ofcom’s ability to recover its initial costs, which we have previously debated. As discussed, it is important not only that the taxpayer is protected from the ongoing costs but that the set-up costs are recovered. The taxpayer should not have to pay for the regulatory framework; the people who are being regulated should pay, whether the costs are incurred before or after commencement, in line with the “polluter pays” principle. Deep in new schedule 2 is the answer to the question that the hon. Member for Aberdeen North asked a day or two ago about the period over which set-up costs can be recovered: that period is specified as between three and five years. I hope that provides an introduction to the new clauses and the new schedule.
We welcome this grouping, which includes two new clauses and a new schedule. Labour has raised concerns about the future funding of Ofcom more widely, specifically when we discussed clause 42. The Minister’s response did little to alleviate our concerns about Ofcom’s future ability to raise the funds it needs to maintain its position as the regulator. Despite that, we welcome the grouping, particularly the provisions in the new schedule, which will require Ofcom to seek to recover the costs it has incurred in preparing to take on its functions as the regulator of services under the Bill by charging fees to providers. This is an important step, which we see as broadly in line with the mechanisms already in place for other, similar regulatory regimes.
Ultimately, it is right that fees charged to providers under new schedule 2 must be paid into the Consolidated Fund, and it is important that Ofcom can recover its costs before a full fee structure and governance process is established. However, I have some questions for the Minister. How many people has Ofcom hired into roles, and can any of those costs count towards the calculation of fees? We want to ensure that other areas of regulation do not lose out as a consequence. Broadly speaking, though, we are happy to support the grouping and have not sought to table amendments at this stage.
So far as I am aware, all the costs incurred by Ofcom in relation to the duties in the Bill can be recouped by way of fees. If that is not correct, I will write to the hon. Lady saying so, but my understanding is that any relevant Ofcom cost will be in the scope of the fees.
Question put and agreed to.
New clause 42 accordingly read a Second time, and added to the Bill.
New Clause 43
Payment of sums into the Consolidated Fund
“(1) Section 400 of the Communications Act (destination of penalties etc) is amended as follows.
(2) In subsection (1), after paragraph (i) insert—
‘(j) an amount paid to OFCOM in respect of a penalty imposed by them under Chapter 6 of Part 7 of the Online Safety Act 2022;
(k) an amount paid to OFCOM in respect of an additional fee charged under Schedule (Recovery of OFCOM’s initial costs) to the Online Safety Act 2022.’
(3) In subsection (2), after ‘applies’ insert ‘(except an amount mentioned in subsection (1)(j) or (k))’.
(4) After subsection (3) insert—
‘(3A) Where OFCOM receive an amount mentioned in subsection (1)(j) or (k), it must be paid into the Consolidated Fund of the United Kingdom.’
(5) In the heading, omit ‘licence’.”—(Chris Philp.)
This new clause provides that additional fees charged to providers under NS2 must be paid into the Consolidated Fund. The Bill already provided that monetary penalties must be paid into the Consolidated Fund, and those provisions are now placed in this clause.
Brought up, read the First and Second time, and added to the Bill.
New Clause 3
Establishment of Advocacy Body
“(1) There is to be a body corporate (‘the Advocacy Body’) to represent the interests of child users of regulated services.
(2) A ‘child user’—
(a) means any person aged 17 years or under who uses or is likely to use regulated internet services; and
(b) includes both any existing child user and any future child user.
(3) The work of the Advocacy Body may include—
(a) representing the interests of child users;
(b) the protection and promotion of these interests;
(c) any other matter connected with those interests.
(4) The ‘interests of child users’ means the interest of children in relation to the discharge by any regulated company of its duties under this Act, including—
(a) safety duties about illegal content, in particular CSEA content;
(b) safety duties protecting children;
(c) ‘enforceable requirements’ relating to children.
(5) The Advocacy Body must have particular regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010.
(6) The Advocacy Body will be defined as a statutory consultee for OFCOM’s regulatory decisions which impact upon the interests of children.
(7) The Secretary of State may appoint an organisation known to represent children to be designated the functions under this Act, or may create an organisation to carry out the designated functions.”—(Barbara Keeley.)
This new clause creates a new advocacy body for child users of regulated internet services.
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.
New clause 3 would make provision for a statutory user advocacy body representing the interests of children. It would also allow the Secretary of State to appoint a new or existing body as the statutory user advocate. A strong, authoritative and well-resourced voice that can speak for children in regulatory debates would ensure that complex safeguarding issues are well understood, and would also actively inform the regulator’s decisions.
Charities have highlighted that the complaints and reporting mechanisms in the Bill may not always be appropriate for children. Ofcom’s own evidence shows that only 14% of 12 to 15-year-old children have ever reported content. Children who are most at risk of online harms may find it incredibly challenging to complete a multi-stage reporting and complaints process. Dame Rachel de Souza told the Committee:
“I worry that the Bill does not do enough to respond to individual cases of abuse and that it needs to do more to understand issues and concerns directly from children. Children should not have to exhaust the platforms’ ineffective complaints routes, which can take days, weeks or even months. I have just conducted a survey of 2,000 children and asked them about their experiences in the past month. Of those 2,000 children, 50% had seen harmful content and 40% had tried to get content about themselves removed and had not succeeded. For me, there is something really important about listening to children and taking their complaints into account.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]
A children’s advocacy body would be able to support children in navigating redress mechanisms that are fundamentally targeted at adults. Given how many children now use the internet, that is an essential element that is missing from the Bill. That is why the super-complaints mechanism needs to be strengthened with specific arrangements for children, as advocated by the National Society for the Prevention of Cruelty to Children and other children’s organisations. A statutory user advocacy body could support the regulator as well as child users. It would actively promote the interests of children in regulatory decision making and ensure that an understanding of children’s behaviour and safeguarding is front and centre in its approach.
Let me start by saying that this Bill, as drafted, rightly contains incredibly strong protections for children. The children’s safety duties that we have already debated are extremely strong. They apply to any platform with significant numbers of child users and impose a duty on such companies to protect children from harm. The priority illegal safety duties relating to child sexual exploitation and abuse offences are listed in schedule 6—they have their very own schedule because we attach such importance to them. Committee members should be in no doubt that protecting children is at the very heart of the Bill. I hope that has been obvious from the debates we have had.
On children’s ability to raise complaints and seek redress under the Bill, it is worth reminding ourselves of a couple of clauses that we have debated previously, through which we are trying to make sure it is as easy as possible for children to report problematic content or to raise complaints. Members will recall that we debated clause 17. Clause 17(6)(c) allows for
“a parent of, or other adult with responsibility for, a child”
to raise content-reporting claims with providers, so that children are not left on their own. We have also made clear, in the complaints procedures set out in clause 18(2)(c), that those procedures must be
“easy to access, easy to use (including by children)”.
That is an explicit reference to accessibility for children.
The hon. Member for Aberdeen North has also already referred to the fact that in both the children’s risk assessment duties and the adults’ risk assessment duties, people’s characteristics, including whether they are a member of a particular group, have to be taken into account. The children’s risk assessment duties are set out in clause 10(6)(d). Children with particular characteristics—orientation, race and so on—have to be particularly considered. The fact that a clause on the children’s risk assessment duties even exists shows that specific and special consideration has to be given to children and the risks they face. That is hardwired into the architecture of the Bill.
All the provisions I have just mentioned—from clause 10, on children’s risk assessment duties, right through to the priority offences in schedule 6, on child sexual exploitation and abuse offences—show that the protection of children is integral to the whole Bill.
On the consultation that happened in forming and framing the Bill, really extensive engagement and consultation took place throughout the preparation of this piece of legislation, including direct consultation with children themselves, their parents and the many advocacy groups for children. There should be no doubt at all that children have been thoroughly consulted as the Bill has been prepared.
On the specifics of new clause 3, which relate to advocacy for children, as the hon. Member for Aberdeen North referred to in passing a moment ago, there is a mechanism in clause 140 for organisations that represent particular groups, such as children, to raise super-complaints with Ofcom when there is a problem. In fact, when we debated that clause, I used children as an example when I spoke about the “eligible entities” that can raise super-complaints—I used the NSPCC speaking for children as a specific example of the organisations I would expect the term “eligible entity” to include. Clause 140 explicitly empowers organisations such as the NSPCC and others to speak for children.
I agree wholeheartedly about the importance of the role of the Children’s Commissioner and she does a fantastic job, but is it not testament to the fact that there is a need for this advocacy body that she is advocating for it and thinks it is a really good idea? The Children Act 2004 is a fantastic Act, but that was nearly 20 years ago and the world has changed significantly since then. The Bill shows that. The fact that she is advocating for it may suggest that she sees the need for a separate entity.
There is a danger if we over-create statutory bodies with overlapping responsibilities. I just read out the current statutory functions of the Children’s Commissioner under the 2004 Act. If we were to agree to the new clause, we would basically be creating a second statutory advocate or body with duties that are the same as some of those that the Children’s Commissioner already exercises. I read from section 2 of the Act, where those duties are set out. I do not think that having two people with conflicting or competing duties would be particularly helpful.
I am grateful to the Minister for his support for Labour legislation. Does he acknowledge that we have different Children’s Commissioners across the nations of the UK? Each would have the same right to advocate for children, so we would have four commissioners, rather than the single body focused on this specific issue that the Children’s Commissioners across the UK are themselves advocating for.
I do not have the relevant devolved legislation in front of me—only the Children Act 2004—but I assume it is broadly similar. The hon. Member for Aberdeen North can correct me if I am wrong—[Interruption.] She is not sure, so I do not feel too bad about not being sure either. I imagine it is similar. I am not sure that having similar statutory bodies with the same function—we would create another with the new clause—is necessarily helpful.
The Bill sets out formal processes that allow other organisations, such as the NSPCC, to raise complaints that have to be dealt with. That ensures that the voices of groups—including children, but not just children—will be heard. I suspect that if we had a children’s advocacy body, other groups would want one too and might feel that they had been overlooked.
The good thing about the way the super-complaint structure in clause 140 works is that it does not prescribe what the groups are. Although I am sure that children will be top of the list, there will be other groups that want to advocate and to be able to bring super-complaints. I imagine that women’s groups will be on that list, along with groups advocating for minorities and people with various sexual orientations. Clause 140 is not exclusive; it allows all these groups to have a voice that must be heard. That is why it is so effective.
My right hon. Friend the Member for Basingstoke and the hon. Member for Batley and Spen asked whether the groups have enough resources to advocate on issues under the super-complaint process. That is a fair question. The allocation of funding to different groups tends to be done via the spending review process. Colleagues in other Departments—the Department for Education or, in the case of victims, the Ministry of Justice—allocate quite a lot of money to third-sector groups. The victims budget was approximately £200 million a year or two ago, and I am told it has risen to £300 million for the current financial year. That is the sort of funding that can find its way into the hands of the organisations that advocate for particular groups of victims. My right hon. Friend asked whether the proceeds of fines could be applied to fund such work, and I have undertaken to raise that with the Treasury.
We already have statutory advocates for children: the four Children’s Commissioners for the four parts of the United Kingdom. We have the super-complaints process, which covers more than children’s groups, crucial though they are. We have given Ofcom statutory duties to consult when developing its codes of practice, and we have money flowing, via the Ministry of Justice, the DFE and others, into advocacy groups. Although we agree with the intention behind new clause 3, we believe its objectives are very well covered by the mechanisms that I have just set out at some length.
There have not been all that many times during the debate on the Bill when the Minister has so spectacularly missed the point as he has on this section. I understand everything he said about provisions already being in place to protect to children and the provisions regarding the super-complaints, but the new clause is not intended to be a replacement for the super-complaints procedure, which we all support—in fact, we have tried to strengthen that procedure. The new clause is intended to be an addition—another, very important layer.
Unfortunately, I do not have at the front of my mind the legislation that set up the Children’s Commissioner for Scotland, or the one for England. The Minister talked through some of the provisions and phrasing in the Children Act 2004. He said that the role of the Children’s Commissioner for England is to encourage bodies to act positively on behalf of children—to encourage. There is no requirement for the body to act in the way the Children’s Commissioner says it should act. Changes have been made in Wales establishing the Future Generations Commissioner, who has far more power.
As far as I can tell, the user advocacy body proposed in new clause 3 would not have the ability to compel Ofcom either.
But it would be a statutory consultee that is specifically mentioned in this provision. I cannot find in the Bill a provision giving Ofcom a statutory duty to consult the four Children’s Commissioners. The new clause would make the children’s advocacy body a statutory consultee in decisions that affect children.
The Bill will require Ofcom to consult people who represent the interests of children when developing the relevant codes of practice. Although they are not named, it would be astonishing if the four Children’s Commissioners were not the first people on that list. The statutory obligation to consult those groups when developing codes of practice and, indeed, guidance is set out in clauses 37(6)(d) and 69(3)(d).
That is very helpful, but there are still shortcomings in what the Minister says. The Bill, as drafted, requires Ofcom to require things of other organisations. Some of the detail is in the Bill, some of the detail will come in secondary legislation and some of the detail will come in the codes of practice published by Ofcom. We broadly agree that the Bill will ensure people are safer on the internet than they currently are, but we do not have all the detail on the Government’s intent. We would like more detail on some things, but we are not saying, “We need every little bit of detail.” If we did, the Bill would not be future-proof. We would not be able to change and update the Bill if we required everything to be in the Bill.
The Bill is not a one-off; it will continually change and grow. Having a user advocacy body would mean that emerging threats can quickly be brought to Ofcom’s attention. Unlike the Children’s Commissioners, who have a hundred other things to do, the entire purpose of this body would be to advocate on behalf of children online. The Children’s Commissioners do an amazing job, but this is not their No. 1 priority. If the Minister wants this to be a world-leading Bill, its No. 1 priority should be to protect the human rights of children.
I think the hon. Lady is being a little unfair to the Children’s Commissioners. Dame Rachel de Souza is doing a fantastic job of advocating specifically in the digital sphere. She really is doing a fantastic job, and I say that as a Minister. I would not say she is leaving any gaps.
These digital children’s safety issues link to wider children’s safety issues that exist offline, such as sexual exploitation, grooming and so on, so it is useful that the same person advocates for children in both the offline and online worlds.
The new clause asks for an additional body. It is not saying the Children’s Commissioners should be done away with. The Children’s Commissioners do an amazing job, as we have recognised, but the No. 1 priority, certainly for the Children’s Commissioner in Scotland, is to protect the human rights of children; it is not to protect children online, which is what the user advocacy body would do. That body would give the benefit of its experience, and dedicate its resources, time and energy, specifically to advocating between Ofcom, children and children’s organisations and groups.
The Minister is right that the Bill takes massive steps forward in protecting children online, and he is right that the Children’s Commissioners do a very good job. The work done by the Children’s Commissioners in giving us evidence on behalf of children and children’s organisations has been incredibly powerful and incredibly helpful, but there is still a layer missing. If this Bill is to be future-proof, if it is to work and if it is not to put an undue burden on charitable organisations, we need a user advocacy body. The Minister needs to consider that.
I appreciate that the Government provide money to victim support organisations, which is great, but I am also making a case about potential victims. If the money goes only to those who support people who have already been harmed, it will not allow them to advocate to ensure that more people are not harmed. It will allow them to advocate on behalf of those who have been harmed—absolutely—but it will not effectively tackle potential and emerging harms. That is a key place where the Bill misses out. I am quite disappointed that the Minister has not recognised that something may be lacking and is so keen to defend his position, because it seems to me that the position of the Opposition is so obviously the right one.
(2 years, 4 months ago)
Commons Chamber
I beg to move, That the clause be read a Second time.
Thank you, Mr Speaker. I am honoured to have been appointed the Minister responsible for the Online Safety Bill. Having worked on these issues for a number of years, I am well aware of the urgency and importance of this legislation, in particular to protect children and tackle criminal activity online—that is why we are discussing this legislation.
On the point of order from my right hon. Friend the Member for Haltemprice and Howden (Mr Davis), I have the greatest respect for him and his standing in this House, but it feels like we have been discussing this Bill for at least five years. We have had a Green Paper and a White Paper. We had a pre-legislative scrutiny process, which I was honoured to be asked to chair. We have had reports from the Digital, Culture, Media and Sport Committee and from other Select Committees and all-party parliamentary groups of this House. This legislation does not want for scrutiny.
We have also had a highly collaborative and iterative process in the discussion of the Bill. We have had 66 Government acceptances of recommendations made by the Joint Committee on the draft Online Safety Bill. We have had Government amendments in Committee. We are discussing Government amendments today, and we have Government commitments to table amendments in the House of Lords. The Bill has received a huge amount of consultation. It is highly important legislation, and the victims of online crime, online fraud, bullying and harassment want to see us get the Bill into the Lords and on to the statute book as quickly as possible.
I warmly welcome my hon. Friend to his position. He will understand that those of us who have followed the Bill in some detail since its inception had some nervousness as to who might be standing at that Dispatch Box today, but we could not be more relieved that it is him. May I pick up on his point about the point of order from our right hon. Friend the Member for Haltemprice and Howden (Mr Davis)? Does he agree that an additional point to add to his list is that, unusually, this legislation has a remarkable amount of cross-party consensus behind its principles? That distinguishes it from some of the other legislation that perhaps we should not consider in these two weeks. I accept there is plenty of detail to be examined but, in principle, this Bill has a lot of support in this place.
I completely agree with my right hon. and learned Friend. That is why the Bill passed Second Reading without a Division and the Joint Committee produced a unanimous report. I am happy for Members to cast me in the role of poacher turned gamekeeper on the Bill, but looking around the House, there are plenty of gamekeepers turned poachers here today who will ensure we have a lively debate.
Exactly. The concept at the heart of this legislation is simple. Tech companies, like those in every other sector, must take appropriate responsibility for the consequences of their business decisions. As they continue to offer their users the latest innovations that enrich our lives, they must consider safety as well as profit. They must treat their users fairly and ensure that the internet remains a place for robust debate. The Bill has benefited from input and scrutiny from right across the House. I pay tribute to my predecessor, my hon. Friend the Member for Croydon South (Chris Philp), who has worked tirelessly on the Bill, not least through 50 hours of Public Bill Committee, and the Bill is better for his input and work.
We have also listened to the work of other Members of the House, including my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), the right hon. Member for Barking (Dame Margaret Hodge), my right hon. Friend the Member for Haltemprice and Howden and the Chair of the Select Committee, my hon. Friend the Member for Solihull (Julian Knight), who have all made important contributions to the discussion of the Bill.
We have also listened to those concerned about freedom of expression online. It is worth pausing on that, as there has been a lot of discussion about whether the Bill is censoring legal speech online and much understandable outrage from those who think it is. I asked the same questions when I chaired the Joint Committee on the Bill. This debate does not reflect the actual text of the Bill itself. The Bill does not require platforms to restrict legal speech—let us be absolutely clear about that. It does not give the Government, Ofcom or tech platforms the power to make something illegal online that is legal offline. In fact, if those concerned about the Bill studied it in detail, they would realise that the Bill protects freedom of speech. In particular, the Bill will temper the huge power over public discourse wielded by the big tech companies behind closed doors in California. They are unaccountable for the decisions they make on censoring free speech on a daily basis. Their decisions about what content is allowed will finally be subject to proper transparency requirements.
My hon. Friend did not have the joy of being on the Bill Committee, as I did with my hon. Friend the Member for Croydon South (Chris Philp), who was the Minister at that point. The point that my hon. Friend has just made about free speech is so important for women and girls who are not able to go online because of the violent abuse that they receive, and that has to be taken into account by those who seek to criticise the Bill. We have to make sure that people who currently feel silenced do not feel silenced in future and can participate online in the way that they should be able to do. My hon. Friend is making an excellent point and I welcome him to his position.
My right hon. Friend is entirely right on that point. The structure of the Bill is very simple. There is a legal priority of harms, and things that are illegal offline will be regulated online at the level of the criminal threshold. There are protections for freedom of speech and there is proper transparency about harmful content, which I will come on to address.
Does the Minister agree that, in moderating content, category 1 service providers such as Twitter should be bound by the duties under our domestic law not to discriminate against anyone on the grounds of a protected characteristic? Will he take a look at the amendments I have brought forward today on that point, which I had the opportunity of discussing with his predecessor, who I think was sympathetic?
The hon. and learned Lady makes a very important point. The legislation sets regulatory thresholds at the criminal law level, based on existing offences in law. Many of the points she made are covered by existing public order offences, particularly those concerning discrimination against people based on their protected characteristics. As she well knows, the internet is a reserved matter, so the legal threshold is set at where UK law stands, but where the law differs in Scotland, the police authorities in Scotland can still take action against individuals in breach of the law.
The difficulty is that Twitter claims it is not covered by the Equality Act 2010. I have seen legal correspondence to that effect. I am not talking about the criminal law here. I am talking about Twitter’s duty not to discriminate against women, for example, or those who hold gender critical beliefs in its moderation of content. That is the purpose of my amendment today—it would ensure that Twitter and other service providers providing a service in the United Kingdom abide by our domestic law. It is not really a reserved or devolved matter.
The hon. and learned Lady is right. There are priority offences where the companies, regardless of their terms of service, have to meet their obligations. If something is illegal offline, it is illegal online as well. There are priority areas where a company must proactively look for such content. There are also non-priority areas where a company should take action against anything that is an offence in law and meets the criminal threshold online. The job of the regulator is to hold companies to account for that. As category 1 companies, they also have to be transparent in their terms of service. If they have clear policies against discrimination, which on the whole they all do, they will have to set out what they would do, and the regulator can hold them to account to make sure they do what they say. The regulator cannot make them take down speech that is legal or below a criminal threshold, but it can hold them to account publicly for the decisions they make.
One of the most important aspects of this Bill with regard to the category 1 companies is transparency. At the moment, the platforms make decisions about curating their content—who to take down, who to suppress, who to leave up—but those are their decisions. There is no external scrutiny of what they do or even whether they do what they say they will do. As a point of basic consumer protection law, if companies say in their terms of service that they will do something, they should be held to account for it. What is put on the label also needs to be in the tin and that is what the Bill will do for the internet.
I now want to talk about journalism and the role of the news media in the online world, which is a very important part of this Bill. The Government are committed to defending the invaluable role of a free media. Online safety legislation must protect the vital role of the press in providing people with reliable and accurate sources of information. Companies must therefore put in place protections for journalistic content. User-to-user services will not have to apply their safety duties in part 3 of the Bill to news publishers’ content shared on their services. News publishers’ content on their own sites will also not be in scope of regulation.
I welcome the Minister to his position, and it is wonderful to have somebody else who—like the previous Minister, the hon. Member for Croydon South (Chris Philp)—knows what he is talking about. On this issue, which is pretty key, I think it would work if minimum standards were set on the risk assessments that platforms have to make to judge what is legal but harmful content, but at the moment such minimum standards are not in the Bill. Could the Minister comment on that? Otherwise, there is a danger that platforms will set a risk assessment that allows really vile harmful but legal content to carry on appearing on their platform.
The right hon. Lady makes a very important point. There have to be minimum safety standards, and I think that was also reflected in the report of the Joint Committee, which I chaired. Those minimum standards are set at the level of the criminal law, for the priority illegal offences. A company may have higher terms of service—it may operate at a higher level—in which case it will be judged on the operation of its terms of service. However, for priority illegal content, it cannot have a code of practice that is below the legal threshold, and it would be in breach of the provisions if it did. For the priority illegal offences, the minimum threshold is set by the law.
I understand that in relation to illegal harmful content, but I am talking about legal but harmful content. I understand that the Joint Committee that the hon. Member chaired recommended that for legal but harmful content, there should be minimum standards against which the platforms would be judged. I may have missed it, but I cannot see that in the Bill.
The Joint Committee’s recommendation was for a restructuring of the Bill, so that rather than having general duty of care responsibilities that were not defined, we defined those responsibilities based on existing areas of law. The core principle behind the Bill is to take things that are illegal offline, and to regulate such things online based on the legal threshold. That is what the Bill does.
In schedule 7, which did not exist in the draft phase, we have written into the Bill a long list of offences in law. I expect that, as this regime is created, the House will insert more regulations and laws into schedule 7 as priority offences in law. Even if an offence in law is not listed in the priority illegal harms schedule, it can still be a non-priority harm, meaning that even if a company does not have to look for evidence of that offence proactively, it still has to act if it is made aware of the offence. I think the law gives us a very wide range of offences, clearly defined against offences in law, where there are clearly understood legal thresholds.
The question is: what is to be done about other content that may be harmful but sits below the threshold? The Government have made it clear that we intend to bring forward amendments that set out clear priorities for companies on the reporting of such harmful content, where we expect the companies to set out what their policies are. That will include setting out clearly their policies on things such as online abuse and harassment, the circulation of real or manufactured intimate images, content promoting self-harm, content promoting eating disorders or legal suicide content—this is content relating to adults—so the companies will have to be transparent on that point.
I congratulate the Minister on his appointment, and I look forward to supporting him in his role as he previously supported me in mine. I think he made an important point a minute ago about content that is legal but considered to be harmful. It has been widely misreported in the press that this Bill censors or prohibits such content. As the Minister said a moment ago, it does no such thing. There is no requirement on platforms to censor or remove content that is legal, and amendment 71 to clause 13 makes that expressly clear. Does he agree that reports suggesting that the Bill mandates censorship of legal content are completely inaccurate?
I am grateful to my hon. Friend, and as I said earlier, he is absolutely right. There is no requirement for platforms to take down legal speech, and they cannot be directed to do so. What we have is a transparency requirement to set out their policies, with particular regard to some of the offences I mentioned earlier, and a wide schedule of things that are offences in law that are enforced through the Bill itself. This is a very important distinction to make. I said to him on Second Reading that I thought the general term “legal but harmful” had added a lot of confusion to the way the Bill was perceived, because it created the impression that the removal of legal speech could be required by order of the regulator, and that is not the case.
I congratulate the Minister on his promotion and on his excellent chairmanship of the prelegislative scrutiny Committee, which I also served on. Is he satisfied with the Bill in relation to disinformation? It was concerning that there was only one clause on disinformation, and we know the impact—particularly the democratic impact—that that has on our society at large. Is he satisfied that the Bill will address that?
It was a pleasure to serve alongside the hon. Lady on the Joint Committee. There are clear new offences relating to knowingly false information that will cause harm. As she will know, that was a Law Commission recommendation; it was not in the draft Bill but it is now in the Bill. The Government have also said that as a consequence of the new National Security Bill, which is going through Parliament, we will bring in a new priority offence relating to disinformation spread by hostile foreign states. As she knows, one of the most common areas for organised disinformation has been at state level. As a consequence of the new national security legislation, that will also be reflected in schedule 7 of this Bill, and that is a welcome change.
The Bill requires all services to take robust action to tackle the spread of illegal content and activity. Providers must proactively reduce the risk on their services of illegal activity and the sharing of illegal content, and they must identify and remove illegal content once it appears on their services. That is a proactive responsibility. We have tabled several interrelated amendments to reinforce the principle that companies must take a safety-by-design approach to managing the risk of illegal content and activity on their services. These amendments require platforms to assess the risk of their services being used to commit, or to facilitate the commission of, a priority offence and then to design and operate their services to mitigate that risk. This will ensure that companies put in place preventive measures to mitigate a broad spectrum of factors that enable illegal activity, rather than focusing solely on the removal of illegal content once it appears.
I congratulate my hon. Friend on his appointment to his position. On harmful content, there are all too many appalling examples of animal abuse on the internet. What are the Government’s thoughts on how we can mitigate such harmful content, which is facilitating wildlife crime? Might similar online protections be provided for animals to the ones that clause 53 sets out for children?
My hon. Friend raises an important point that deserves further consideration as the Bill progresses through its parliamentary stages. There is, of course, still a general presumption that any activity that is illegal offline and could also be carried out online—for example, promoting or sharing content that could incite people to commit violent acts—is within the scope of the legislation. There are some priority illegal offences, which are set out in schedule 7, but the non-priority offences also apply if a company is made aware of content that is likely to be in breach of the law. I certainly think this is worth considering in that context.
In addition, the Bill makes it clear that platforms have duties to mitigate the risk of their service facilitating an offence, including where that offence may occur on another site, such as can occur in cross-platform child sexual exploitation and abuse—CSEA—offending, or even offline. This addresses concerns raised by a wide coalition of children’s charities that the Bill did not adequately tackle activities such as breadcrumbing—an issue my hon. Friend the Member for Solihull (Julian Knight), the Chair of the Select Committee, has raised in the House before—where CSEA offenders post content on one platform that leads to offences taking place on a different platform.
We have also tabled new clause 14 and a related series of amendments in order to provide greater clarity about how in-scope services should determine whether they have duties with regard to content on their services. The new regulatory framework requires service providers to put in place effective and proportionate systems and processes to improve user safety while upholding free expression and privacy online. The systems and processes that companies implement will be tailored to the specific risk profile of the service. However, in many cases the effectiveness of companies’ safety measures will depend on them making reasonable judgments about types of content. Therefore, it is essential to the effective functioning of the framework that there is clarity about how providers should approach these judgments. In particular, such clarity will safeguard against companies over-removing innocuous content if they wrongly assume mental elements are present, or under-removing content if they act only where all elements of an offence are established beyond reasonable doubt. The amendments make clear that companies must consider all reasonably available contextual information when determining whether content is illegal content, a fraudulent advert, content that is harmful to children, or content that is harmful to adults.
I was on the Bill Committee and we discussed lots of things, but new clause 14 was not discussed: we did not have conversations about it, and external organisations have not been consulted on it. Is the Minister not concerned that this is a major change to the Bill and it has not been adequately consulted on?
As I said earlier, in establishing the threshold for priority illegal offences, the current threshold of laws that exist offline should provide good guidance. I would expect that, as the codes of practice are developed, we will be able to make clear what those offences are. On the racial hatred that the England footballers received after the European championship football final, people have been prosecuted for what they posted on Twitter and other social media platforms. We know what race hate looks like in that context, we know what the regulatory threshold should look like and we know the sort of content we are trying to regulate. I expect that, in the codes of practice, Ofcom can be very clear with companies about what we expect, where the thresholds are and where we expect them to take enforcement action.
I congratulate my hon. Friend on taking his new position; we rarely have a new Minister so capable of hitting the ground running. He makes a crucial point about clearness and transparency for both users and the social media providers and other platforms, because it is important that we make sure they are 100% clear about what is expected of them and the penalties for not fulfilling their commitments. Does he agree that opaqueness—a veil of secrecy—has been one of the obstacles, and that a whole raft of content has been taken down for the wrong reasons while other content has been left to proliferate because of the lack of clarity?
That is entirely right, and in closing I say that the Bill does what we have always asked for it to do: it gives absolute clarity that illegal things offline must be illegal online as well, and be regulated online. It establishes clear responsibilities and liabilities for the platforms to do that proactively. It enables a regulator to hold the platforms to account on their ability to tackle those priority illegal harms and provide transparency on other areas of harmful content. At present we simply do not know about the policy decisions that companies choose to make: we have no say in them; they are not transparent; and we do not know whether companies enforce them. The Bill will deliver in those important regards. If we are serious about tackling issues such as fraud and abuse online, and other criminal offences, we require a regulatory system to do that and proper legal accountability and liability for the companies. That is what the Bill and the further amendments deliver.
It is an honour to respond on the first group of amendments on behalf of the Opposition.
For those of us who have been working on this Bill for some time now, it has been extremely frustrating to see the Government take such a siloed approach in navigating this complex legislation. I remind colleagues that in Committee Labour tabled a number of hugely important amendments that sought to make the online space safer for us all, but the Government responded by voting against each and every one of them. I certainly hope the new Minister—I very much welcome him to his post—has a more open-minded approach than his predecessor and indeed the Secretary of State; I look forward to what I hope will be a more collaborative approach to getting this legislation right.
With that in mind, it must be said that time and again this Government claim that the legislation is world-leading but that is far from the truth. Instead, once again the Government have proposed hugely significant and contentious amendments only after line-by-line scrutiny in Committee; it is not the first time this has happened in this Parliament, and it is extremely frustrating for those of us who have debated this Bill for more than 50 hours over the past month.
I will begin by touching on Labour’s broader concerns around the Bill. As the Minister will be aware, we believe that the Government have made a fundamental mistake in their approach to categorisation, which undermines the very structure of the Bill. We are not alone in this view and have the backing of many advocacy and campaign groups including the Carnegie UK Trust, Hope Not Hate and the Antisemitism Policy Trust. Categorisation of services based on size rather than risk of harm will mean that the Bill will fail to address some of the most extreme harms on the internet.
We all know that smaller platforms such as 4chan and BitChute have significant numbers of users who are highly motivated to promote very dangerous content. Their aim is to promote radicalisation and to spread hate and harm.
My hon. Friend is absolutely right, and has touched on elements that I will address later in my speech. I will look at cross-platform harm and breadcrumbing; the Government have taken action to address that issue, but they need to go further.
I am sorry to intervene so early in the hon. Lady’s speech, and thank her for her kind words. I personally agree that the question of categorisation needs to be looked at again, and the Government have agreed to do so. We will hopefully discuss it next week during consideration of the third group of amendments.
I welcome the Minister’s commitment, which is something that the previous Minister, the hon. Member for Croydon South (Chris Philp), also committed to in Committee. However, it should have been in the Bill to begin with, or been tabled as an amendment today so that we could discuss it on the Floor of the House. We should not have to wait until the Bill goes to the other place to discuss this fundamental, important point that I know colleagues on the Minister’s own Back Benches have been calling for. Here we are, weeks down the line, with nothing having been done to fix that problem, which we know will be a persistent problem unless action is taken. It is beyond frustrating that no indication was given in Committee of these changes, because they have wide-ranging consequences for the effects of the Bill. Clearly, the Government are distracted with other matters, but I remind the Minister that Labour has long called for a safer internet, and we are keen to get the Bill right.
Let us start with new clause 14, which provides clarification about how online services should determine whether content should be considered illegal, and therefore how the illegal safety duty should apply. The new clause is deeply problematic, and is likely to reduce significantly the amount of illegal content and fraudulent advertising that is correctly identified and acted on. First, companies will be expected to determine whether content is illegal or a fraudulent advert based on information that is
“reasonably available to a provider”,
with reasonableness determined in part by the size and capacity of the provider. That entrenches the problems I have outlined with smaller, high-risk companies being subject to fewer duties despite the acute risks they pose. Having less onerous applications of the illegal safety duties will encourage malign actors to migrate illegal activity on to smaller sites that have less pronounced regulatory expectations placed on them. That has particularly concerning ramifications for children’s protections, which I will come on to shortly. On the other end of the scale, larger sites could use new clause 14 to argue that their size and capacity, and the corresponding volumes of material they are moderating, make it impractical for them reliably and consistently to identify illegal content.
The second problem arises from the fact that the platforms will need to have
“reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied”.
That significantly raises the threshold at which companies are likely to determine that content is illegal. In practice, companies have routinely failed to remove content where there is clear evidence of illegal intent. That has been the case in instances of child abuse breadcrumbing, where platforms use their own definitions of what constitutes a child abuse image for moderation purposes. Charities believe it is inevitable that companies will look to use this clause to minimise their regulatory obligations to act.
Finally, new clause 14 and its resulting amendments do not appear to be adequately future-proofed. The new clause sets out that judgments should be made
“on the basis of all relevant information that is reasonably available to a provider.”
However, on Meta’s first metaverse device, the Oculus Quest product, that company records only two minutes of footage on a rolling basis. That makes it virtually impossible to detect evidence of grooming, and companies can therefore argue that they cannot detect illegal content because the information is not reasonably available to them. The new clause undermines and weakens the safety mechanisms that the Minister, his team, the previous Minister, and all members of the Joint Committee and the Public Bill Committee have worked so hard to get right. I urge the Minister to reconsider these amendments and withdraw them.
I will now move on to improving the children’s protection measures in the Bill. In Committee, it was clear that one thing we all agreed on, cross-party and across the House, was trying to get the Bill to work for children. With colleagues in the Scottish National party, Labour Members tabled many amendments and new clauses in an attempt to achieve that goal. However, despite their having the backing of numerous children’s charities, including the National Society for the Prevention of Cruelty to Children, 5Rights, Save the Children, Barnardo’s, The Children’s Society and many more, the Government sadly did not accept them. We are grateful to those organisations for their insights and support throughout the Bill’s passage.
We know that children face significant risks online, from bullying and sexist trolling to the most extreme grooming and child abuse. Our amendments focus in particular on preventing grooming and child abuse, but before I speak to them, I associate myself with the amendments tabled by our colleagues in the Scottish National party, the hon. Members for Aberdeen North (Kirsty Blackman) and for Ochil and South Perthshire (John Nicolson). In particular, I associate myself with the sensible changes they have suggested to the Bill at this stage, including a change to children’s access assessments through amendment 162 and a strengthening of duties to prevent harm to children caused by habit-forming features through amendment 190.
Since the Bill was first promised in 2017, the number of online grooming crimes reported to the police has increased by more than 80%. Last year, around 120 sexual communication with children offences were committed every single week, and those are only the reported cases. The NSPCC has warned that that amounts to a
“tsunami of online child abuse”.
We now have the first ever opportunity to legislate for a safer world online for our children.
However, as currently drafted, the Bill falls short by failing to grasp the dynamics of online child abuse and grooming, which rarely occurs on one single platform or app, as mentioned by my hon. Friend the Member for Oldham East and Saddleworth (Debbie Abrahams). In well-established grooming pathways, abusers exploit the design features of open social networks to contact children, then move their communication across to other, more encrypted platforms, including livestreaming sites and encrypted messaging services. For instance, perpetrators manipulate features such as Facebook’s algorithmic friend suggestions to make initial contact with large numbers of children, who they then groom through direct messages before moving to encrypted services such as WhatsApp, where they coerce children into sending sexual images. That range of techniques is often referred to as child abuse breadcrumbing, and is a significant enabler of online child abuse.
I will give a sense of how easy it is for abusers to exploit children by recounting the words and experiences of a survivor, a 15-year-old girl who was groomed on multiple sites:
“I’ve been chatting with this guy online who’s…twice my age. This all started on Instagram but lately all our chats have been on WhatsApp. He seemed really nice to begin with, but then he started making me do these things to ‘prove my trust’ to him, like doing video chats with my chest exposed. Every time I did these things for him, he would ask for more and I felt like it was too late to back out. This whole thing has been slowly destroying me and I’ve been having thoughts of hurting myself.”
I appreciate that it is difficult listening, but that experience is being shared by thousands of other children every year, and we need to be clear about the urgency that is needed to change that.
It will come as a relief to parents and children that, through amendments 58 to 61, the Government have finally agreed to close the loophole that allowed for breadcrumbing to continue. However, I still wish to speak to our amendments 15, 16, and 17 to 19, which were tabled before the Government changed their mind. Together with the Government’s amendments, these changes will bring into scope tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material.
Amendment 15 would ensure that platforms have to include in their illegal content risk assessment content that
“reasonably foreseeably facilitates or aids the discovery or dissemination of CSEA content.”
Amendment 16 would ensure that platforms have to maintain proportionate systems and processes to minimise the presence of such content on their sites. The wording of our amendments is tighter and includes aiding the discovery or dissemination of content, whereas the Government’s amendments cover only “commission or facilitation”. Can the Minister tell me why the Government chose that specific wording and opposed the amendments that we tabled in Committee, which would have done the exact same thing? I hope that in the spirit of collaboration that we have fostered throughout the passage of the Bill with the new Minister and his predecessor, the Minister will consider the merit of our amendments 15 and 16.
Labour is extremely concerned about the significant powers that the Bill in its current form gives to the Secretary of State. We see that approach to the Bill as nothing short of a shameless attempt at power-grabbing from a Government whose so-called world-leading Bill is already failing in its most basic duty of keeping people safe online. Two interlinked issues arise from the myriad powers granted to the Secretary of State throughout the Bill: the first is the unjustified intrusion of the Secretary of State into decisions that are about the regulation of speech, and the second is the unnecessary levels of interference and threats to the independence of Ofcom that arise from the powers of direction to Ofcom in its day-to-day matters and operations. That is not good governance, and it is why Labour has tabled a range of important amendments that the Minister must carefully consider. None of us wants the Bill to place undue powers in the hands of only one individual. That is not a normal approach to regulation, so I fail to see why the Government have chosen to go down that route in this case.
I think we would all agree that when we look at the priority harms set out in the Bill, women and girls are disproportionately the victims of those offences. The groups in society that the Bill will most help are women and girls in our community. I am happy to work with the hon. Lady and all hon. Members to look at what more we can do on this point, both during the passage of the Bill and in future, but as it stands the Bill is the biggest step forward in protecting women and girls, and all users online, that we have ever seen.
I am grateful to the Minister for the offer to work on that further, but we have an opportunity now to make real and lasting change. We talk about how we tackle this issue going forward. How can we solve the problem of violence against women and girls in our community? Three women a week are murdered at the hands of men in this country—that is shocking. How can we truly begin to bring about a culture change? This is how it starts. We have had enough of words. We have had enough of Ministers standing at the Dispatch Box saying, “This is how we are going to tackle violence against women and girls; this is our new plan to do it.” They have an opportunity to create a new law that makes it a priority harm, and that makes women and girls feel like they are being listened to, finally. I urge the Minister and Members in all parts of the House, who know that this is a chance for us finally to take that first step, to vote for new clause 3 today and make women and girls a priority by showing understanding that they receive a disproportionate level of abuse and harm online, and by making them a key component of the Bill.
I join everybody else in welcoming the Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), to the Front Bench. He is astonishingly unusual in that he is both well-intentioned and well-informed, a combination we do not always find among Ministers.
I will speak to my amendments to the Bill. I am perfectly willing to be in a minority of one—one of my normal positions in this House. To be in a minority of one on the issue of free speech is an honourable place to be. I will start by saying that I think the Bill is fundamentally mis-designed. It should have been several Bills, not one. It is so complex that it is very difficult to forecast the consequences of what it sets out to do. It has the most fabulously virtuous aims, but unfortunately the way things will be done under it, with the use of Government organisations to make decisions that, properly, should be taken on the Floor of the House, is in my view misconceived.
We all want the internet to be safe. Right now, there are too many dangers online—we have been hearing about some of them from the hon. Member for Pontypridd (Alex Davies-Jones), who made a fabulous speech from the Opposition Front Bench—from videos propagating terror to posts promoting self-harm and suicide. But in its well-intentioned attempts to address those very real threats, the Bill could actually end up being the biggest accidental curtailment of free speech in modern history.
There are many reasons to be concerned about the Bill. Not all of them are to be dealt with in this part of the Report stage—some will be dealt with later—and I do not have time to mention them all. I will make one criticism of the handling of the Bill at this point. I have seen much smaller Bills have five days on Report in the past. This Bill demands more than two days. That was part of what I said in my point of order at the beginning.
One of the biggest problems is the “duties of care” that the Bill seeks to impose on social media firms to protect users from harmful content. That is a more subtle issue than the tabloid press have suggested. My hon. Friend the Member for Croydon South (Chris Philp), the previous Minister, made that point and I have some sympathy with him. I have spoken to representatives of many of the big social media firms, some of which cancelled me after speeches that I made at the Conservative party conference on vaccine passports. I was cancelled for 24 hours, which was an amusing process, and they put me back up as soon as they found out what they had done. Nevertheless, that demonstrated how delicate and sensitive this issue is. That was a clear suppression of free speech without any of the pressures that are addressed in the Bill.
When I spoke to the firms, they made it plain that they did not want the role of online policemen, and I sympathise with them, but that is what the Government are making them do. With the threat of huge fines and even prison sentences if they consistently fail to abide by any of the duties in the Bill—I am using words from the Bill—they will inevitably err on the side of censorship whenever they are in doubt. That is the side they will fall on.
Worryingly, the Bill targets not only illegal content, which we all want to tackle—indeed, some of the practices raised by the Opposition Front Bencher, the hon. Member for Pontypridd, should simply be illegal full stop—but so-called “legal but harmful” content. Through clause 13, the Bill imposes duties on companies with respect to legal content that is “harmful to adults”. It is true that the Government have avoided using the phrase “legal but harmful” in the Bill, preferring “priority content”, but we should be clear about what that is.
The Bill’s factsheet, which is still on the Government’s website, states on page 1:
“The largest, highest-risk platforms will have to address named categories of legal but harmful material”.
This is not just a question of transparency—they will “have to” address that. It is simply unacceptable to target lawful speech in this way. The “Legal to Say, Legal to Type” campaign, led by Index on Censorship, sums up this point: it is both perverse and dangerous to allow speech in print but not online.
As I said, a company may be asked to address this, which means that it has to set out what its policies are, how it would deal with that content and its terms of service. The Bill does not require a company to remove legal speech that it has no desire to remove. The regulator cannot insist on that, nor can the Government or the Bill. There is nothing to make legal speech online illegal.
That is exactly what the Minister said earlier and, indeed, said to me yesterday when we spoke about this issue. I do not deny that, but this line of argument ignores the unintended consequences that the Bill may have. Its stated aim is to achieve reductions in online harm, not just illegal content. Page 106 of the Government’s impact assessment lists a reduction in the prevalence of legal but harmful content as a “key evaluation” question. The Bill aims to reduce that—the Government say that both in the online guide and the impact assessment. The impact assessment states that an increase in “content moderation” is expected because of the Bill.
A further concern is that the large service providers already have terms and conditions that address so-called legal but harmful content. A duty to state those clearly and enforce them consistently risks legitimising and strengthening the application of those terms and conditions, possibly through automated scanning and removal. That is precisely what happened to me before the Bill was even dreamed of. That was done under an automated system, backed up by somebody in Florida, Manila or somewhere who decided that they did not like what I said. We have to bear in mind how cautious the companies will be. That is especially worrying because, as I said, providers will be under significant pressure from outside organisations to include restrictive terms and conditions. I say this to Conservative Members, and we have some very well-intentioned and very well-informed Members on these Benches: beware of the gamesmanship that will go on in future years in relation to this.
Ofcom and the Department see these measures as transparency measures—that is the line. Lord Michael Grade, who is an old friend of mine, came to see me and he talked about this not as a pressure, but as a transparency measure. However, these are actually pressure measures. If people are made to announce things and talk about them publicly, that is what they become.
It is worth noting that several free speech and privacy groups have expressed scepticism about the provisions, yet they were not called to give oral evidence in Committee. A lot of other people were, including pressure groups on the other side and the tech companies, which we cannot ignore, but free speech advocates were not.
I did of course hear what was said by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis). To be honest, I think that increased scrutiny of content which might constitute abuse or harassment, whether of women or of ethnic minorities, is to be warmly welcomed. The Bill provides that the risk assessors must pay attention to the characteristics of the user. There is no cross-reference to the Equality Act—I know the hon. and learned Lady has submitted a request on that, to which my successor Minister will now be responding—but there are references to characteristics in the provisions on safety duties, and those characteristics do of course include gender and race.
In relation to the risk that these duties are over-interpreted or over-applied, for the first time ever there is a duty for social media firms to have regard to freedom of speech. At present these firms are under no obligation to have regard to it, but clause 19(2) imposes such a duty, and anyone who is concerned about free speech should welcome that. Clauses 15 and 16 go further: clause 15 creates special protections for “content of democratic importance”, while clause 16 does the same for content of journalistic importance. So while I hugely respect and admire my right hon. Friend the Member for Haltemprice and Howden, I do not agree with his analysis in this instance.
I would now like to ask a question of my successor. He may wish to refer to it later or write to me, but if he feels like intervening, I will of course give way to him. I note that four Government amendments have been tabled; I suppose I may have authorised them at some point. Amendments 72, 73, 78 and 82 delete some words in various clauses, for example clauses 13 and 15. They remove the words that refer to treating content “consistently”. The explanatory note attached to amendment 72 acknowledges that, and includes a reference to new clause 14, which defines how providers should go about assessing illegal content, what constitutes illegal content, and how content is to be determined as being in one of the various categories.
As far as I can see, new clause 14 makes no reference to treating, for example, legal but harmful content “consistently”. According to my quick reading—without the benefit of highly capable advice—amendments 72, 73, 78 and 82 remove the obligation to treat content “consistently”, and it is not reintroduced in new clause 14. I may have misread that, or misunderstood it, but I should be grateful if, by way of an intervention, a later speech or a letter, my hon. Friend the Minister could give me some clarification.
I think that the codes of practice establish what we expect the response of companies to be when dealing with priority illegal harm. We would expect the regulator to apply those methods consistently. If my hon. Friend fears that that is no longer the case, I shall be happy to meet him to discuss the matter.
Clause 13(6)(b), for instance, states that the terms of service must be
“applied consistently in relation to content”,
and so forth. As far as I can see, amendment 72 removes the word “consistently”, and the explanatory note accompanying the amendment refers to new clause 14, saying that it does the work of the previous wording, but I cannot see any requirement to act consistently in new clause 14. Perhaps we could pick that up in correspondence later.
If there is any area of doubt, I shall be happy to follow it up, but, as I said earlier, I think we would expect that if the regulator establishes through the codes of practice how a company will respond proactively to identify illegal priority content on its platform, it is inherent that that will be done consistently. We would expect the same approach as part of that process. As I have said, I shall be happy to meet my hon. Friend and discuss any gaps in the process that he thinks may exist, but that is what we expect the outcome to be.
I am grateful to my hon. Friend for his comments. I merely observe that the “consistency” requirements were written into the Bill, and, as far as I can see, are not there now. Perhaps we could discuss it further in correspondence.
Let me turn briefly to clause 40 and the various amendments to it—amendments 44, 45, 13, 46 and others—and the remarks made by the shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), about the Secretary of State’s powers. I intervened on the hon. Lady earlier on this subject. It also arose in Committee, when she and many others made important points on whether the powers in clause 40 went too far and whether they impinged reasonably on the independence of the regulator, in this case Ofcom. I welcome the commitments made in the written ministerial statement laid last Thursday—coincidentally shortly after my departure—that there will be amendments in the Lords to circumscribe the circumstances in which the Secretary of State can exercise those powers to exceptional circumstances. I heard the point made by the hon. Member for Ochil and South Perthshire that it was unclear what “exceptional” meant. The term has a relatively well defined meaning in law, but the commitment in the WMS goes further and says that the bases upon which the power can be exercised will be specified and limited to certain matters such as public health or matters concerning international relations. That will severely limit the circumstances in which those powers can be used, and I think it would be unreasonable to expect Ofcom, as a telecommunications regulator, to have expertise in those other areas that I have just mentioned. I think that the narrowing is reasonable, for the reasons that I have set out.
I agree with my hon. Friend on both points. I discussed the point about researcher access with him last week, when our roles were reversed, so I am sympathetic to that. There is a difference between that and the researcher access that the Digital Services Act in Europe envisages, which will not have the legal powers that Ofcom will have to compel and demand access to information. It will be complementary but it will not replace the primary powers that Ofcom will have, which will really set our regime above those elsewhere. It is certainly my belief that the algorithmic amplification of harmful content must be addressed in the transparency reports and that, where it relates to illegal activities, it must absolutely be within the scope of the regulator to state that actively promoting illegal content to other people is an offence under this legislation.
On my hon. Friend’s first point, he is right to remind the House that the obligations to disclose information to Ofcom are absolute; they are hard-edged and they carry criminal penalties. Researcher access in no way replaces that; it simply acts as a potential complement to it. On his second point about algorithmic promotion, of course any kind of content that is illegal is prohibited, whether algorithmically promoted or otherwise. The more interesting area relates to content that is legal but perceived as potentially harmful. We have accepted that the judgments on whether that content stays up or not are for the platforms to make. If they wish, they can choose to allow that content simply to stay up. However, it is slightly different when it comes to algorithmically promoting it, because the platform is taking a proactive decision to promote it. That may be an area that is worth thinking about a bit more.
On that point, if a platform has a policy not to accept a certain sort of content, I think the regulators should expect it to say in its transparency report what it is doing to ensure that it is not actively promoting that content through a newsfeed, on Facebook or “next up” on YouTube. I expect that to be absolutely within the scope of the powers we have in place.
In terms of content that is legal but potentially harmful, as the Bill is drafted, the platforms will have to set out their policies, but their policies can say whatever they like, as we discussed earlier. A policy could include actively promoting content that is harmful through algorithms, for commercial purposes. At the moment, the Bill as constructed gives them that freedom. I wonder whether that is an area that we can think about making slightly more prescriptive. Giving them the option to leave the content up there relates to the free speech point, and I accept that, but choosing to algorithmically promote it is slightly different. At the moment, they have the freedom to choose to algorithmically promote content that is toxic but falls just on the right side of legality. If they want to do that, that freedom is there, and I just wonder whether it should be. It is a difficult and complicated topic and we are not going to make progress on it today, but it might be worth giving it a little more thought.
I think I have probably spoken for long enough on this Bill, not just today but over the last few months. I broadly welcome these amendments but I am sure that, as the Bill completes its stages, in the other place as well, there will be opportunities to slightly fine-tune it that all of us can make a contribution to.
That is why I am giving the Bill a cautious welcome, but I still stand by my very legitimate concerns about the chilling effect of aspects of this Bill. I will give some examples in a moment about the problems that have arisen when organisations such as Twitter are left to their own devices on their moderation of content policy.
As all hon. Members will be aware, under the Equality Act there are a number of protected characteristics. These include: age; gender reassignment; being married or in a civil partnership; being pregnant or on maternity leave; disability; race, including colour, nationality, ethnic or national origin; religion or belief; sex and sexual orientation. It is against the law to discriminate, victimise or harass anyone because of any of those protected characteristics, but Twitter does discriminate against some of the protected characteristics. It often discriminates against women in the way that I described in an intervention earlier. It takes down expressions of feminist belief, but refuses to take down expressions of the utmost violent intent against women. It also discriminates against women who hold gender-critical beliefs. I remind hon. Members that, in terms of the Employment Appeal Tribunal’s decision in the case of Maya Forstater, the belief that sex matters is worthy of respect in a democratic society and, under the Equality Act, people cannot lawfully discriminate against women, or indeed men, who hold those views.
Twitter also sometimes discriminates against lesbians, gay men and bisexual people who assert that their sexual orientation is on the basis of sex, not gender, despite the fact that same-sex orientation, such as I hold, is a protected characteristic under the Equality Act.
At present, Twitter claims not to be covered by the Equality Act. I have seen correspondence from its lawyers that sets out the purported basis for that claim, partly under reference to schedule 25 to the Equality Act, and partly because it says:
“Twitter UK is included in an Irish Company and is incorporated in the Republic of Ireland. It does pursue economic activity through a fixed establishment in the UK but that relates to income through sales and marketing with the main activity being routed through Ireland.”
I very much doubt whether that would stand up in court, since Twitter is clearly providing a service in the United Kingdom, but it would be good if we took the opportunity of this Bill to clarify that the Equality Act applies to Twitter, so that when it applies moderation of content under the Bill, it will not discriminate against any of the protected characteristics.
The Joint Committee on Human Rights, of which I am currently the acting Chair, looked at this three years ago. We had a Twitter executive before our Committee and I questioned her at length about some of the content that Twitter was content to support in relation to violent threats against women and girls and, on the other hand, some of the content that Twitter took down because it did not like the expression of certain beliefs by feminists or lesbians.
We discovered on the Joint Committee on Human Rights that Twitter’s hateful conduct policy does not include sex as a protected characteristic. It does not reflect the domestic law of the United Kingdom in relation to anti-discrimination law. Back in October 2019, in the Committee’s report on democracy, freedom of expression and freedom of association, we recommended that Twitter should include sex as a protected characteristic in its hateful conduct policy, but Twitter has not done that. It seems Twitter thinks it is above the domestic law of the United Kingdom when it comes to anti-discrimination.
At that Committee, the Twitter executive assured me that certain violent memes that often appear on Twitter directed against women such as me and against many feminists in the United Kingdom, threatening us with death by shooting, should be removed. However, just in the past 48 hours I have seen an example of Twitter’s refusing to remove that meme. Colleagues should be assured that there is a problem here, and I would like us to direct our minds to it, as the Bill gives us an opportunity to do.
Whether or not Twitter is correctly praying in aid the loophole it says there is in the Equality Act—I think that is questionable—the Bill gives us the perfect opportunity to clarify matters. Clause 3 clearly brings Twitter and other online service providers within the regulatory scheme of the Bill as a service with
“a significant number of United Kingdom users”.
The Bill squarely recognises that Twitter provides a service in the United Kingdom to UK users, so it is only a very small step to amend the Bill to make it absolutely clear that when it does so it should be subject to the Equality Act. That is what my new clause 24 seeks to do.
I have also tabled amendments 193 and 191 to ensure that Twitter and other online platforms obey non-discrimination law regarding Ofcom’s production of codes of practice and guidance. The purpose of those amendments is to ensure that Ofcom consults with persons who have expertise in the Equality Act before producing those codes of conduct.
I will not push the new clauses to a vote. I had a very productive meeting with the Minister’s predecessor, the hon. Member for Croydon South (Chris Philp), who expressed a great deal of sympathy when I explained the position to him. I have been encouraged by the cross-party support for the new clauses, both in discussions before today with Members from all parties and in some of the comments made by various hon. Members today.
I am really hoping that the Government will take my new clauses away and give them very serious consideration, that they will look at the Joint Committee’s report from October 2019 and that either they will adopt these amendments or perhaps somebody else will take them forward in the other place.
I can assure the hon. and learned Lady that I am happy to carry on the dialogue that she had with my predecessor and meet her to discuss this at a further date.
I am delighted to hear that. I must tell the Minister that I have had a huge number of approaches from women, from lesbians and from gay men across the United Kingdom who are suffering as a result of Twitter’s moderation policy. There is a lot of support for new clause 24.
Of course, it is important to remember that the Equality Act protects everyone. Gender reassignment is there with the protected characteristics of sex and sexual orientation. It is really not acceptable for a company such as Twitter, which provides a service in the United Kingdom, to seek to flout and ignore the provisions of our domestic law on anti-discrimination. I am grateful to the Minister for the interest he has shown and for his undertaking to meet me, and I will leave it at that for now.
The right hon. Gentleman makes a very important point and, as he knows, there is a wider ongoing Government review related to advertising online, which is a very serious issue. I assure him that we will follow up with colleagues in the Department of Health and Social Care to discuss the points he has raised.
I join everyone else in the House in welcoming the Minister to his place.
I rise to speak in support of amendments 15 and 16. At the core of this issue is the first duty of any Government: to keep people safe. Too often in debates, which can become highly technical, we lose sight of that fact. We are not just talking about technology and regulation; we are talking about real lives and real people. It is therefore incumbent on all of us in this place to have that at the forefront of our minds when discussing such legislation.
Labelling social media as the wild west of today is hardly controversial—that is plain and obvious for all to see. There has been a total failure on the part of social media companies to make their platforms safe for everyone to use, and that needs to change. Regulation is not a dirty word, but a crucial part of ensuring that as the internet plays a bigger role in every generation’s lives, it meets the key duty of keeping people safe. It has been a decade since we first heard of this Bill, and almost four years since the Government committed to it, so I am afraid that there is nothing even slightly groundbreaking about the Bill as it is today. We have seen progress being made in this area around the world, and the UK is falling further and further behind.
Of particular concern to me is the impact on children and young people. As a mother, I worry for the world that my young daughter will grow up in, and I will do all I can in this place to ensure that children’s welfare is at the absolute forefront. I can see no other system or institution that children are allowed to engage with that is so wanting in safeguards and regulation. If there was a faulty slide in a playground, it would be closed off and fixed. If a sports field was covered with glass or litter, that would be reported and dealt with. Whether we like it or not, social media has become the streets our children hang out in, the world they grow up in and the playground they use. It is about time we started treating it with the same care and attention.
There are far too many holes in the Bill that allow for the continued exploitation of children. Labour’s amendments 15 and 16 tackle the deeply troubling issue of “breadcrumbing”. That is where child abusers use social networks to lay trails to illegal content elsewhere online and share videos of abuse edited to fall within content moderation guidelines. The amendments would give the regulators powers to tackle that disgusting practice and ensure that there is a proactive response to it. They would bring into regulatory scope the millions of interactions with accounts that actively enable child abuse. Perhaps most importantly, they would ensure that social media companies tackled child abuse at the earliest possible stage.
In its current form, even with Government amendment 14, the Bill merely reinforces companies’ current focus only on material that explicitly reaches the criminal threshold. That is simply not good enough. Rather than acknowledging that issue, Government amendments 71 and 72 let social media companies off the hook. They remove the requirement for companies to apply their terms and conditions “consistently”. That was addressed very eloquently by the hon. Member for Croydon South (Chris Philp) and the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright), who highlighted that Government amendment 14 simply does not go far enough.
On the amendments that the former Minister, my hon. Friend the Member for Croydon South (Chris Philp), spoke to, the word “consistently” has not been removed from the text. There is new language that follows the use of “consistently”, but the use of that word will still apply in the context of the companies’ duties to act against illegal content.
I welcome the Minister’s clarification and look forward to the amendments being made to the Bill. As they stand, however, the proposals do little other than tie one of our hands behind our back in trying to keep children safe. They will undermine the entire regulatory system, rendering it practically ineffective.
Although I welcome the Bill and some of the Government amendments, it still lacks a focus on ensuring that tech companies have the proper systems in place to fulfil their duty of care and keep our children safe. The children of this country deserve better. That is why I wholeheartedly welcome the amendments tabled by my hon. Friend the Member for Pontypridd (Alex Davies-Jones) and urge Government Members to support them.
I rise to speak to new clauses 25 and 26 in my name. The Government rightly seek to make the UK the safest place in the world to go online, especially for our children, and some of their amendments will start to address previous gaps in the Bill. However, I believe that the Bill still falls short in its aim not only to protect children from harm and abuse, but, importantly, to empower and enable young people to make the most of the online world.
I welcome the comments that the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) made about how we achieve the balance between rights and protecting children from harm. I also welcome his amendments on children’s wellbeing, which seek to achieve that balance.
With one in five children going online, keeping them safe is more difficult but more important than ever. I speak not only as the mother of two very young children who are growing up with iPads in their hands, but as—like everyone else in the Chamber—a constituency Member of Parliament who speaks regularly to school staff and parents who are concerned about the harms caused by social media in particular, but also those caused by games and other services to which children have access.
The Bill proffers a broad and vague definition of content that is legal yet harmful. As many have already said, it should not be the responsibility of the Secretary of State, in secondary legislation, to make decisions about how and where to draw the line; Parliament should set clear laws that address specific, well-defined harms, based on strong evidence. The clear difficulty that the Government have in defining what content is harmful could have been eased had the Bill focused less on removing harmful content and more on why service providers allow harmful content to spread so quickly and widely. Last year, the 5Rights Foundation conducted an experiment in which it created several fake Instagram profiles for children aged between 14 and 17. When the accounts searched for the term “skinny”, while a warning pop-up message appeared, among the top results were
“accounts promoting eating disorders and diets, as well as pages advertising appetite-suppressant gummy bears.”
Ultimately, the business models of these services profit from the spread of such content. New clause 26 requires the Government and Ofcom to focus on ensuring that internet services are safe by design. They should not be using algorithms that give prominence to harmful content. The Bill should focus on harmful systems rather than on harmful content.
It does focus on systems as well as content. We often talk about content because it is the exemplar for the failure of the systems, but the systems are entirely within the scope of the Bill.
I thank the Minister for that clarification, but there are still many organisations out there, not least the Children’s Charities Coalition, that feel that the Bill does not go far enough on safety by design. Concerns have rightly been expressed about freedom of expression, but if we focus on design rather than content, we can protect freedom of expression while keeping children safe at the same time. New clause 26 is about tackling harms downstream, safeguarding our freedoms and, crucially, expanding participation among children and young people. I fear that we will always be on the back foot when trying to tackle harmful content. I fear that regulators or service providers will become over-zealous in taking down what they consider to be harmful content, removing legal content from their platforms just in case it is harmful, or introducing age gates that deny children access to services outright.
Of course, some internet services are clearly inappropriate for children, and illegal content should be removed—I think we all agree on that—but let us not lock children out of the digital world or let their voices be silenced. Forty-three per cent. of girls hold back their opinions on social media for fear of criticism. Children need a way to exercise their rights. Even the Children’s Commissioner for England has said that heavy-handed parental controls that lock children out of the digital world are not the solution.
I tabled new clause 25 because the Bill’s scope, focusing on user-to-user and search services, is too narrow and not sufficiently future-proof. It should cover all digital technology that is likely to be accessed by children. The term
“likely to be accessed by children”
appears in the age-appropriate design code to ensure that the privacy of children’s data is protected. However, that more expansive definition is not included in the Bill, which imposes duties on only a subset of services to keep children safe. Given rapidly expanding technologies such as the metaverse—which is still in its infancy—and augmented reality, as well as addictive apps and games that promote loot boxes and gambling-type behaviour, we need a much more expansive definition.
I will try to avoid too much preamble, but I thank the former Minister, the hon. Member for Croydon South (Chris Philp), for all his work in Committee and for listening to my nearly 200 contributions, for which I apologise. I welcome the new Minister to his place.
As time has been short today, I am keen to meet the Minister to discuss my new clauses and amendments. If he cannot meet me, I would be keen for him to meet the NSPCC, in particular, on some of my concerns.
Amendment 196 is about using proactive technology to identify CSEA content, which we discussed at some length in Committee. The hon. Member for Croydon South made it very clear that we should use scanning to check for child sexual abuse images. My concern is that new clause 38, tabled by the Lib Dems, might exclude proactive scanning to look for child sexual abuse images. I hope that the Government do not lurch in that direction, because we need proactive scanning to keep children protected.
New clause 18 specifically addresses child user empowerment duties. The Bill currently requires that internet service providers have user empowerment duties for adults but not for children, which seems bizarre. Children need to be able to say yes or no. They should be able to make their own choices about excluding content and not receiving unsolicited comments or approaches from anybody not on their friend list, for example. Children should be allowed to do that, but the Bill explicitly says that user empowerment duties apply only to adults. New clause 18 is almost a direct copy of the adult user empowerment duties, with a few extra bits added. It is important that children have access to user empowerment.
Amendment 190 addresses habit-forming features. I have had conversations about this with a number of organisations, including The Mix. I regularly accessed its predecessor, The Site, more than 20 years ago, and it is concerned that 42% of young people surveyed by YoungMinds show addiction-like behaviour in what they are accessing on social media. There is nothing on that in this Bill. The Mix, the Mental Health Foundation, the British Psychological Society, YoungMinds and the Royal College of Psychiatrists are all unhappy about the Bill’s failure to regulate habit-forming features. It is right that we provide support for our children, and it is right that our children are able to access the internet safely, so it is important to address habit-forming behaviour.
Amendment 162 addresses child access assessments. The Bill currently says that providers need to do a child access assessment only if there is a “significant” number of child users. I do not think that is enough and I do not think it is appropriate, and the NSPCC agrees. The amendment would remove the word “significant.” OnlyFans, for example, should not be able to dodge the requirement to child risk assess its services because it does not have a “significant” number of child users. These sites are massively harmful, and we need to ensure changes are made so they cannot wriggle out of their responsibilities.
Finally, amendment 161 is about live, one-to-one oral communications. I understand why the Government want to exempt live, one-to-one oral communications, as they want to ensure that phone calls continue to be phone calls, which is totally fine, but they misunderstand the nature of things like Discord and how people communicate on Fortnite, for example. People are having live, one-to-one oral communications, some of which are used to groom children. We cannot explicitly exempt them and allow a loophole for perpetrators of abuse in this Bill. I understand what the Government are trying to do, but they need to do it in a different way so that children can be protected from the grooming behaviour we see on some online platforms.
Once again, if the Minister cannot accept these amendments, I would be keen to meet him. If he cannot meet me, will he please meet the NSPCC?
We have had a wide-ranging debate, full of passion and expert opinion, from Members in all parts of the House, which shows the depth of interest in this subject, and the depth of concern to see the Bill delivered and to make sure we get it right. I speak as someone who only a couple of days ago became the Minister for online safety, although I was previously involved in engaging with the Government on this subject. As I said in my opening remarks, this has been an iterative process, in which Members from across the House have worked successfully with the Government to improve the Bill. That is the spirit in which we should complete its stages, both in the Commons and in the Lords, and look at how we operate this regime when it has been created.
I wish to start by addressing remarks made by the hon. Member for Pontypridd (Alex Davies-Jones), the shadow Minister, and by the hon. Member for Cardiff North (Anna McMorrin) about violence against women and girls. There is a slight assumption that if the Government do not accept an amendment that writes “violence against women and girls” into the priority harms in the Bill, somehow the Bill does not address that issue. I think we would all agree that that is not the case. The provisions on harmful content that is directed at any individual, particularly the new harms offences approved by the Law Commission, do create offences in respect of content that is likely to lead to actual physical harm or severe psychological harm. As the father of a teenage girl, who was watching earlier but has now gone to do better things, I say that the targeting of young girls, particularly vulnerable ones, with content that is likely to make them more vulnerable is one of the most egregious aspects of the way social media works. It is right that we are looking to address serious levels of self-harm and suicide in the Bill and in the transparency requirements. We are addressing the self-harm and suicide content that falls below the illegal threshold, where a vulnerable young girl is sent and prompted with content that can make her more vulnerable and could lead her to harm herself, or worse. It is absolutely right that that is within the scope of the Bill.
New clause 3, perfectly properly, cites international conventions on violence against women and girls, and how that is defined. At the moment, with the way the Bill is structured, the schedule 7 offences are all based on existing areas of UK law, where there is an existing, clear criminal threshold. Those offences, which are listed extensively, will all apply as priority areas of harm. If there is, through the work of the Law Commission or elsewhere, a clear legal definition of misogyny and violence against women and girls that is not included, I think it should be included within scope. However, if new clause 3 was approved, as tabled, it would be a very different sort of offence, where it would not be as clear where the criminal threshold applied, because it is not cited against existing legislation. My view, and that of the Government, is that existing legislation covers the sorts of offences and breadth of offences that the shadow Minister rightly mentioned, as did other Members. We should continue to look at this—
The Minister is not giving accurate information there. Violence against women and girls is defined by article 3 of the Council of Europe convention on preventing violence against women and domestic violence—the Istanbul convention. So there is that definition and it would be valid to put that in the Bill to ensure that all of that is covered.
I was referring to the amendment’s requirement to list that as part of the priority illegal harms. The priority illegal harms set out in the Bill are all based on existing UK Acts of Parliament where there is a clear established criminal threshold—that is the difference. The spirit of what that convention seeks to achieve, which we would support, is reflected in the harm-based offences written into the Bill. The big change in the structure of the Bill since the draft Bill was published—the Joint Committee on the Draft Online Safety Bill and I pushed for this at the time—is that far more of these offences have been clearly written into the Bill so that it is absolutely clear what they apply to. The new offences proposed by the Law Commission, particularly those relating to self-harm and suicide, are another really important addition. We know what the harms are. We know what we want this Bill to do. The breadth of offences that the hon. Lady and her colleagues have set out is covered in the Bill. But of course as law changes and new offences are put in place, the structure of the Bill, through the inclusion of new schedule 7 on priority offences, gives us the mechanism in the future, through instruments of this House, to add new offences to those primary illegal harms as they occur. I expect that that is what would happen. I believe that the spirit of new clause 3 is reflected in the offences that are written into the Bill.
The hon. Member for Pontypridd mentioned Government new clause 14. It is not true that the Government came up with it out of nowhere; there has been extensive consultation with Ofcom and others. The concern is that some social media companies, and some users of services, may have sought to interpret the criminal threshold as being based on whether a court of law has found that an offence has been committed, and only then might they act. In fact, we want them to pre-empt that, based on a clear understanding of where the legal threshold lies; that is how the regulatory codes work. It is an attempt not to weaken the provision but to bring clarity to the companies and the regulator over its application.
The hon. Member for Ochil and South Perthshire (John Nicolson) raised an important point with regard to the Modern Slavery Act. As the Bill has progressed, we have included existing immigration offences and trafficking offences. Serious offences that exist in law should have an application, either as priority harms or as non-priority legal harms, and we should consider how we do that. I do not know whether he intends to press the amendment, but either way I would be happy to meet him to discuss this further.
My hon. Friend the Member for Solihull, the Chair of the Digital, Culture, Media and Sport Committee, raised an important matter with regard to the power of the Secretary of State, a common theme among several other Members. The hon. Member for Ochil and South Perthshire rightly quoted me, or my Committee’s report, back to me—always a chilling prospect for a politician. I think we have seen significant improvement in the Bill since the draft Bill was published. There was a time when changes to the codes could be made by the negative procedure; now they require a positive vote of both Houses. The Government have recognised that they need to define the exceptional circumstances in which that provision might be used, and to define specifically the areas to which it applies. I accept from the Chair of the Select Committee and my right hon. and learned Friend the Member for Kenilworth and Southam that those things could be interpreted quite broadly—maybe more broadly than people would like—but I believe that progress has been made in setting out those powers.
I would also say that this applies only to the period when the codes of practice are being agreed, before they are laid before Parliament; it is not a general provision. I think there has sometimes been a sense that the Secretary of State can at any time pick up the phone to Ofcom and have it amend the codes. Once the codes are approved by the House, they are fixed. The codes do not relate to the duties; the duties are set out in the legislation. The codes are simply the guidance given to companies on how they comply. There may well be circumstances in which the Secretary of State looks at the draft codes and says, “Actually, we think Ofcom has given the tech companies too easy a ride here. We expected the legislation to push them further.” It is therefore understandable that, at the draft stage, the Secretary of State might wish to have the power to raise that question—not to dictate to Ofcom, but to ask it to come back with amendments.
I take on board the spirit of what Members have said and the interest that the Select Committee has shown. I am happy to continue that dialogue, and obviously the Government will take forward the issues set out in the letter sent round to Members last week, showing how we seek to bring in that definition.
A number of Members raised the issue of freedom of speech provisions, particularly my hon. Friend the Member for Windsor (Adam Afriyie) at the end of his excellent speech. In the Government amendments, we have sought to bring additional clarity to the way the legislation works, so that it is absolutely clear what the priority legal offences are and, where we have transparency requirements, exactly what they apply to. The amendment that the Government tabled reflects the work that he and his colleagues have done: it sets out that it should be perfectly possible for tech companies to say, in their terms of service, that a given area is not one where they intend to take enforcement action, and the Bill does not require them to do so.
The hon. Member for Batley and Spen (Kim Leadbeater) mentioned Zach’s law. The hon. Member for Ochil and South Perthshire raised it before the Joint Committee, as did my hon. Friend the Member for Watford (Dean Russell); both have been great advocates on the issue. It is a good example of how a clear offence—something we all agree to be wrong—can be tackled through this legislation; in this case, a new offence will be created to prevent the pernicious targeting of people with epilepsy with flashing images.
Finally, in response to the speech by the hon. Member for Aberdeen North (Kirsty Blackman), I certainly will continue the dialogue with the NSPCC on the serious issues that she has raised. Obviously, child protection is foremost in our minds as we consider the legislation. She made some important points about the ability to scan for images on encrypted services. The Government have recently made further announcements on that, which will be reflected as the Bill progresses through the House.
To assist the House, I anticipate two votes on this first section and one vote immediately on the next, because it has already been moved and debated.
I am anticipating another Division, as I said, and then I understand there may be some points of order, which I will hear after that Division.
That concludes proceedings on new clauses, new schedules and amendments to those parts of the Bill that have to be concluded by 4.30 pm.
It has been pointed out to me that, in this unusually hot weather, Members should please remember to drink more water. I tried it myself once. [Laughter.]
In accordance with the programme (No. 2) order of today, we now come to new clauses, new schedules and amendments relating to those parts of the Bill to be concluded by 7 pm. We begin with new clause 14, which the House has already debated. I therefore call the Minister to move new clause 14 formally.
New Clause 14
Providers’ judgements about the status of content
“(1) This section sets out the approach to be taken where—
(a) a system or process operated or used by a provider of a Part 3 service for the purpose of compliance with relevant requirements, or
(b) a risk assessment required to be carried out by Part 3, involves a judgement by a provider about whether content is content of a particular kind.
(2) Such judgements are to be made on the basis of all relevant information that is reasonably available to a provider.
(3) In construing the reference to information that is reasonably available to a provider, the following factors, in particular, are relevant—
(a) the size and capacity of the provider, and
(b) whether a judgement is made by human moderators, by means of automated systems or processes or by means of automated systems or processes together with human moderators.
(4) Subsections (5) to (7) apply (as well as subsection (2)) in relation to judgements by providers about whether content is—
(a) illegal content, or illegal content of a particular kind, or
(b) a fraudulent advertisement.
(5) In making such judgements, the approach to be followed is whether a provider has reasonable grounds to infer that content is content of the kind in question (and a provider must treat content as content of the kind in question if reasonable grounds for that inference exist).
(6) Reasonable grounds for that inference exist in relation to content and an offence if, following the approach in subsection (2), a provider—
(a) has reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied, and
(b) does not have reasonable grounds to infer that a defence to the offence may be successfully relied upon.
(7) In the case of content generated by a bot or other automated tool, the tests mentioned in subsection (6)(a) and (b) are to be applied in relation to the conduct or mental state of a person who may be assumed to control the bot or tool (or, depending what a provider knows in a particular case, the actual person who controls the bot or tool).
(8) In considering a provider’s compliance with relevant requirements to which this section is relevant, OFCOM may take into account whether providers’ judgements follow the approaches set out in this section (including judgements made by means of automated systems or processes, alone or together with human moderators).
(9) In this section—
“fraudulent advertisement” has the meaning given by section 34 or 35 (depending on the kind of service in question);
“illegal content” has the same meaning as in Part 3 (see section 52);
“relevant requirements” means—
(a) duties and requirements under this Act, and
(b) requirements of a notice given by OFCOM under this Act.”—(Damian Collins.)
This new clause clarifies how providers are to approach judgements (human or automated) about whether content is content of a particular kind, and in particular, makes provision about how questions of mental state and defences are to be approached when considering whether content is illegal content or a fraudulent advertisement.
Brought up.
Question put, That the clause be added to the Bill.
I absolutely agree. We can also look at this from the point of view of gambling reform and the age verification required for that. The technology is there, and we can harness and use it to protect people. All I am asking is that we do not let this slip through the cracks this evening.
We have had an important debate, raising a series of extremely important topics. While the Government may not agree with the amendments that have been tabled, that is not because of any lack of serious concern about the issues that have been raised.
The right hon. Member for Kingston upon Hull North (Dame Diana Johnson) spoke very powerfully. I have also met Leigh Nicol, the lady she cited, and she discussed her experience with me—sadly, it was during lockdown, so it was a virtual meeting rather than face to face. Many young women, in particular, have experienced the horror of having intimate images shared online without their knowledge or consent, and have then gone through the difficult process of trying to get them removed, even when it is absolutely clear that they should be removed and are there without consent. It is the responsibility of the companies and the platforms to act on that.
Thinking about where we are now, before the Bill passes: the requirement to deal with illegal content on the platforms, even the worst illegal content, is still largely based on the reporting of that content, without our having any way of knowing how effective the platforms are at actually removing it, and it rests largely on old legislation. The Bill will move things on significantly by creating proactive responsibilities not just to discover illegal content but to act to mitigate it, and to be audited on how effectively that is done. Under the Bill, that includes content that would be considered abuse of children—a child cannot consent to sex or to appearing in pornographic content—and companies need to make sure that what they are doing is sufficient to meet that requirement.
It should be for the regulator, Ofcom, as part of putting together the codes of practice, to understand what systems companies have in place—even for the more extreme content—to ensure that they are complying with the law and are certainly not knowingly hosting content that has been flagged to them as non-consensual pornography or child abuse images, which is effectively what pornography involving minors would be; and, as hon. Members have said, to make sure that companies are using the available technologies to deliver that.
We have an opportunity here today to make sure that the companies are doing that, and I am not entirely sure why we would not take the opportunity to legislate to make sure that they are. With the greatest of respect to the Minister, who is back in a position of authority, it sounds an awful lot like the triumph of hope over experience.
It is because of the danger of such a sentiment that this Bill is so important. It not only sets the targets and requirements for companies to act against illegal content, but enables a regulator to ensure that they have the systems and processes in place to do it, that they are using appropriate technology and that they apply the principle that their systems should be effective at addressing this issue. If those systems are deficient, that is a failure on the company’s part. It cannot be good enough for a company to say, “It is too difficult to do,” when it is not using technologies that would readily solve the problem. We believe that the technologies the companies have, and the powers of the regulator to put proper codes of practice in place and to order the companies to comply, will be sufficient to address the concern that the hon. Lady raises.
I am a little taken aback that the Minister believes that the legislation will be sufficient. I do not understand why he has not responded to the point that my hon. Friend the Member for Birmingham, Yardley (Jess Phillips) was making that we could make this happen by putting the proposal in the Bill and saying, “This is a requirement.” I am not sure why he thinks that is not the best way forward.
It is because the proposal would not make such content more illegal than it is now. It is already illegal and there are already legal duties on companies to act. The regulator’s job is to ensure they have the systems in place to do that effectively, and that is what the Bill sets out. We believe that the Bill addresses the serious issue that the right hon. Lady raises in her amendments. That legal requirement is there, as is the ability to have the systems in place.
If I may, I will give a different example, building on the fraud example given by the shadow Minister, the hon. Member for Worsley and Eccles South (Barbara Keeley). On the Joint Committee that scrutinised the Bill, we pushed hard to have fraudulent ads included within the scope of the Bill, which has been one of the important amendments to it. The regulator can consider what systems a company should have in place to identify fraud, but also what technologies it employs to make it far less likely that fraud will be there in the first place. Google has a deal with the Financial Conduct Authority whereby it prevents non-accredited companies from advertising on its platform. That makes it far less likely that fraudulent ads will appear because, if the system works, only properly recognised organisations will be advertising.
Facebook does not have such a system in place. As a consequence, since the Google system went live, we have seen a dramatic drop in fraudulent ads on Google but a substantial increase in fraudulent ads on Facebook and on platforms such as Instagram. That shows that with the right systems in place we can get a better outcome and change the result. The job of the regulator with illegal pornography and other illegal content should be to look at those systems and ask, “Do the companies have the right technology to deliver the result that is required?” If they do not, that would still be a failure under the codes.
The Minister is quoting a case that I quoted in Committee, when the former Minister, the hon. Member for Croydon South (Chris Philp), would not accept amendments on this issue. We could have tightened up on fraudulent advertising: if Google can do that for financial ads, other platforms can do it. We tabled an amendment that the Government did not accept. I do not know why this Minister is quoting something that we quoted in Committee—I know he was not there, but he needs to know that we tried this and the former Minister did not accept what we called for.
I am quoting that case merely because it is a good example of how better systems produce better results. As part of the codes of practice, Ofcom will be able to look at some of these other systems and say to companies, “This is not just about content moderation; it is about having better systems that detect known illegal activity earlier and prevent it from getting on to the platform.” The test is not how quickly content is removed, but how effective companies are at stopping it ever being there in the first place. That is within the scope of regulation, and my belief is that those powers exist at the moment and should therefore be used.
Just to push on this point: images of me have appeared on pornographic sites. They were not necessarily illegal images of anything bad happening to me, but other Members of this House and I have suffered from that. Is the Minister telling me that this Bill will allow me to get in touch with such a site and have an assurance that the image will be taken down, and that the site would be breaking the law if it did not do so?
The Bill absolutely addresses the sharing of non-consensual images in that way, so that would be something the regulator should take enforcement action against—
Well, the regulator is required, and has the power, to take enforcement action against companies for failing to do so. That is what the legislation sets out, and we will be in a very different place from where we are now. That is why the Bill constitutes a very significant reform.
Could the Minister give me a reassurance about when consent is withdrawn? The image may initially have been there “consensually”—I put that in inverted commas—so the platform was entitled to host it. However, if someone contacts the platform to say that they now withdraw their consent—they may want to take on a role in public life, having previously had a different role; I am not saying that about my hon. Friend the Member for Birmingham, Yardley (Jess Phillips)—my understanding is that there is no legal ability to enforce that content coming down. Can the Minister correct me, and if not, why is he not supporting new clause 7?
The position of people who have appeared in pornographic films consensually and signed contracts to do so is very different from that of people whose intimate images have been shared without their consent. I am saying that the Bill sets out very clearly—the draft Bill did not—that non-consensual sexual images and extreme pornography are within the scope of the regulator’s powers. The regulator should take action not just on what a company does to take such content down when it is discovered after the event, but on what systems the company has in place and whether it deploys all available technology to make sure that such content is never there in the first place.
Before closing, I want to touch briefly on the point raised about the Secretary of State’s powers to designate priority areas of harm. This is now subject to the affirmative procedure in the Bill and requires the approval of both Houses of Parliament. The priority illegal harms will be based on offences that already exist in law, and we are writing those priority offences into the Bill. The other priorities will be areas where the regulator will seek to test whether companies adhere to their terms of service. The new transparency requirements will set that out, and the Government have said that we will set out in more detail which of those priority areas of harm such transparency will apply to. There is still more work to be done on that, but we have given an indicative example. When it comes to adding a new priority illegal offence to the Bill, however, the premise is that it will already be an offence that Parliament has created, and writing it into the Bill will be done with the positive consent of Parliament. I think that is a substantial improvement on where the Bill was before. I am conscious that I have filled my time.
Question put, That the clause be read a Second time.
(1 year, 12 months ago)
Commons Chamber
I beg to move, That the clause be read a Second time.
With this it will be convenient to discuss the following:
Government new clause 12—Warning notices.
Government new clause 20—OFCOM’s reports about news publisher content and journalistic content.
Government new clause 40—Amendment of Enterprise Act 2002.
Government new clause 42—Former providers of regulated services.
Government new clause 43—Amendments of Part 4B of the Communications Act.
Government new clause 44—Repeal of Part 4B of the Communications Act: transitional provision etc.
Government new clause 51—Publication by providers of details of enforcement action.
Government new clause 52—Exemptions from offence under section 152.
Government new clause 53—Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland (No.2).
New clause 1—Provisional re-categorisation of a Part 3 service—
“(1) This section applies in relation to OFCOM’s duty to maintain the register of categories of regulated user-to-user services and regulated search services under section 83.
(2) If OFCOM—
(a) consider that a Part 3 service not included in a particular part of the register is likely to meet the threshold conditions relevant to that part, and
(b) reasonably consider that urgent application of duties relevant to that part is necessary to avoid or mitigate significant harm,
New clause 16—Communication offence for encouraging or assisting self-harm—
“(1) In the Suicide Act 1961, after section 3 insert—
“3A Communication offence for encouraging or assisting self-harm
(1) A person (“D”) commits an offence if—
(a) D sends a message,
(b) the message encourages or could be used to assist another person (“P”) to inflict serious physical harm upon themselves, and
(c) D’s act was intended to encourage or assist the infliction of serious physical harm.
(2) The person referred to in subsection (1)(b) need not be a specific person (or class of persons) known to, or identified by, D.
(3) D may commit an offence under this section whether or not any person causes serious physical harm to themselves, or attempts to do so.
(4) A person guilty of an offence under this section is liable—
(a) on summary conviction, to imprisonment for a term not exceeding 12 months, or a fine, or both;
(b) on indictment, to imprisonment for a term not exceeding 5 years, or a fine, or both.
(5) “Serious physical harm” means serious injury amounting to grievous bodily harm within the meaning of the Offences Against the Person Act 1861.
(6) No proceedings shall be instituted for an offence under this section except by or with the consent of the Director of Public Prosecutions.
(7) If D arranges for a person (“D2”) to do an act and D2 does that act, D is also to be treated as having done that act for the purposes of subsection (1).
(8) In proceedings for an offence to which this section applies, it shall be a defence for D to prove that—
(a) P had expressed intention to inflict serious physical harm upon themselves prior to them receiving the message from D; and
(b) P’s intention to inflict serious physical harm upon themselves was not initiated by D; and
(c) the message was wholly motivated by compassion towards P or to promote the interests of P’s health or wellbeing.””
This new clause would create a new communication offence for sending a message encouraging or assisting another person to self-harm.
New clause 17—Liability of directors for compliance failure—
“(1) This section applies where OFCOM considers that there are reasonable grounds for believing that a provider of a regulated service has failed, or is failing, to comply with any enforceable requirement (see section 112) that applies in relation to the service.
(2) If OFCOM considers that the failure results from any—
(a) action,
(b) direction,
(c) neglect, or
(d) with the consent
This new clause would enable Ofcom to exercise its enforcement powers under Chapter 6, Part 7 of the Bill against individual directors, managers and other officers at a regulated service provider where it considers the provider has failed, or is failing, to comply with any enforceable requirement.
New clause 23—Financial support for victims support services—
“(1) The Secretary of State must by regulations make provision for penalties paid under Chapter 6 to be used for funding for victims support services.
(2) Those regulations must—
(a) specify criteria setting out which victim support services are eligible for financial support under this provision;
(b) set out a means by which the amount of funding available should be determined;
(c) make provision for the funding to be reviewed and allocated on a three year basis.
(3) Regulations under this section—
(a) shall be made by statutory instrument, and
(b) may not be made unless a draft has been laid before and approved by resolution of each House of Parliament.”
New clause 28—Establishment of Advocacy Body—
“(1) There is to be a body corporate (“the Advocacy Body”) to represent interests of child users of regulated services.
(2) A “child user”—
(a) means any person aged 17 years or under who uses or is likely to use regulated internet services; and
(b) includes both any existing child user and any future child user.
(3) The work of the Advocacy Body may include—
(a) representing the interests of child users;
(b) the protection and promotion of these interests;
(c) any other matter connected with those interests.
(4) The “interests of child users” means the interests of children in relation to the discharge by any regulated company of its duties under this Act, including—
(a) safety duties about illegal content, in particular CSEA content;
(b) safety duties protecting children;
(c) “enforceable requirements” relating to children.
(5) The Advocacy Body must have particular regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010.
(6) The Advocacy Body will be defined as a statutory consultee for OFCOM’s regulatory decisions which impact upon the interests of children.
(7) The Advocacy Body must assess emerging threats to child users of regulated services and must bring information regarding these threats to OFCOM.
(8) The Advocacy Body may undertake research on their own account.
(9) The Secretary of State must either appoint an organisation known to represent children to be designated the functions under this Act, or create an organisation to carry out the designated functions.
(10) The budget of the Advocacy Body will be subject to annual approval by the board of OFCOM.
(11) The Secretary of State must give directions to OFCOM as to how it should recover the costs relating to the expenses of the Advocacy Body, or the Secretary of State in relation to the establishment of the Advocacy Body, through the provisions to require a provider of a regulated service to pay a fee (as set out in section 71).”
New clause 29—Duty to promote media literacy: regulated user-to-user services and search services—
“(1) In addition to the duty on OFCOM to promote media literacy under section 11 of the Communications Act 2003, OFCOM must take such steps as they consider appropriate to improve the media literacy of the public in relation to regulated user-to-user services and search services.
(2) This section applies only in relation to OFCOM’s duty to regulate—
(a) user-to-user services, and
(b) search services.
(3) OFCOM’s performance of its duty in subsection (1) must include pursuit of the following objectives—
(a) to reach audiences who are less engaged with, and harder to reach through, traditional media literacy initiatives;
(b) to address gaps in the availability and accessibility of media literacy provisions targeted at vulnerable users;
(c) to build the resilience of the public to disinformation and misinformation by using media literacy as a tool to reduce the harm from that misinformation and disinformation;
(d) to promote greater availability and effectiveness of media literacy initiatives and other measures, including by—
(i) carrying out, commissioning or encouraging educational initiatives designed to improve the media literacy of the public;
(ii) seeking to ensure, through the exercise of OFCOM’s online safety functions, that providers of regulated services take appropriate measures to improve users’ media literacy;
(iii) seeking to improve the evaluation of the effectiveness of the initiatives and measures mentioned in sub-paragraphs (3)(d)(i) and (ii) (including by increasing the availability and adequacy of data to make those evaluations);
(e) to promote better coordination within the media literacy sector.
(4) OFCOM may prepare such guidance about the matters referred to in subsection (3) as it considers appropriate.
(5) Where OFCOM prepares guidance under subsection (4) it must—
(a) publish the guidance (and any revised or replacement guidance); and
(b) keep the guidance under review.
(6) OFCOM must co-operate with the Secretary of State in the exercise and performance of their duty under this section.”
This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.
New clause 30—Media literacy strategy—
“(1) OFCOM must prepare a strategy which sets out how they intend to undertake their duty to promote media literacy in relation to regulated user-to-user services and regulated search services under section (Duty to promote media literacy: regulated user-to-user services and search services).
(2) The strategy must—
(a) set out the steps OFCOM propose to take to achieve the pursuit of the objectives set out in section (Duty to promote media literacy: regulated user-to-user services and search services),
(b) set out the organisations, or types of organisations, that OFCOM propose to work with in undertaking the duty;
(c) explain why OFCOM considers that the steps it proposes to take will be effective;
(d) explain how OFCOM will assess the extent of the progress that is being made under the strategy.
(3) In preparing the strategy OFCOM must have regard to the need to allocate adequate resources for implementing the strategy.
(4) OFCOM must publish the strategy within the period of 6 months beginning with the day on which this section comes into force.
(5) Before publishing the strategy (or publishing a revised strategy), OFCOM must consult—
(a) persons with experience in or knowledge of the formulation, implementation and evaluation of policies and programmes intended to improve media literacy;
(b) the advisory committee on disinformation and misinformation, and
(c) any other person that OFCOM consider appropriate.
(6) If OFCOM have not revised the strategy within the period of 3 years beginning with the day on which the strategy was last published, they must either—
(a) revise the strategy, or
(b) publish an explanation of why they have decided not to revise it.
(7) If OFCOM decides to revise the strategy they must—
(a) consult in accordance with subsection (5), and
(b) publish the revised strategy.”
This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.
New clause 31—Research conducted by regulated services—
“(1) OFCOM may, at any time it considers appropriate, produce a report into how regulated services commission, collate, publish and make use of research.
(2) For the purposes of the report, OFCOM may require services to submit to OFCOM—
(a) a specific piece of research held by the service, or
(b) all research the service holds on a topic specified by OFCOM.”
New clause 34—Factual Accuracy—
“(1) The purpose of this section is to reduce the risk of harm to users of regulated services caused by disinformation or misinformation.
(2) Any Regulated Service must provide an index of the historic factual accuracy of material published by each user who has—
(a) produced user-generated content,
(b) news publisher content, or
(c) comments and reviews on provider content.
(3) The index under subsection (2) must—
(a) satisfy minimum quality criteria to be set by OFCOM, and
(b) be displayed in a way which allows any user easily to reach an informed view of the likely factual accuracy of the content at the same time as they encounter it.”
New clause 35—Duty of balance—
“(1) The purpose of this section is to reduce the risk of harm to users of regulated services caused by disinformation or misinformation.
(2) Any Regulated Service which selects or prioritises particular—
(a) user-generated content,
(b) news publisher content, or
(c) comments and reviews on provider content
New clause 36—Identification of information incidents by OFCOM—
“(1) OFCOM must maintain arrangements for identifying and understanding patterns in the presence and dissemination of harmful misinformation and disinformation on regulated services.
(2) Arrangements for the purposes of subsection (1) must in particular include arrangements for—
(a) identifying, and assessing the severity of, actual or potential information incidents; and
(b) consulting with persons with expertise in the identification, prevention and handling of disinformation and misinformation online (for the purposes of subsection (2)(a)).
(3) Where an actual or potential information incident is identified, OFCOM must as soon as reasonably practicable—
(a) set out any steps that OFCOM plans to take under its online safety functions in relation to that situation; and
(b) publish such recommendations or other information that OFCOM considers appropriate.
(4) Information under subsection (3) may be published in such a manner as appears to OFCOM to be appropriate for bringing it to the attention of the persons who, in OFCOM’s opinion, should be made aware of it.
(5) OFCOM must prepare and issue guidance about how it will exercise its functions under this section and, in particular—
(a) the matters it will take into account in determining whether an information incident has arisen;
(b) the matters it will take into account in determining the severity of an incident; and
(c) the types of responses that OFCOM thinks are likely to be appropriate when responding to an information incident.
(6) For the purposes of this section—
“harmful misinformation or disinformation” means misinformation or disinformation which, taking into account the manner and extent of its dissemination, may have a material adverse effect on users of regulated services or other members of the public;
“information incident” means a situation where it appears to OFCOM that there is a serious or systemic dissemination of harmful misinformation or disinformation relating to a particular event or situation.”
This new clause would insert a new clause into the Bill to give Ofcom a proactive role in identifying and responding to the sorts of information incidents that can occur in moments of crisis.
New clause 37—Duty to promote media literacy: regulated user-to-user services and search services—
“(1) In addition to the duty on OFCOM to promote media literacy under section 11 of the Communications Act 2003, OFCOM must take such steps as they consider appropriate to improve the media literacy of the public in relation to regulated user-to-user services and search services.
(2) This section applies only in relation to OFCOM’s duty to regulate—
(a) user-to-user services, and
(b) search services.
(3) OFCOM’s performance of its duty in subsection (1) must include pursuit of the following objectives—
(a) to encourage the development and use of technologies and systems in relation to user-to-user services and search services which help to improve the media literacy of members of the public, including in particular technologies and systems which—
(i) indicate the nature of content on a service (for example, show where it is an advertisement);
(ii) indicate the reliability and accuracy of the content; and
(iii) facilitate control over what content is received;
(b) to build the resilience of the public to disinformation and misinformation by using media literacy as a tool to reduce the harm from that misinformation and disinformation;
(c) to promote greater availability and effectiveness of media literacy initiatives and other measures, including by carrying out, commissioning or encouraging educational initiatives designed to improve the media literacy of the public.
(4) OFCOM must prepare guidance about—
(a) the matters referred to in subsection (3) as it considers appropriate; and
(b) minimum standards that media literacy initiatives must meet.
(5) Where OFCOM prepares guidance under subsection (4) it must—
(a) publish the guidance (and any revised or replacement guidance); and
(b) keep the guidance under review.
(6) Every report under paragraph 12 of the Schedule to the Office of Communications Act 2002 (OFCOM’s annual report) for a financial year must contain a summary of the steps that OFCOM have taken under subsection (1) in that year.”
This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.
New clause 45—Sharing etc intimate photographs or film without consent—
“(1) A person (A) commits an offence if—
(a) A intentionally shares an intimate photograph or film of another person (B) with B or with a third person (C); and
(b) A does so—
(i) without B’s consent, and
(ii) without reasonably believing that B consents.
(2) References to a third person (C) in this section are to be read as referring to—
(a) an individual;
(b) a group of individuals;
(c) a section of the public; or
(d) the public at large.
(3) A person (A) does not commit an offence under this section if A shares a photograph or film of another person (B) with B or a third person (C) if—
(a) the photograph or film only shows activity that would ordinarily be seen on a public street, except for a photograph or film of breastfeeding;
(b) the photograph or film was taken in public, where the person depicted was voluntarily nude, partially nude or engaging in a sexual act or toileting in public;
(c) A reasonably believed that the photograph or film, taken in public, showed a person depicted who was voluntarily nude, partially nude or engaging in a sexual act or toileting in public;
(d) the photograph or film has been previously shared with consent in public;
(e) A reasonably believed that the photograph or film had been previously shared with consent in public;
(f) the photograph or film shows a young child and is of a kind ordinarily shared by family and friends;
(g) the photograph or film is of a child shared for that child’s medical care or treatment, where there is parental consent.
(4) A person (A) does not commit an offence under this section if A shares information about where to access a photograph or film where this photograph or film has already been made available to A.
(5) It is a defence for a person charged with an offence under this section to prove that they—
(a) reasonably believed that the sharing was necessary for the purposes of preventing, detecting, investigating or prosecuting crime;
(b) reasonably believed that the sharing was necessary for the purposes of legal or regulatory proceedings;
(c) reasonably believed that the sharing was necessary for the administration of justice;
(d) reasonably believed that the sharing was necessary for a genuine medical, scientific or educational purpose; and
(e) reasonably believed that the sharing was in the public interest.
(6) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—
(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;
(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;
(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;
(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.
(7) References to sharing such a photograph or film with another person include—
(a) sending it to another person by any means, electronically or otherwise;
(b) showing it to another person;
(c) placing it for another person to find; or
(d) sharing it on or uploading it to a user-to-user service, including websites or online public forums.
(8) “Photograph” includes the negative as well as the positive version.
(9) “Film” means a moving image.
(10) References to a photograph or film include—
(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,
(b) an image which has been altered through computer graphics,
(c) a copy of a photograph, film or image, and
(d) data stored by any means which is capable of conversion into a photograph, film or image.
(11) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.
(12) A person who commits an offence under this section is liable on summary conviction, to imprisonment for a term not exceeding 6 months or a fine (or both).”
This new clause creates the offence of sharing an intimate image without consent, providing the necessary exclusions such as for children’s medical care or images taken in public places, and establishing the penalty as triable by magistrates only with maximum imprisonment of 6 months.
New clause 46—Sharing etc intimate photographs or film with intent to cause alarm, distress or humiliation—
“(1) A person (A) commits an offence if—
(a) A intentionally shares an intimate photograph or film of another person (B) with B or with a third person (C); and
(b) A does so—
(i) without B’s consent, and
(ii) without reasonably believing that B consents; and
(c) A intends that the subject of the photograph or film will be caused alarm, distress or humiliation by the sharing of the photograph or film.
(2) References to a third person (C) in this section are to be read as referring to—
(a) an individual;
(b) a group of individuals;
(c) a section of the public; or
(d) the public at large.
(3) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—
(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;
(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;
(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;
(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.
(4) References to sharing such a photograph or film with another person include—
(a) sending it to another person by any means, electronically or otherwise;
(b) showing it to another person;
(c) placing it for another person to find; or
(d) sharing it on or uploading it to a user-to-user service, including websites or online public forums.
(5) “Photograph” includes the negative as well as the positive version.
(6) “Film” means a moving image.
(7) References to a photograph or film include—
(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,
(b) an image which has been altered through computer graphics,
(c) a copy of a photograph, film or image, and
(d) data stored by any means which is capable of conversion into a photograph, film or image.
(8) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.
(9) A person who commits an offence under this section is liable—
(a) on summary conviction, to imprisonment for a term not exceeding 12 months or a fine (or both);
(b) on conviction on indictment, to imprisonment for a term not exceeding three years.”
This new clause creates a more serious offence where there is the intent to cause alarm etc. by sharing an image, with the appropriately more serious penalty of 12 months through a magistrates’ court or up to three years in a Crown Court.
New clause 47—Sharing etc intimate photographs or film without consent for the purpose of obtaining sexual gratification—
“(1) A person (A) commits an offence if—
(a) A intentionally shares an intimate photograph or film of another person (B) with B or with a third person (C); and
(b) A does so—
(i) without B’s consent, and
(ii) without reasonably believing that B consents; and
(c) A shared the photograph or film for the purpose of obtaining sexual gratification (whether for the sender or recipient).
(2) References to a third person (C) in this section are to be read as referring to—
(a) an individual;
(b) a group of individuals;
(c) a section of the public; or
(d) the public at large.
(3) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—
(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;
(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;
(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;
(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.
(4) References to sharing such a photograph or film with another person include—
(a) sending it to another person by any means, electronically or otherwise;
(b) showing it to another person;
(c) placing it for another person to find; or
(d) sharing it on or uploading it to a user-to-user service, including websites or online public forums.
(5) “Photograph” includes the negative as well as the positive version.
(6) “Film” means a moving image.
(7) References to a photograph or film include—
(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,
(b) an image which has been altered through computer graphics,
(c) a copy of a photograph, film or image, and
(d) data stored by any means which is capable of conversion into a photograph, film or image.
(8) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.
(9) A person who commits an offence under this section is liable—
(a) on summary conviction, to imprisonment for a term not exceeding 12 months or a fine (or both);
(b) on conviction on indictment, to imprisonment for a term not exceeding three years.”
This new clause creates a more serious offence where the sharing is for the purpose of obtaining sexual gratification, with the appropriately more serious penalty of 12 months through a magistrates’ court or up to three years in a Crown Court.
New clause 48—Threatening to share etc intimate photographs or film—
“(1) A person (A) commits an offence if—
(a) A threatens to share an intimate photograph or film of another person (B) with B or a third person (C); and
(b) A intends B to fear that the threat will be carried out, or A is reckless as to whether B will fear that the threat will be carried out.
(2) “Threatening to share” should be read to include threatening to share an intimate photograph or film that does not exist and other circumstances where it is impossible for A to carry out the threat.
(3) References to a third person (C) in this section are to be read as referring to—
(a) an individual;
(b) a group of individuals;
(c) a section of the public; or
(d) the public at large.
(4) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—
(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;
(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;
(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;
(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.
(5) References to sharing, or threatening to share, such a photograph or film with another person include—
(a) sending, or threatening to send, it to another person by any means, electronically or otherwise;
(b) showing, or threatening to show, it to another person;
(c) placing, or threatening to place, it for another person to find; or
(d) sharing, or threatening to share, it on or uploading it to a user-to-user service, including websites or online public forums.
(6) “Photograph” includes the negative as well as the positive version.
(7) “Film” means a moving image.
(8) References to a photograph or film include—
(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,
(b) an image which has been altered through computer graphics,
(c) a copy of a photograph, film or image, and
(d) data stored by any means which is capable of conversion into a photograph, film or image.
(9) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.
(10) A person who commits an offence under this section is liable—
(a) on summary conviction, to imprisonment for a term not exceeding 12 months or a fine (or both);
(b) on conviction on indictment, to imprisonment for a term not exceeding three years.”
This new clause creates another more serious offence of threatening to share an intimate image, regardless of whether such an image actually exists, and where the sender intends to cause fear, or is reckless to whether they would cause fear, punishable by 12 months through a magistrates’ court or up to three years in a Crown Court.
New clause 49—Special measures in criminal proceedings for offences involving the sharing of intimate images—
“(1) Chapter 1 of Part 2 of the Youth Justice and Criminal Evidence Act 1999 (giving of evidence or information for purposes of criminal proceedings: special measures directions in case of vulnerable and intimidated witnesses) is amended as follows.
(2) In section 17 (witnesses eligible for assistance on grounds of fear or distress about testifying), in subsection (4A) after paragraph (b) insert “(c) ‘an offence under sections [Sharing etc intimate photographs or film without consent; Sharing etc intimate photographs or film with intent to cause alarm, distress or humiliation; Sharing etc intimate photographs or film without consent for the purpose of obtaining sexual gratification; Threatening to share etc intimate photographs or film] of the Online Safety Act 2023’”.”
This new clause inserts intimate image abuse into legislation that qualifies victims for special measures when testifying in court (such as partitions to hide them from view, video testifying etc.) which is already prescribed by law.
New clause 50—Anonymity for victims of offences involving the sharing of intimate images—
“(1) Section 2 of the Sexual Offences (Amendment) Act 1992 (Offences to which this Act applies) is amended as follows.
(2) In subsection 1 after paragraph (db) insert—
(dc) ‘an offence under sections [Sharing etc intimate photographs or film without consent; Sharing etc intimate photographs or film with intent to cause alarm, distress or humiliation; Sharing etc intimate photographs or film without consent for the purpose of obtaining sexual gratification; Threatening to share etc intimate photographs or film] of the Online Safety Act 2023’”.”
Similar to NC49, this new clause allows victims of intimate image abuse the same availability for anonymity as other sexual offences to protect their identities and give them the confidence to testify against their abuser without fear of repercussions.
New clause 54—Report on the effect of Virtual Private Networks on OFCOM’s ability to enforce requirements—
“(1) The Secretary of State must publish a report on the effect of the use of Virtual Private Networks on OFCOM’s ability to enforce requirements under section 112.
(2) The report must be laid before Parliament within six months of the passing of this Act.”
New clause 55—Offence of sending communication facilitating modern slavery and illegal immigration—
‘(1) A person (A) commits an offence if—
(a) (A) intentionally shares with a person (B) or with a third person (C) a photograph or film which is reasonably considered to be, or to be intended to be, facilitating or promoting any activities which do, or could reasonably be expected to, give rise to an offence under—
(i) sections 1 (Slavery, servitude and forced labour), 2 (Human trafficking) or 4 (Committing offence with intent to commit an offence under section 2) of the Modern Slavery Act 2015; or
(ii) sections 24 (Illegal Entry and Similar Offences) or 25 (Assisting unlawful immigration etc) of the Immigration Act 1971; and
(b) (A) does so knowing, or when they reasonably ought to have known, that the activities being depicted are unlawful.
(2) References to a third person (C) in this section are to be read as referring to—
(a) an individual;
(b) a group of individuals;
(c) a section of the public; or
(d) the public at large.
(3) A person (A) does not commit an offence under this section if—
(a) the sharing is undertaken by or on behalf of a journalist or for journalistic purposes;
(b) the sharing is by a refugee organisation registered in the UK and which falls within the scope of subsection (3) of section 25A of the Immigration Act 1971;
(c) the sharing is by or on behalf of a duly elected Member of Parliament or other elected representative in the UK.
(4) It is a defence for a person charged under this section to prove that they—
(a) reasonably believed that the sharing was necessary for the purposes of preventing, detecting, investigating or prosecuting crime and
(b) reasonably believed that the sharing was necessary for the purposes of legal or regulatory proceedings.
(5) A person who commits an offence under this section is liable on summary conviction, to imprisonment for a term not exceeding the maximum term for summary offences or a fine (or both).”
This new clause would create a new criminal offence of intentionally sharing a photograph or film that facilitates or promotes modern slavery or illegal immigration.
Government amendments 234 and 102 to 117.
Amendment 195, in clause 104, page 87, line 10, leave out subsection (1) and insert—
“(1) If OFCOM consider that it is necessary and proportionate to do so, they may—
(a) give a notice described in subsection (2), (3) or (4) relating to a regulated user to user service or a regulated search service to the provider of the service;
(b) give a notice described in subsection (2), (3) or (4) to a provider or providers of Part 3 services taking into account risk profiles produced by OFCOM under section 84.”
Amendment 152, page 87, line 18, leave out ‘whether’.
This amendment is consequential on Amendment 153.
Amendment 153, page 87, line 19, leave out ‘or privately’.
This amendment removes the ability to monitor encrypted communications.
Government amendment 118.
Amendment 204, in clause 105, page 89, line 17, at end insert—
“(ia) the level of risk of the use of the specified technology accessing, retaining or disclosing the identity or provenance of any confidential journalistic source or confidential journalistic material.”
This amendment would require Ofcom to consider the risk of the use of accredited technology by a Part 3 service accessing, retaining or disclosing the identity or provenance of journalistic sources or confidential journalistic material, when deciding whether to give a notice under Clause 104(1) of the Bill.
Government amendments 119 to 130, 132 to 134, 212, 213, 135 and 214.
Amendment 23, in clause 130, page 114, line 3, leave out paragraph (a).
Government amendment 175.
Amendment 160, in clause 141, page 121, line 9, leave out subsection (2).
This amendment removes the condition that must be met for super-complaints that relate to a single regulated service.
Amendment 24, page 121, line 16, leave out “The Secretary of State” and insert “OFCOM”.
Amendment 25, page 121, line 21, leave out from “(3),” to end of line 24 and insert “OFCOM must consult—
“(a) The Secretary of State, and
“(b) such other persons as OFCOM considers appropriate.”
This amendment would provide that regulations under clause 141 are to be made by OFCOM rather than by the Secretary of State.
Amendment 189, in clause 142, page 121, line 45, leave out from “including” to end of line 46 and insert
“90 day maximum time limits in relation to the determination and notification to the complainant of—”.
This requires the Secretary of State’s guidance to require Ofcom to determine whether a complaint is eligible for the super-complaints procedure within 90 days.
Amendment 26, in clause 146, page 123, line 33, leave out
“give OFCOM a direction requiring”
and insert “may make representations to”.
Amendment 27, page 123, line 36, leave out subsection (2) and insert—
“(2) OFCOM must have due regard to any representations made by the Secretary of State under subsection (1).”
Amendment 28, page 123, line 38, leave out from “committee” to end of line 39 and insert
“established under this section is to consist of the following members—”.
Amendment 29, page 124, line 1, leave out from “committee” to “publish” in line 2 and insert
“established under this section must”.
Amendment 30, page 124, line 4, leave out subsection (5).
Amendment 32, page 124, line 4, leave out clause 148.
Government amendments 176, 239, 138, 240, 215, 241, 242, 217, 218, 243, 219, 244, 245, 220, 221, 140, 246, 222 to 224, 247, 225, 248, 226 and 227.
Amendment 194, in clause 157, page 131, line 16, leave out from beginning to end of line 17 and insert—
“(a) B has not consented for A to send or give the photograph or film to B, and”.
Government amendments 249 to 252, 228, 229 and 235 to 237.
Government new schedule 2—Amendments of Part 4B of the Communications Act.
Government new schedule 3—Video-sharing platform services: transitional provision etc.
Government amendment 238.
Amendment 35, schedule 11, page 198, line 5, leave out “The Secretary of State” and insert “OFCOM”.
This amendment would give the power to make regulations under Schedule 11 to OFCOM.
Amendment 2, page 198, line 9, leave out “functionalities” and insert “characteristics”.
Amendment 1, page 198, line 9, at end insert—
“(1A) In this schedule, “characteristics” of a service include its functionalities, user base, business model, governance and other systems and processes.”
Amendment 159, page 198, line 9, at end insert—
“(1A) Regulations made under sub-paragraph (1) must provide for any regulated user-to-user service which OFCOM assesses as posing a very high risk of harm to be included within Category 1, regardless of the number of users.”
This amendment allows Ofcom to impose Category 1 duties on user-to-user services which pose a very high risk of harm.
Amendment 36, page 198, line 10, leave out “The Secretary of State” and insert “OFCOM”.
This amendment is consequential on Amendment 35.
Amendment 37, page 198, line 16, leave out “The Secretary of State” and insert “OFCOM”.
This amendment is consequential on Amendment 35.
Amendment 3, page 198, line 2, leave out “functionalities” and insert “characteristics”.
Amendment 9, page 198, line 28, leave out “and” and insert “or”.
Amendment 4, page 198, line 29, leave out “functionality” and insert “characteristic”.
Amendment 38, page 198, line 32, leave out “the Secretary of State” and insert “OFCOM”.
This amendment is consequential on Amendment 35.
Amendment 5, page 198, line 34, leave out “functionalities” and insert “characteristics”.
Amendment 39, page 198, line 37, leave out “the Secretary of State” and insert “OFCOM”.
This amendment is consequential on Amendment 35.
Amendment 40, page 198, line 41, leave out “the Secretary of State” and insert “OFCOM”.
This amendment is consequential on Amendment 35.
Amendment 6, page 198, line 4, leave out “functionalities” and insert “characteristics”.
Amendment 7, page 199, line 11, leave out “functionalities” and insert “characteristics”.
Amendment 8, page 199, line 28, leave out “functionalities” and insert “characteristics”.
Amendment 41, page 199, line 3, leave out subparagraphs (5) to (11).
This amendment is consequential on Amendment 35.
Government amendments 230, 253 to 261 and 233.
I was about to speak to the programme motion, Mr Speaker, but you have outlined exactly what I was going to say, so thank you for that—I am glad to get the process right.
I am delighted to bring the Online Safety Bill back to the House for the continuation of Report stage. I start by expressing my gratitude to colleagues across the House for their contributions to the Bill through pre-legislative scrutiny and before the summer recess, and for their engagement with me since I took office as the Minister for Tech and the Digital Economy.
The concept at the heart of this legislation is simple: tech companies, like those in every other sector, must take responsibility for the consequences of their business decisions. As they continue to offer users the latest innovations, they must consider the safety of their users as well as profit. They must treat their users fairly and ensure that the internet remains a place for free expression and robust debate. As Members will be aware, the majority of the Bill was discussed on Report before the summer recess. Our focus today is on the provisions that relate to the regulator’s power and the criminal law reforms. I will take this opportunity also to briefly set out the further changes that the Government recently committed to making later in the Bill’s passage.
Let me take the Government amendments in turn. The Government’s top priority for this legislation has always been the protection of children. We recognise that the particularly abhorrent and pernicious nature of online child sexual exploitation and abuse—CSEA—demands the most robust response possible. Throughout the passage of the Bill, we have heard evidence of the appalling harm that CSEA causes. Repeatedly, we heard calls for strong incentives for companies to do everything they can to innovate and make safety technologies their priority, to ensure that there is no place for offenders to hide online. The Bill already includes a specific power to tackle CSEA, which allows Ofcom, subject to safeguards, to require tech companies to use accredited technology to identify and remove illegal CSEA content in public and private communications. However, we have seen in recent years how the online world has evolved to allow offenders to reach their victims and one another in new ways.
I am listening to my hon. Friend with great interest on this aspect of child sexual abuse and exploitation, which is a heinous crime. Will he go on to speak about how the Ofcom role will interact with law enforcement, in particular the National Crime Agency, when dealing with these awful crimes?
It is important that we tackle this in a number of ways. My right hon. Friend the Member for Haltemprice and Howden (Mr Davis) and I spoke earlier, and I will come to some of what he will outline. It is important that Ofcom recognises the technologies that are available and—with the Children’s Commissioner as one of the statutory consultees—liaises with the social media platforms and the agencies to ensure that there are codes of practice that work, and that we get this absolutely right. It is about enforcing the companies’ terms and conditions and being able to produce the evidence and track the exchanges, as I will outline later, for the agency to use for enforcement.
With the rapid developments in technology, on occasions there will be no existing accredited technology available that will satisfactorily mitigate the risks. Similarly, tech companies might be able to better design solutions that integrate more easily with their services than those that are already accredited. The new regulatory framework must incentivise tech companies to ensure that their safety measures keep pace with the evolving threat, and that they design their services to be safe from the outset. It is for these reasons that the Government have tabled the amendments that we are discussing.
New clauses 11 and 12 establish options for Ofcom when deploying its powers under notices to deal with terrorism content and CSEA content. These notices will empower Ofcom to require companies to use accredited technology to identify and remove illegal terrorism and CSEA content or to prevent users from encountering that content or, crucially, to use their best endeavours to develop or to source technology to tackle CSEA. That strikes the right balance of supporting the adoption of new technology, while ensuring that it does not come at the expense of children’s physical safety.
Terrorism is often linked to non-violent extremism, which feeds into violent extremism and terrorism. How does the Bill define extremism? Previous Governments failed to define it, although it is often linked to terrorism.
This Bill links with other legislation, and obviously the agencies. We do not seek to redefine extremism where those definitions already exist. As we expand on the changes that we are making, we will first ensure that anything that is already illegal goes off the table. Anything that is against the terms and conditions of those platforms that are hosting that content must not be seen. I will come to the safety net and user protection later.
Since Elon Musk’s takeover of Twitter, hate speech has ballooned on the platform and the number of staff members at Twitter identifying images of child sexual abuse and exploitation has halved. How can the Minister be sure that the social media companies are able to mark their own homework in the way that he suggests?
Because if those companies do not, they will get a fine of up to £18 million or 10% of their global turnover, whichever is higher. As we are finding with Twitter, there is also a commercial impetus, because advertisers are fleeing that platform as they see the uncertainty being caused by those changes. A lot of things are moving here to ensure that safety is paramount; it is not just for the Government to act in this area. All we are doing is making sure that those companies enforce their own terms and conditions.
This point is important: we are speaking about terrorism and counter-terrorism and the state’s role in preventing terrorist activity. For clarity, will the Minister update the House later on the work that takes place between his Department and the platforms and, importantly, between the Home Office and the security services. In particular, some specialist work takes place with the Global Internet Forum to Counter Terrorism, which looks at online terrorist and extremist content. That work can ensure that crimes are prevented and that the right kinds of interventions take place.
My right hon. Friend talks with experience from her time at the Home Office. She is absolutely right that the Bill sets a framework to adhere to the terms and conditions of the platforms. It also sets out the ability for the services to look at things such as terrorism and CSEA, which I have been talking about—for example, through the evidence of photos being exchanged. The Bill is not re-examining and re-prosecuting the interaction between all the agencies, however, because that is apparent for all to see.
New clauses 11 and 12 bring those powers in line with the wider safety duties by making it clear that the tools may seek to proactively prevent CSEA content from appearing on a service, rather than focusing only on identification and removal after the fact. That will ensure the best possible protection for children, including on services that offer livestreaming.
The safeguards around those powers remain as strong as before to protect user privacy. Any tools that are developed will be accredited using a rigorous assessment process to ensure that they are highly accurate before the company is asked to use them. That will avoid any unnecessary intrusions into user privacy by minimising the risk that the tools identify false positives.
Crucially, the powers do not represent a ban on or seek to undermine any specific type of technology or design, such as end-to-end encryption. They align with the UK Government’s view that online privacy and cyber-security must be protected, but that technological changes should not be implemented in a way that diminishes public safety.
Can the Minister expand on the notion of “accredited technology”? The definition in the Bill is pretty scant as to where it will emerge from. Is he essentially saying that he is relying on the same industry that has thus far presided over the problem to produce the technology that will police it for us? Within that equation, which seems a little self-defeating, is it the case that if the technology does not emerge for one reason or another—commercial or otherwise—the Government will step in and devise, fund or otherwise create the technology required to be implemented?
I thank my right hon. Friend. It is the technology sector that develops technology—it is a simple, circular definition—not the Government. We are looking to make sure that it has that technology in place, but if we prescribed it in the Bill, it would undoubtedly be out of date within months, never mind years. That is why it is better for us to have a rounded approach, working with the technology sector, to ensure that it is robust enough.
I may not have been clear in my original intervention: my concern is that the legislation relies on the same sector that has thus far failed to regulate itself and failed to invent the technology that is required, even though it is probably perfectly capable of doing so, to produce the technology that we will then accredit to be used. My worry is that the sector, for one reason or another—the same reason that it has not moved with alacrity already to deal with these problems in the 15 years or so that it has existed—may not move at the speed that the Minister or the rest of us require to produce the technology for accreditation. What happens if it does not?
Clearly, the Government can choose to step in. We are setting up a framework to ensure that we get the right balance and are not being prescriptive. I take issue with the idea that a lot of this stuff has not been invented, because there is some pretty robust work on age assurance and verification, and other measures to identify harmful and illegal material, although my right hon. Friend is right that it is not being used as robustly as it could be. That is exactly what we are addressing in the Bill.
My intervention is on the same point as that raised by my right hon. Friend the Member for North West Hampshire (Kit Malthouse), but from the opposite direction, in effect. What if it turns out that, as many security specialists and British leaders in security believe—not just the companies, but professors of security at Cambridge and that sort of thing—it is not possible to implement such measures without weakening encryption? What will the Minister’s Bill do then?
The Bill is very specific with regard to encryption; this provision will cover solely CSEA and terrorism. It is important that we do not encroach on privacy.
I welcome my hon. Friend to his position. Under the Bill, is it not the case that if a company refuses to use existing technologies, that will be a failure of the regulatory duties placed on that company? Companies will be required to demonstrate which technology they will use and will have to use one that is available. On encrypted messaging, is it not the case that companies already gather large amounts of information about websites that people visit before and after they send a message that could be hugely valuable to law enforcement?
My hon. Friend is absolutely right. Not only is it incumbent on companies to use that technology should it exist; if they hamper Ofcom’s inquiries by not sharing information about what they are doing, what they find and which technologies they are not using, that will be a criminal liability under the Bill.
To take that one step further, is it correct that Ofcom would set minimum standards for operators? For example, the Content Authenticity Initiative does not need primary legislation, but is an industry open-standard, open-source format. That is an example of modern technology that all companies could sign up to use, and Ofcom would therefore determine what needs to be done in primary legislation.
Can I be helpful? We did say that our discussions should be within scope, but the Minister is tempting everybody to intervene out of scope. From his own point of view, I would have thought that it would be easier to keep within scope.
Thank you, Mr Speaker; I will just respond to my hon. Friend the Member for Bosworth (Dr Evans). There is a minimum standard in so far as the operators have to adhere to the terms of the Bill. Our aim is to exclude illegal content and ensure that children are as safe as possible within the remit of the Bill.
The changes will ensure a flexible approach so that companies can use their expertise to develop or source the most effective solution for their service, rather than us being prescriptive. That, in turn, supports the continued growth of our digital economy while keeping our citizens safe online.
My hon. Friend may know that there are third-party technology companies—developers of this accredited technology, as he calls it—that do not have access to all the data that might be necessary to develop technology to block the kind of content we are discussing. They need to be given the right to access that data from the larger platforms. Will Ofcom be able to instruct large platforms that have users’ data to make it available to third-party developers of technology that can help to block such content?
Ofcom will be working with the platforms over the next few months—in the lead-up to the commencement of the Bill and afterwards—to ensure that the provisions are operational, so that we get them up and running as soon as practicably possible. My right hon. Friend is right to raise the point.
In Northern Ireland we face the specific issue of the glorification of terrorism. Glorifying terrorism encourages terrorism. Is it possible that the Bill will stop that type of glorification, and therefore stop the terrorism that comes off the back of it?
I will try to cover the hon. Member’s comments a little bit later, if I may, when I talk about some of the changes coming up later in the process.
Moving away from CSEA, I am pleased to say that new clause 53 fulfils a commitment given by my predecessor in Committee to bring forward reforms to address epilepsy trolling. It creates the two specific offences of sending and showing flashing images to an individual with epilepsy with the intention of causing them harm. Those offences will apply in England, Wales and Northern Ireland, providing people with epilepsy with specific protection from this appalling abuse. I would like to place on record our thanks to the Epilepsy Society for working with the Ministry of Justice to develop the new clause.
The offence of sending flashing images captures situations in which an individual sends a communication in a scatter-gun manner—for example, by sharing a flashing image on social media—and the more targeted sending of flashing images to a person who the sender knows or suspects is a person with epilepsy. It can be committed by a person who forwards or shares such an electronic communication as well as by the person sending it. The separate offence of showing flashing images will apply if a person shows flashing images to someone they know or suspect to have epilepsy by means of an electronic communications device—for example, on a mobile phone or a TV screen.
The Government have listened to parliamentarians and stakeholders about the impact and consequences of this reprehensible behaviour, and my thanks go to my hon. Friends the Members for Watford (Dean Russell), for Stourbridge (Suzanne Webb), for Blackpool North and Cleveleys (Paul Maynard) and for Ipswich (Tom Hunt) for their work and campaigning. [Interruption.] Indeed, and the hon. Member for Batley and Spen (Kim Leadbeater), who I am sure will be speaking on this later.
New clause 53 creates offences that are legally robust and enforceable so that those seeking to cause harm to people with epilepsy will face appropriate criminal sanctions. I hope that will reassure the House that the deeply pernicious activity of epilepsy trolling will be punishable by law.
The Minister is thanking lots of hon. Members, but should not the biggest thanks go, first, to the Government for the inclusion of this amendment; and secondly, to Zach Eagling, the inspirational now 11-year-old who was the victim of a series of trolling incidents when flashing images were pushed his way after a charity walk? We have a huge amount to thank Zach Eagling for, and of course the amazing Epilepsy Society too.
A number of Members across the House have been pushing for Zach’s law, and I am really delighted that Zach’s family can see in Hansard that that campaigning has really made a direct change to the law.
I just want to echo the previous points. This has been a hard-fought decision, and I am so proud that the Government have done this, but may I echo the thanks to Zach for being a true hero? We talk about David and Goliath, the giant—the beast—who was taken down, but Zach has beaten the tech giants, and I think this is an incredible success.
I absolutely echo my hon. Friend’s remarks, and I again thank him for his work.
We are also taking steps to strengthen Ofcom’s enforcement powers, which is why we are giving Ofcom a discretionary power to require non-compliant services to publish or notify their users of enforcement action that it has taken against the service. Ofcom will be able to use this power to direct a service to publish details or notify its UK users about enforcement notices it receives from Ofcom. I thank the Antisemitism Policy Trust for bringing this proposal to our attention and for its helpful engagement on the issue. This new power will promote transparency by increasing awareness among users about breaches of the duty in the Bill. It will help users make much more informed decisions about the services they use, and act as an additional deterrent factor for service providers.
It is fantastic to have the data released. Does the Minister have any idea how many of these notifications are likely to be put out there when the Bill comes in? Has any work been done on that? Clearly, having thousands of these come out would be very difficult for the public to understand, but half a dozen over a year might be very useful to understand which companies are struggling.
I think this is why Ofcom has discretion, so that it can determine that. The most egregious examples are the ones people can learn from, and it is about doing this in proportion. My hon. Friend is absolutely right that if we are swamped with small notifications, this will be hidden in plain sight. That would not be useful, particularly for parents, to best understand what is going on. It is all about making more informed decisions.
The House will be aware that we recently announced our intention to make a number of other changes to the Bill. We are making those changes because we believe it is vital that people can continue to express themselves freely and engage in pluralistic debate online. That is why the Bill will be amended to strengthen its provisions relating to children and to ensure that the Bill’s protections for adults strike the right balance with its protections for free speech.
The Minister is alluding, I assume, to the legal but harmful provision, but what does he think about this as an example? People are clever; they do not use illegal language. They will not say, “I want to kill all Jews”, but they may well—and do—say, “I want to harm all globalists.” What is the Minister’s view of that?
The right hon. Lady and I have had a detailed chat about some of the abuse that she and many others have been suffering, and there were some particularly egregious examples. This Bill is not, and never will be, a silver bullet. This has to be worked through, with the Government acting with media platforms and social media platforms, and parents also have a role. This will evolve, but we first need to get back to the fundamental point that social media platforms are not geared up to enforce their own terms and conditions. That is ridiculous, a quarter of a century after the world wide web kicked in, and when social media platforms have been around for the best part of 20 years. We are shutting the stable door afterwards, and trying to come up with legislation two decades later.
Order. I am really bothered. I am trying to help the Minister, because although broadening discussion of the Bill is helpful, it is also allowing Members to come in with remarks that are out of scope. If we are going to go out of scope, we could be here a long time. I am trying to support the Minister by keeping him in scope.
Thank you, Mr Speaker; I will try to keep my remarks very much in scope.
The harmful communications offence in clause 151 was a reform to communication offences proposed in the Bill. Since the Bill has been made public, parliamentarians and stakeholders have expressed concern that the threshold that would trigger prosecution for the offence of causing serious distress could bring robust but legitimate conversation into the illegal space. In the light of that concern, we have decided not to take forward the harmful communications offence for now. That will give the Government an opportunity to consider further how the criminal law can best protect individuals from harmful communications, and ensure that protections for free speech are robust.
This is about the protection of young people, and we are all here for the same reason, including the Minister. We welcome the changes that he is putting forward, but the Royal College of Psychiatrists has expressed a real concern about the mental health of children, and particularly about how screen time affects them. NHS Digital has referred to one in eight 11 to 16-year-olds being bullied. I am not sure whether we see in the Bill an opportunity to protect them, so perhaps the Minister can tell me the right way to do that.
The hon. Gentleman talks about the wider use of screens and screen time, and that is why Ofcom’s media literacy programme, and DCMS’s media literacy strategy—
That is because we have a detailed strategy that tackles many of these issues. Again, none of this is perfect, and as I have said, the Government are working in tandem with the platforms, and with parents and education bodies, to make sure we get that bit right. The hon. Gentleman is right to highlight that as a big issue.
I talked about harmful communications and the risk that removing that offence could leave a potential gap in the criminal law. The Government have therefore decided not to repeal the existing communications offences in the Malicious Communications Act 1988, or those under section 127(1) of the Communications Act 2003. That will ensure that victims of domestic abuse or other extremely harmful communications will still be robustly protected by the criminal law. Alongside the planned changes to the harmful communications offence, we are making a number of additional changes to the Bill—those will come later, Mr Speaker, and I will not tread too much into them, as they include the removal of the adult safety duties, often referred to as the legal but harmful provision. The amended Bill offers adults a triple shield of protection that requires platforms to remove illegal content and material that violates their terms and conditions, and gives adults user controls to help them avoid seeing certain types of content.
The Bill’s key objective, above everything else, is the safety of children online, and we will be making a number of changes to strengthen the Bill’s existing protections for children. We will make clear that we expect platforms to use age assurance technology when identifying the age of their users, and we will also require platforms with minimum age restrictions to explain in their terms of service what measures they have in place to prevent access by those below their minimum age, and to enforce those measures consistently. We are planning to name the Children’s Commissioner as a statutory consultee for Ofcom in its development of the codes of practice, ensuring that children’s views and needs are represented.
That is the Children’s Commissioner for England, specifically because they have particular reserved duties for the whole of the UK. None the less, Ofcom must also have regard to a wider range of voices, which can easily include the other Children’s Commissioners.
On age assurance, does the Minister not see a weakness? Lots of children and young people are far more sophisticated than many of us in the Chamber and will easily find a workaround, as they do now. The onus is being put on the children, so the Bill is not increasing regulation or the safety of those children.
As I said, the social media platforms will have to put in place robust age assurance and age verification for such material, in an accredited form that is acceptable to Ofcom, which will look at that.
Tackling violence against women and girls is a key priority for the Government. It is unacceptable that women and girls suffer disproportionately from abuse online, and it is right that we go further to address that through the Bill. That is why we will name the commissioner for victims and witnesses and the Domestic Abuse Commissioner as statutory consultees for the code of practice and list “coercive or controlling behaviour” as a priority offence. That offence disproportionately affects women and girls, and that measure will mean that companies will have to take proactive measures to tackle such content.
Finally, we are making a number of criminal law reforms, and I thank the Law Commission for the great deal of important work that it has done to assess the law in these areas.
I strongly welcome some of the ways in which the Bill has been strengthened to protect women and girls, particularly by criminalising cyber-flashing, for example. Does the Minister agree that it is vital that our laws keep pace with the changes in how technology is being used? Will he therefore assure me that the Government will look to introduce measures along the lines set out in new clauses 45 to 50, standing in the name of my right hon. Friend the Member for Basingstoke (Dame Maria Miller), who is leading fantastic work in this area, so that we can build on the Government’s record in outlawing revenge porn and threats to share it?
I thank my hon. Friend, and indeed I thank my right hon. Friend the Member for Basingstoke (Dame Maria Miller) for the amazing work that she has done in this area. We will table an amendment to the Bill to criminalise more behaviour relating to intimate image abuse, so more perpetrators will face prosecution and potentially time in jail. My hon. Friend has worked tirelessly in this area, and we have had a number of conversations. I thank her for that. I look forward to more conversations to ensure that we get the amendment absolutely right and that it does exactly what we all want.
The changes we are making will include criminalising the non-consensual sharing of manufactured intimate images, which, as we have heard, are more commonly known as deepfakes. In the longer term, the Government will also take forward several of the Law Commission’s recommendations to ensure that the legislation is coherent and takes account of advancements in technology.
We will also use the Bill to bring forward a further communication offence to make the encouragement of self-harm illegal. We have listened to parliamentarians and stakeholders concerned about such behaviour and will use the Bill to criminalise that activity, providing users with protections from that harmful content. I commend my right hon. Friend the Member for Haltemprice and Howden on his work in this area and his advocacy for such a change.
Intimate image abuse has been raised with me a number of times by younger constituents, who are particularly vulnerable to such abuse. Within the scope of what we are discussing, I am concerned that we have seen only one successful conviction for revenge porn, so if the Government base their intimate image work on the existing legislative framework for revenge porn, it will do nothing and protect no one, and will instead be a waste of everyone’s time and further let down victims who are already let down by the system.
We will actually base that work on the independent Law Commission’s recommendations, and have been working with it on that basis.
On images that promote self-harm, does the Minister agree that images that promote or glamourise eating disorders should be treated just as seriously as any other content promoting self-harm?
I thank my right hon. Friend, who spoke incredibly powerfully at Digital, Culture, Media and Sport questions, and on a number of other occasions, about her particular experience. That is always incredibly difficult. Absolutely, that area will be tackled, especially for children, but it is really important—as we will see from further changes to the Bill—that, with the removal of the legal but harmful provisions, there are other protections for adults.
I think last year over 6,000 people died from suicide in the UK. Much of that, sadly, was encouraged by online content, as we saw from the recent coroner’s report into the tragic death of Molly Russell. On new clause 16, tabled by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis), will the Minister confirm that the Government agree with the objectives of new clause 16 and will table an amendment to this Bill—to no other parliamentary vehicle, but specifically to this Bill—to introduce such a criminal offence? Will the Government amendment he referred to be published before year end?
On self-harm, I do not think there is any doubt that we are absolutely aligned. On suicide, I have some concerns about how new clause 16 is drafted—it amends the Suicide Act 1961, which is not the right place to introduce measures on self-harm—but I will work to ensure we get this measure absolutely right as the Bill goes through the other place.
I thank my hon. Friend for giving way. He is almost being given stereo questions from across the House, but I think they might be slightly different. I am very grateful to him for setting out his commitment to tackling suicide and self-harm content, and for his commitment to my right hon. Friend the Member for Chelmsford (Vicky Ford) on eating disorder content. My concern is that there is a really opaque place in the online world between what is legal and illegal, which potentially could have been tackled by the legal but harmful restrictions. Can he set out a little more clearly—not necessarily now, but as we move forward—how we really are going to begin to tackle the opaque world between legal and illegal content?
If my hon. Friend will bear with me—I need to make some progress—I think that will be teased out today and in Committee, should the Bill be recommitted, as we amend the clauses relating directly to what she is talking about, and then as the Bill goes through the other place.
I am grateful to the Minister, who has taken a number of interventions. I fully agree with my hon. Friend the Member for Gosport (Dame Caroline Dinenage). This is a grey area and has consistently been so—many Members have given their views on that in previous stages of the Bill. Will the Minister come back in the later stages on tackling violence against women and girls, and show how the Bill will incorporate key aspects of the Domestic Abuse Act 2021, and tie up with the criminal justice system and the work of the forthcoming victims Bill? We cannot look at these issues in isolation—I see that the Minister of State, Ministry of Justice, my right hon. Friend the Member for Charnwood (Edward Argar) is also on the Front Bench. Rather, they all have to be put together in a golden thread of protecting victims, making sure that people do not become victims, and ensuring that we go after the perpetrators—we must not forget that at all. The Minister will not be able to answer that now, but I would ask him to please do so in the latter stages.
I talked about the fact that the Commissioner for Victims and Witnesses and the Domestic Abuse Commissioner will be statutory consultees, because it is really important that their voices are heard in the implementation of the Bill. We are also bringing in coercive control as one of the priority offences. That is so important when it comes to domestic abuse. Domestic abuse does not start with a slap, a hit, a punch; it starts with emotional abuse—manipulation, coercion and so on. That is why coercive control is an important point not just for domestic abuse, but for bullying, harassment and the wider concerns that the Bill seeks to tackle.
I am one of three Scottish Members present, and the Scottish context concerns me. If time permits me in my contribution later, I will touch on a particularly harrowing case. The school involved has been approached but has done nothing. Education is devolved, so the Minister may want to think about that. It would be too bad if the Bill failed in its good intentions because of a lack of communication in relation to a function delivered by the Scottish Government. Can I take it that there will be the closest possible co-operation with the Scottish Government because of their educational responsibilities?
There simply has to be. These are global companies and we want to make the Bill work for the whole of the UK. This is not an England-only Bill, so the changes must happen for every user, whether they are in Scotland, Northern Ireland, Wales or England.
Will the Minister give way?
I will make a bit of progress, because I am testing Mr Speaker’s patience.
We are making a number of technical amendments to ensure that the new communications offences are targeted and effective. New clause 52 seeks to narrow the exemptions for broadcast and wireless telegraphy licence holders and providers of on-demand programme services, so that the licence holder is exempt only to the extent that the communication is within the course of a licensed activity. A separate group of technical amendments ensures that the definition of sending false and threatening communications will capture all circumstances—that is far wider than what we have at the moment.
We propose a number of consequential amendments to relevant existing legislation to ensure that new offences operate consistently with the existing criminal law. We are also making a number of wider technical changes to strengthen the enforcement provisions and ensure consistency with other regulatory frameworks. New clause 42 ensures that Ofcom has the power to issue an enforcement notice to a former service provider, guarding against service providers simply shutting down their business and reappearing in a slightly different guise to avoid regulatory sanction. A package of Government amendments will set out how the existing video-sharing platform regime will be repealed and the transitional provisions that will apply to those providers as they transition to the online safety framework.
Finally, new clause 40 will enable the CMA to share information with Ofcom for the purpose of facilitating Ofcom’s online safety functions. That will help to ensure effective co-operation between Ofcom and the CMA.
I thank my hon. Friend for giving way. In the past 40 minutes or so, he has demonstrated the complexity of the changes that are being proposed for the Bill, and he has done a very good job in setting that out. However, will he join me and many other right hon. and hon. Members who feel strongly that a Standing Committee should look at the Bill’s implementation, because of the complexities that he has so clearly demonstrated? I know that is a matter for the House rather than our consideration of the Bill, but I hope that other right hon. and hon. Members will join me in looking for ways to put that right. We need to be able to scrutinise the measures on an ongoing basis.
Indeed, there will be, and are, review points in the Bill. I have no doubt that my right hon. Friend will raise that on other occasions as well.
I want to ensure that there is plenty of time for Members to debate the Bill at this important stage, and I have spoken for long enough. I appreciate the constructive and collaborative approach that colleagues have taken throughout the Bill’s passage.
I am grateful to the Minister. Does he support Baroness Kidron’s amendment asking for swift, humane access to data where there is a suspicion that online information may have contributed to a child’s suicide? That has not happened in previous instances; does he support that important amendment?
I am glad that I gave way so that the hon. Lady could raise that point. Baroness Kidron and her organisation have raised that issue with me directly, and they have gathered media support. We will look at that as the Bill goes through this place and the Lords, because we need to see what the powers are at the moment and why they are not working.
Now is the time to take this legislation forward to ensure that it can deliver the safe and transparent online environment that children and adults so clearly deserve.
Exactly that. My hon. Friend is absolutely right. I come back to the point about drafting this legislation, which is not straightforward and easy because of the definitions. It is not just about what is in scope of the Bill but about the implications of the definitions and how they could be applied in law.
The Minister touched on the criminal side of things; interpretation in the criminal courts and how that would be applied in case law are the points that need to be fleshed out. This is where our work on counter-terrorism is so important, because across the world, with our Five Eyes partners, we have been consistent. Again, there are good models out there that can be built upon. We will not fix all this through one Bill—we know that. This Bill is foundational, which is why we must move forward.
On new clause 11, I seek clarity—in this respect, I need reassurance not from the Minister but from other parts of government—on how victims and survivors, whether of terrorist activity, domestic abuse or violence against women and girls, will be supported and protected by the new safeguards in the Bill, and by the work of the Victims’ Commissioner.
I thank my right hon. Friend for sharing her remarks with the House. She is making an excellent speech based on her considerable experience. On the specific issue of child sexual abuse and exploitation, many organisations, such as the Internet Watch Foundation, are instrumental in removing reports and web pages containing that vile and disgusting material. In the April 2020 White Paper, the Government committed to look at how the Internet Watch Foundation could use its technical expertise in that field. Does she agree that it would be good to hear from the Minister about how the Internet Watch Foundation could work with Ofcom to assist victims?
I was not aware of that, but I am now. I thank my hon. Friend for that information. This is a crucial point. We need the accountability of the named director associated with the company, the platform and the product in order to introduce the necessary accountability. I do not know whether the Minister will accept this new clause today, but I very much hope that we will look further at how we can make this possible, perhaps in another place.
I very much support the Bill. We need to get it on the statute book, although it will probably need further work, and I support the Government amendments. However, given the link between children viewing pornography and child sexual abuse, I hope that when the Bill goes through the other place, their lordships will consider how regulations around pornographic content can be strengthened, in order to drastically reduce the number of children viewing porn and eventually being drawn into criminal activities themselves. In particular, I would like their lordships to look at tightening and accelerating the age verification and giving equal treatment to all pornography, whether it is on a porn site or a user-to-user service and whether it is online or offline. Porn is harmful to children in whatever form it comes, so the liability on directors and the criminality must be exactly the same. I support the Bill and the amendments in the Government’s name, but it needs to go further when it goes to the other place.
I thank Members for their contributions during today’s debate and for their ongoing engagement with such a crucial piece of legislation. I will try to respond to as many of the issues raised as possible.
My right hon. Friend the Member for Haltemprice and Howden (Mr Davis), who is not in his place, proposed making the promotion of self-harm a criminal offence. The Government are sympathetic to the intention behind that proposal; indeed, we asked the Law Commission to consider how the criminal law might address it, and have agreed in principle to create a new offence of encouraging or assisting serious self-harm. The form of the offence recommended by the Law Commission is based on the broadly comparable offence of encouraging or assisting suicide. Like that offence, it covers the encouragement of, or assistance in, self-harm by means of communication and in other ways. When a similar amendment was tabled by the hon. Members for Ochil and South Perthshire (John Nicolson) and for Aberdeen North (Kirsty Blackman) in Committee, limiting the offence to encouragement or assistance by means of sending a message, the then Minister, my right hon. Friend the Member for Croydon South, said it would give only partial effect to the Law Commission’s recommendation. It remains the Government’s intention to give full effect to the Law Commission’s recommendations in due course.
I have raised this on a number of occasions in the past few hours, as have my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) and the right hon. Member for Barking (Dame Margaret Hodge). Will the Minister be good enough to ensure that this matter is thoroughly looked at and, furthermore, that the needed clarification is thought through?
I was going to come to my hon. Friend in two seconds.
In the absence of clearly defined offences, the changes we are making to the Bill mean that it is likely to be almost impossible to take enforcement action against individuals. We are confident that Ofcom will have all the tools necessary to drive the necessary culture change in the sector, from the boardroom down.
This is not the last stage of the Bill. It will be considered in Committee—assuming it is recommitted today—and will come back on Report and Third Reading before going to the House of Lords, so there is plenty of time further to discuss this and to give my hon. Friend the clarification he needs.
Is the Minister saying he is open to changing his view on why he is minded to reject new clause 17 tonight?
I do not think I am changing my view. I am saying that this is not the last stage of the Bill, so there will be plenty of opportunity further to test this, should Members want to do so.
On new clause 28, the Government recognise and agree with the intent behind this amendment to ensure that the interests of child users of regulated services are represented. Protecting children online is the top priority in this Bill, and its key measures will ensure that children are protected from harmful content. The Bill appoints a regulator with comprehensive powers to force tech companies to keep children safe online, and the Bill’s provisions will ensure that Ofcom will listen and respond to the needs of children when identifying priority areas for regulatory action, setting out guidance for companies, taking enforcement action and responding to super-complaints.
Right from the outset, Ofcom must ensure that its risk assessment and priorities reflect the needs of children. For example, Ofcom is required to undertake research that will help understand emerging risks to child safety. We have heard a lot today about the emerging risks with changing technology, and it is important that we keep on top of those and have that children’s voice at the heart of this. The Bill also expands the scope of the Communications Consumer Panel to online safety matters. That independent panel of experts ensures that user needs are at the heart of Ofcom’s regulatory approach. Ofcom will also have the flexibility to choose other mechanisms to better understand user experiences and emerging threats. For example, it may set up user panels or focus groups.
Importantly, Ofcom will have to engage with expert bodies representing children when developing codes of practice and other regulatory guidance. For example, Ofcom will be required to consult persons who represent the interests of children when developing its codes of practice. That means that Ofcom’s codes will be fully informed by how children behave online, how they experience harm and what impact the proposed measures will have on their online experience. The super-complaints process will further enable independent bodies advocating for children to have their voices heard, and will help Ofcom to recognise and eliminate systemic failures.
As we have heard, the Government also plan to name the Children’s Commissioner for England as a statutory consultee for Ofcom when it develops its code of practice. That amendment will be tabled in the House of Lords. Through this consultation, the commissioner will be able to flag systemic issues or issues of particular importance to the regulator, helping Ofcom to target investigations and, if necessary, sanctions at matters that most affect children’s online experience.
As such, there are ample opportunities in the framework for children’s voices to be heard, and the Government are not convinced of the need to legislate for another child user advocacy body. There are plenty of bodies out there that Ofcom will already be reaching out to and there is an abundance of experience in committed representative groups that are already engaged and will be engaged with the online safety framework. They include the existing statutory body responsible for promoting the interests of children, the Children’s Commissioner. Adding an additional statutory body would duplicate existing provision, creating a confusing landscape, and that would not be in the best interests of children.
I hear what the Minister is saying about creating a statutory body, but will he assure this House that there is a specific vehicle for children’s voices to be heard in this? I ask because most of us here are not facing the daily traumas and constant recreation of different apps and social media ways to reach out to children that our children are. So unless we have their voice heard, this Bill is not going to be robust enough.
As I say, we are putting the Children’s Commissioner as a statutory consultee in the Bill. Ofcom will also have to have regard to all these other organisations, such as the 5Rights Foundation and the NSPCC, that are already there. It is in the legislation that Ofcom will have to have regard to those advocates, but we are not specifically suggesting that there should be a separate body duplicating that work. These organisations are already out there and Ofcom will have to reach out to them when coming up with its codes of practice.
We also heard from my hon. Friend the Member for Dover (Mrs Elphicke) about new clause 55. She spoke powerfully and I commend her for all the work she is doing to tackle the small boats problem, which is affecting so many people up and down this country. I will continue to work closely with her as the Bill continues its passage, ahead of its consideration in the Lords, to ensure that this legislation delivers the desired impact on the important issues of illegal immigration and modern slavery. The legislation will give our law enforcement agencies and the social media companies the powers and guidance they need to stop the promotion of organised criminal activity on social media. Clearly, we have to act.
My right hon. Friend the Member for Witham (Priti Patel), who brings to bear her experience as a former Home Secretary, spoke eloquently about the need for joined-up government, to make sure that the various pieces of legislation and all Departments are working together in this space. This is a really good example of where that joining up has to happen.
Will the Minister confirm that, in line with the discussions that have been had, the Government will look to bring back amendments, should they be needed, in line with new clause 55 and perhaps schedule 7, as the Bill goes to the Lords or returns for further consideration in this House?
All that I can confirm is that we will work with my hon. Friend and with colleagues in the Home Office to make sure that this legislation works in the way that she intends.
We share with my right hon. Friend the Member for Basingstoke (Dame Maria Miller) the concern about the abuse of deepfake images and the need to tackle the sharing of intimate images where the intent is wider than that covered by current offences. We have committed to bring forward Government amendments in the Lords to do just that, and I look forward to working with her to ensure that, again, we get that part of the legislation exactly right.
We also recognise the intent behind my right hon. Friend’s amendment to provide funding for victim support groups via the penalties paid by entities for failing to comply with the regulatory requirements. Victim and survivor support organisations play a critical role in providing support and tools to help people rebuild their lives. That is why the Government continue to make record investments in this area, increasing the funding for victim and witness support services to £192 million a year by 2024-25. We want to allow the victim support service to provide consistency for victims requiring support.
I thank my hon. Friend for giving way and for his commitment to look at this matter before the Bill reaches the House of Lords. Can he just clarify to me that it is his intention to implement the Law Commission’s recommendations that are within the scope of the Bill prior to the Bill reaching the House of Lords? If that is the case, I am happy to withdraw my amendments.
I cannot confirm today at what stage we will legislate. We will continue to work with my right hon. Friend and the Treasury to ensure that we get this exactly right. We will, of course, give due consideration to the Law Commission’s recommendations.
Unless I am mistaken, no other stages of the Bill will come before the House where this can be discussed. Either it will be done or it will not. I had hoped that the Minister would answer in the affirmative.
I understand. We are ahead of the Lords on publication, so yes is the answer.
I have two very quick points for my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright). He was right to speak about acting with humility. We will bring forward amendments for recommittal to amend the approach to category 1 designation—not just for the smaller companies he was talking about, but for companies that are pushing at the threshold to reach category 1. I very much take his point that the process could be delayed unduly, and we want to make sure that we do not get the unintended consequences he describes. I look forward to working with him to get the changes to the Bill to work exactly as he intends.
Finally, let me go back to the point that my right hon. Friend the Member for Haltemprice and Howden made about encrypted communications. We are not talking about banning end-to-end encryption or about breaking encryption—for the reasons set out about open banking and other areas. The amendment would leave Ofcom powerless to protect thousands of children and could leave unregulated spaces online for offenders to act, and we cannot therefore accept that.
Just briefly, because I know that the Minister is about to finish, can he respond on amendment 204 with regard to the protection of journalists?
I am happy to continue talking to the right hon. Gentleman, but I believe that we have enough protections in the Bill, with the human touch that we have added after the automatic flagging up of inquiries. The NCA will also have to have due regard to protecting sources. I will continue to work with him on that.
I have not covered everybody’s points, but this has been a very productive debate. I thank everyone for their contributions. We are really keen to get the Bill on the books and to act quickly to ensure that we can make children as safe as possible online.
Question put and agreed to.
New clause 11 accordingly read a Second time, and added to the Bill.
New Clause 12
Warning notices
‘(1) OFCOM may give a notice under section (Notices to deal with terrorism content or CSEA content (or both))(1) to a provider relating to a service or part of a service only after giving a warning notice to the provider that they intend to give such a notice relating to that service or that part of it.
(2) A warning notice under subsection (1) relating to the use of accredited technology (see section (Notices to deal with terrorism content or CSEA content (or both))(2)(a) and (3)(a)) must—
(a) contain details of the technology that OFCOM are considering requiring the provider to use,
(b) specify whether the technology is to be required in relation to terrorism content or CSEA content (or both),
(c) specify any other requirements that OFCOM are considering imposing (see section 106(2) to (4)),
(d) specify the period for which OFCOM are considering imposing the requirements (see section 106(6)),
(e) state that the provider may make representations to OFCOM (with any supporting evidence), and
(f) specify the period within which representations may be made.
(3) A warning notice under subsection (1) relating to the development or sourcing of technology (see section (Notices to deal with terrorism content or CSEA content (or both))(2)(b) and (3)(b)) must—
(a) describe the proposed purpose for which the technology must be developed or sourced (see section (Notices to deal with terrorism content or CSEA content (or both))(2)(a)(iii) and (iv) and (3)(a)(ii)),
(b) specify steps that OFCOM consider the provider needs to take in order to comply with the requirement described in section (Notices to deal with terrorism content or CSEA content (or both))(2)(b) or (3)(b), or both those requirements (as the case may be),
(c) specify the proposed period within which the provider must take each of those steps,
(d) specify any other requirements that OFCOM are considering imposing,
(e) state that the provider may make representations to OFCOM (with any supporting evidence), and
(f) specify the period within which representations may be made.
(4) A notice under section (Notices to deal with terrorism content or CSEA content (or both))(1) that relates to both the user-to-user part of a combined service and the search engine of the service (as described in section (Notices to deal with terrorism content or CSEA content (or both))(4)(c) or (d)) may be given to the provider of the service only if—
(a) two separate warning notices have been given to the provider (one relating to the user-to-user part of the service and the other relating to the search engine), or
(b) a single warning notice relating to both the user-to-user part of the service and the search engine has been given to the provider.
(5) A notice under section (Notices to deal with terrorism content or CSEA content (or both))(1) may not be given to a provider until the period allowed by the warning notice for the provider to make representations has expired.’—(Paul Scully.)
This clause, which would follow NC11, also replaces part of existing clause 104. There are additions to the warning notice procedure to take account of the new options for notices under NC11.
Brought up, read the First and Second time, and added to the Bill.
New Clause 20
OFCOM’s reports about news publisher content and journalistic content
‘(1) OFCOM must produce and publish a report assessing the impact of the regulatory framework provided for in this Act on the availability and treatment of news publisher content and journalistic content on Category 1 services (and in this section, references to a report are to a report described in this subsection).
(2) Unless the Secretary of State requires the production of a further report (see subsection (6)), the requirement in subsection (1) is met by producing and publishing one report within the period of two years beginning with the day on which sections (Duties to protect news publisher content) and 16 come into force (or if those sections come into force on different days, the period of two years beginning with the later of those days).
(3) A report must, in particular, consider how effective the duties to protect such content set out in sections (Duties to protect news publisher content) and 16 are at protecting it.
(4) In preparing a report, OFCOM must consult—
(a) persons who represent recognised news publishers,
(b) persons who appear to OFCOM to represent creators of journalistic content,
(c) persons who appear to OFCOM to represent providers of Category 1 services, and
(d) such other persons as OFCOM consider appropriate.
(5) OFCOM must send a copy of a report to the Secretary of State, and the Secretary of State must lay it before Parliament.
(6) The Secretary of State may require OFCOM to produce and publish a further report if the Secretary of State considers that the regulatory framework provided for in this Act is, or may be, having a detrimental effect on the availability and treatment of news publisher content or journalistic content on Category 1 services.
(7) But such a requirement may not be imposed—
(a) within the period of three years beginning with the date on which the first report is published, or
(b) more frequently than once every three years.
(8) For further provision about reports under this section, see section 138.
(9) In this section—
“journalistic content” has the meaning given by section 16;
“news publisher content” has the meaning given by section 49;
“recognised news publisher” has the meaning given by section 50.
(10) For the meaning of “Category 1 service”, see section 82 (register of categories of services).’—(Paul Scully.)
This inserts a new clause (after clause 135) which requires Ofcom to publish a report on the impact of the regulatory framework provided for in the Bill within two years of the relevant provisions coming into force. It also allows the Secretary of State to require Ofcom to produce further reports.
Brought up, read the First and Second time, and added to the Bill.
New Clause 40
Amendment of Enterprise Act 2002
‘In Schedule 15 to the Enterprise Act 2002 (enactments relevant to provisions about disclosure of information), at the appropriate place insert—
“Online Safety Act 2022.”’—(Paul Scully.)
This amendment has the effect that the information gateway in section 241 of the Enterprise Act 2002 allows disclosure of certain kinds of information by a public authority (such as the Competition and Markets Authority) to OFCOM for the purposes of OFCOM’s functions under this Bill.
Brought up, read the First and Second time, and added to the Bill.
New Clause 42
Former providers of regulated services
‘(1) A power conferred by Chapter 6 of Part 7 (enforcement powers) to give a notice to a provider of a regulated service is to be read as including power to give a notice to a person who was, at the relevant time, a provider of such a service but who has ceased to be a provider of such a service (and that Chapter and Schedules 13 and 15 are to be read accordingly).
(2) “The relevant time” means—
(a) the time of the failure to which the notice relates, or
(b) in the case of a notice which relates to the requirement in section 90(1) to co-operate with an investigation, the time of the failure or possible failure to which the investigation relates.’—(Paul Scully.)
This new clause, which is intended to be inserted after clause 162, provides that a notice that may be given under Chapter 6 of Part 7 to a provider of a regulated service may also be given to a former provider of a regulated service.
Brought up, read the First and Second time, and added to the Bill.
New Clause 43
Amendments of Part 4B of the Communications Act
‘Schedule (Amendments of Part 4B of the Communications Act) contains amendments of Part 4B of the Communications Act.’—(Paul Scully.)
This new clause introduces a new Schedule amending Part 4B of the Communications Act 2003 (see NS2).
Brought up, read the First and Second time, and added to the Bill.
New Clause 44
Repeal of Part 4B of the Communications Act: transitional provision etc
‘(1) Schedule (Video-sharing platform services: transitional provision etc) contains transitional, transitory and saving provision—
(a) about the application of this Act and Part 4B of the Communications Act during a period before the repeal of Part 4B of the Communications Act (or, in the case of Part 3 of Schedule (Video-sharing platform services: transitional provision etc), in respect of charging years as mentioned in that Part);
(b) in connection with the repeal of Part 4B of the Communications Act.
(2) The Secretary of State may by regulations make transitional, transitory or saving provision of the kind mentioned in subsection (1)(a) and (b).
(3) Regulations under subsection (2) may amend or repeal—
(a) Part 2A of Schedule 3;
(b) Schedule (Video-sharing platform services: transitional provision etc).
(4) Regulations under subsection (2) may, in particular, make provision about—
(a) the application of Schedule (Video-sharing platform services: transitional provision etc) in relation to a service if the transitional period in relation to that service ends on a date before the date when section 172 comes into force;
(b) the application of Part 3 of Schedule (Video-sharing platform services: transitional provision etc), including further provision about the calculation of a provider’s non-Part 4B qualifying worldwide revenue for the purposes of paragraph 19 of that Schedule;
(c) the application of Schedule 10 (recovery of OFCOM’s initial costs), and in particular how fees chargeable under that Schedule may be calculated, in respect of charging years to which Part 3 of Schedule (Video-sharing platform services: transitional provision etc) relates.’—(Paul Scully.)
This new clause introduces a new Schedule containing transitional provisions (see NS3), and provides a power for the Secretary of State to make regulations containing further transitional provisions etc.
Brought up, read the First and Second time, and added to the Bill.
New Clause 51
Publication by providers of details of enforcement action
‘(1) This section applies where—
(a) OFCOM have given a person (and not withdrawn) any of the following—
(i) a confirmation decision;
(ii) a penalty notice under section 119;
(iii) a penalty notice under section 120(5);
(iv) a penalty notice under section 121(6), and
(b) the appeal period in relation to the decision or notice has ended.
(2) OFCOM may give to the person a notice (a “publication notice”) requiring the person to—
(a) publish details describing—
(i) the failure (or failures) to which the decision or notice mentioned in subsection (1)(a) relates, and
(ii) OFCOM’s response, or
(b) otherwise notify users of the service to which the decision or notice mentioned in subsection (1)(a) relates of those details.
(3) A publication notice may require a person to publish details under subsection (2)(a) or give notification of details under subsection (2)(b) or both.
(4) A publication notice must—
(a) specify the decision or notice mentioned in subsection (1)(a) to which it relates,
(b) specify or describe the details that must be published or notified,
(c) specify the form and manner in which the details must be published or notified,
(d) specify a date by which the details must be published or notified, and
(e) contain information about the consequences of not complying with the notice.
(5) Where a publication notice requires a person to publish details under subsection (2)(a) the notice may also specify a period during which publication in the specified form and manner must continue.
(6) Where a publication notice requires a person to give notification of details under subsection (2)(b) the notice may only require that notification to be given to United Kingdom users of the service (see section 184).
(7) A publication notice may not require a person to publish or give notification of anything that, in OFCOM’s opinion—
(a) is confidential in accordance with subsections (8) and (9), or
(b) is otherwise not appropriate for publication or notification.
(8) A matter is confidential under this subsection if—
(a) it relates specifically to the affairs of a particular body, and
(b) publication or notification of that matter would or might, in OFCOM’s opinion, seriously and prejudicially affect the interests of that body.
(9) A matter is confidential under this subsection if—
(a) it relates to the private affairs of an individual, and
(b) publication or notification of that matter would or might, in OFCOM’s opinion, seriously and prejudicially affect the interests of that individual.
(10) A person to whom a publication notice is given has a duty to comply with it.
(11) The duty under subsection (10) is enforceable in civil proceedings by OFCOM—
(a) for an injunction,
(b) for specific performance of a statutory duty under section 45 of the Court of Session Act 1988, or
(c) for any other appropriate remedy or relief.
(12) For the purposes of subsection (1)(b) “the appeal period”, in relation to a decision or notice mentioned in subsection (1)(a), means—
(a) the period during which any appeal relating to the decision or notice may be made, or
(b) where such an appeal has been made, the period ending with the determination or withdrawal of that appeal.’—(Paul Scully.)
This new clause, which is intended to be inserted after clause 129, gives OFCOM the power to require a person to whom a confirmation decision or penalty notice has been given to publish details relating to the decision or notice or to otherwise notify service users of those details.
Brought up, read the First and Second time, and added to the Bill.
New Clause 52
Exemptions from offence under section 152
‘(1) A recognised news publisher cannot commit an offence under section 152.
(2) An offence under section 152 cannot be committed by the holder of a licence under the Broadcasting Act 1990 or 1996 in connection with anything done under the authority of the licence.
(3) An offence under section 152 cannot be committed by the holder of a multiplex licence in connection with anything done under the authority of the licence.
(4) An offence under section 152 cannot be committed by the provider of an on-demand programme service in connection with anything done in the course of providing such a service.
(5) An offence under section 152 cannot be committed in connection with the showing of a film made for cinema to members of the public.’—(Paul Scully.)
This new clause contains exemptions from the offence in clause 152 (false communications). The clause ensures that holders of certain licences are only exempt if they are acting as authorised by the licence and, in the case of Wireless Telegraphy Act licences, if they are providing a multiplex service.
Brought up, read the First and Second time, and added to the Bill.
New Clause 53
Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland (No.2)
‘(1) A person (A) commits an offence if—
(a) A sends a communication by electronic means which consists of or includes flashing images (see subsection (13)),
(b) either condition 1 or condition 2 is met, and
(c) A has no reasonable excuse for sending the communication.
(2) Condition 1 is that—
(a) at the time the communication is sent, it is reasonably foreseeable that an individual with epilepsy would be among the individuals who would view it, and
(b) A sends the communication with the intention that such an individual will suffer harm as a result of viewing the flashing images.
(3) Condition 2 is that, when sending the communication—
(a) A believes that an individual (B)—
(i) whom A knows to be an individual with epilepsy, or
(ii) whom A suspects to be an individual with epilepsy,
will, or might, view it, and
(b) A intends that B will suffer harm as a result of viewing the flashing images.
(4) In subsections (2)(a) and (3)(a), references to viewing the communication are to be read as including references to viewing a subsequent communication forwarding or sharing the content of the communication.
(5) The exemptions contained in section (Exemptions from offence under section 152) apply to an offence under subsection (1) as they apply to an offence under section 152.
(6) For the purposes of subsection (1), a provider of an internet service by means of which a communication is sent is not to be regarded as a person who sends a communication.
(7) In the application of subsection (1) to a communication consisting of or including a hyperlink to other content, references to the communication are to be read as including references to content accessed directly via the hyperlink.
(8) A person (A) commits an offence if—
(a) A shows an individual (B) flashing images by means of an electronic communications device,
(b) when showing the images—
(i) A knows that B is an individual with epilepsy, or
(ii) A suspects that B is an individual with epilepsy,
(c) when showing the images, A intends that B will suffer harm as a result of viewing them, and
(d) A has no reasonable excuse for showing the images.
(9) An offence under subsection (1) or (8) cannot be committed by a healthcare professional acting in that capacity.
(10) A person who commits an offence under subsection (1) or (8) is liable—
(a) on summary conviction in England and Wales, to imprisonment for a term not exceeding the general limit in a magistrates’ court or a fine (or both);
(b) on summary conviction in Northern Ireland, to imprisonment for a term not exceeding six months or a fine not exceeding the statutory maximum (or both);
(c) on conviction on indictment, to imprisonment for a term not exceeding five years or a fine (or both).
(11) It does not matter for the purposes of this section whether flashing images may be viewed at once (for example, a GIF that plays automatically) or only after some action is performed (for example, pressing play).
(12) In this section—
(a) references to sending a communication include references to causing a communication to be sent;
(b) references to showing flashing images include references to causing flashing images to be shown.
(13) In this section—
“electronic communications device” means equipment or a device that is capable of transmitting images by electronic means;
“flashing images” means images which carry a risk that an individual with photosensitive epilepsy who viewed them would suffer a seizure as a result;
“harm” means—
(a) a seizure, or
(b) alarm or distress;
“individual with epilepsy” includes, but is not limited to, an individual with photosensitive epilepsy;
“send” includes transmit and publish (and related expressions are to be read accordingly).
(14) This section extends to England and Wales and Northern Ireland.’—(Paul Scully.)
This new clause creates (for England and Wales and Northern Ireland) a new offence of what is sometimes known as “epilepsy trolling” - sending or showing flashing images electronically to people with epilepsy intending to cause them harm.
Brought up, read the First and Second time, and added to the Bill.
New Clause 16
Communication offence for encouraging or assisting self-harm
‘(1) In the Suicide Act 1961, after section 3 insert—
“3A Communication offence for encouraging or assisting self-harm
(1) A person (“D”) commits an offence if—
(a) D sends a message,
(b) the message encourages or could be used to assist another person (“P”) to inflict serious physical harm upon themselves, and
(c) D’s act was intended to encourage or assist the infliction of serious physical harm.
(2) The person referred to in subsection (1)(b) need not be a specific person (or class of persons) known to, or identified by, D.
(3) D may commit an offence under this section whether or not any person causes serious physical harm to themselves, or attempts to do so.
(4) A person guilty of an offence under this section is liable—
(a) on summary conviction, to imprisonment for a term not exceeding 12 months, or a fine, or both;
(b) on indictment, to imprisonment for a term not exceeding 5 years, or a fine, or both.
(5) “Serious physical harm” means serious injury amounting to grievous bodily harm within the meaning of the Offences Against the Person Act 1861.
(6) No proceedings shall be instituted for an offence under this section except by or with the consent of the Director of Public Prosecutions.
(7) If D arranges for a person (“D2”) to do an Act and D2 does that Act, D is also to be treated as having done that Act for the purposes of subsection (1).
(8) In proceedings for an offence to which this section applies, it shall be a defence for D to prove that—
(a) P had expressed intention to inflict serious physical harm upon themselves prior to them receiving the message from D; and
(b) P’s intention to inflict serious physical harm upon themselves was not initiated by D; and
(c) the message was wholly motivated by compassion towards D or to promote the interests of P’s health or wellbeing.”’—(Mr Davis.)
This new clause would create a new communication offence for sending a message encouraging or assisting another person to self-harm.
Brought up, and read the First time.
Question put, That the clause be read a Second time.
(1 year, 12 months ago)
Commons Chamber
I beg to move,
That the following provisions shall apply to the Online Safety Bill for the purpose of varying and supplementing the Order of 19 April 2022 in the last session of Parliament (Online Safety Bill: Programme) as varied by the Orders of 12 July 2022 (Online Safety Bill: Programme (No.2)) and today (Online Safety Bill: Programme (No.3)).
Re-committal
(1) The Bill shall be re-committed to a Public Bill Committee in respect of the following Clauses and Schedules—
(a) in Part 3, Clauses 11 to 14, 17 to 20, 29, 45, 54 and 55 of the Bill as amended in Public Bill Committee;
(b) in Part 4, Clause 64 of, and Schedule 8 to, the Bill as amended in Public Bill Committee;
(c) in Part 7, Clauses 78, 81, 86, 89 and 112 of, and Schedule 11 to, the Bill as amended in Public Bill Committee;
(d) in Part 9, Clause 150 of the Bill as amended in Public Bill Committee;
(e) in Part 11, Clause 161 of the Bill as amended in Public Bill Committee;
(f) in Part 12, Clauses 192, 195 and 196 of the Bill as amended in Public Bill Committee;
(g) New Clause [Repeal of Part 4B of the Communications Act: transitional provision etc], if it has been added to the Bill, and New Schedule [Video-sharing platform services: transitional provision etc], if it has been added to the Bill.
Proceedings in Public Bill Committee on re-committal
(2) Proceedings in the Public Bill Committee on re-committal shall (so far as not previously concluded) be brought to a conclusion on Thursday 15 December 2022.
(3) The Public Bill Committee shall have leave to sit twice on the first day on which it meets.
Consideration following re-committal and Third Reading
(4) Proceedings on Consideration following re-committal shall (so far as not previously concluded) be brought to a conclusion one hour before the moment of interruption on the day on which those proceedings are commenced.
(5) Proceedings on Third Reading shall (so far as not previously concluded) be brought to a conclusion at the moment of interruption on that day.
(6) Standing Order No. 83B (Programming committees) shall not apply to proceedings on Consideration following re-committal.
I know that colleagues across the House have dedicated a huge amount of time to getting the Bill to this point, especially my predecessor, my right hon. Friend the Member for Mid Bedfordshire (Ms Dorries), who unfortunately could not be with us today. I thank everybody for their contributions through the pre-legislative scrutiny and passage and for their engagement with me since I took office. Since then, the Bill has been my No. 1 priority.
Does the right hon. Member not agree that it is regrettable that her junior Minister—the Under-Secretary of State for Digital, Culture, Media and Sport, the hon. Member for Sutton and Cheam (Paul Scully)—failed to acknowledge in his winding-up speech that there had been any contributions to the debate on Report from Labour Members?
As the right hon. Member will note, the Minister had to stop at a certain point and he had spoken for 45 minutes in his opening remarks. I think that he gave a true reflection of many of the comments that were made tonight. The right hon. Member will also know that all the comments from Opposition Members are on the parliamentary record and were televised.
The sooner that we pass the Bill, the sooner we can start protecting children online. This is a groundbreaking piece of legislation that, as hon. Members have said, will need to evolve as technology changes.
Will my right hon. Friend confirm that the Department will consider amendments, in relation to new clause 55, to stop the people smugglers who trade their wares on TikTok?
I commit to my hon. Friend that we will consider those amendments and work very closely with her and other hon. Members.
We have to get this right, which is why we are adding a very short Committee stage to the Bill. We propose that there will be four sittings over two days. That is the right thing to do to allow scrutiny. It will not delay or derail the Bill, but Members deserve to discuss the changes.
With that in mind, I will briefly discuss the new changes that make recommittal necessary. Children are at the very heart of this piece of legislation. Parents, teachers, siblings and carers will look carefully at today’s proceedings, so for all those who are watching, let me be clear: not only have we kept every single protection for children intact, but we have worked with children’s organisations and parents to create new measures to protect children. Platforms will still have to shield children and young people from both illegal content and a whole range of other harmful content, including pornography, violent content and so on. However, they will also face new duties on age limits. No longer will social media companies be able to claim to ban users under 13 while quietly turning a blind eye to the estimated 1.6 million children who use their sites under age. They will also need to publish summaries of their risk assessments relating to illegal content and child safety, so that there is greater transparency for parents. To ensure that the voice of children is injected directly into the Bill, Ofcom will consult the Children’s Commissioner in the development of codes of practice.
These changes, which come on top of all the original child protection measures in the Bill, are completely separate from the changes that we have made in respect of adults. For many people, myself included, the so-called “legal but harmful” provisions in the Bill prompted concerns. They would have meant that the Government were creating a quasi-legal category—a grey area—and would have raised the very real risk that to avoid sanctions, platforms would carry out sweeping take-downs of content, including legitimate posts, eroding free speech in the process.
Will the Secretary of State join me in congratulating the work of the all-party parliamentary group against antisemitism? Does she agree with the group, and with us, that by removing parts of the Bill we are allowing the kind of holocaust denial that we all abhor to continue online?
I have worked very closely with a range of groups backing the causes that the hon. Lady mentions in relation to cracking down on antisemitism, including the Board of Deputies, the Antisemitism Policy Trust and members of the APPG. [Hon. Members: “They don’t back it.”] They do indeed back the Bill. They have said that it is vital that we progress this further. We have adopted their clause in relation to breach notifications, to increase transparency, and we have injected a triple shield that will ensure that antisemitism does not remain on these platforms.
I return to the concerns around “legal but harmful”. Worryingly, it meant that users could run out of road. If a platform allowed legal but harmful material, users would therefore face a binary choice between not using the platform at all or facing abuse and harm that they did not want to see. We, however, have added a third shield that transfers power away from silicon valley algorithms to ordinary people. Our new triple shield mechanism puts accountability, transparency and choice at the heart of the way we interact with each other online. If it is illegal, it has to go. If it violates a company’s terms and conditions, it has to go. Under the third and final layer of the triple shield, platforms must offer users tools to allow them to choose what kind of content they want to see and engage with.
These are significant changes that I know are of great interest to hon. Members. As they were not in scope on Report, I propose that we recommit a selection of clauses for debate by a Public Bill Committee in a very short Committee stage, so that this House of Commons can scrutinise them line by line.
I assure hon. Members that the Bill is my absolute top priority. We are working closely with both Houses to ensure that it completes the remainder of its passage and reaches Royal Assent by the end of this parliamentary Session. It is absolutely essential that we get proper scrutiny. I commend the motion to the House.
With the leave of the House, in making my closing remarks, I want to remind all Members and all those watching these proceedings exactly why we are here today. The children and families who have had their lives irreparably damaged by social media giants need to know that we are on their side, and that includes the families who sat in the Gallery here today and who I had the opportunity to talk to. I want to take this opportunity to pay tribute to the work they have done, including Ian Russell. They have shone a spotlight and campaigned on this issue. As many Members will know, in 2017, Ian’s 14-year-old daughter Molly took her own life after being bombarded by self-harm content on Instagram and Pinterest. She was a young and innocent girl.
To prevent other families from going through this horrendous ordeal, we must all move the Bill forward together. And we must work together to get the Bill on the statute book as soon as possible by making sure this historic legislation gets the proper scrutiny it deserves, so that we can start protecting children and young people online while also empowering adults.
For too long, the fierce debate surrounding the Bill has been framed by an assumption that protecting children online must come at the expense of free speech for adults. Today we can put an end to this dispute once and for all. Our common-sense amendments to the Bill overcome these barriers by strengthening the protections for children while simultaneously protecting free speech and choice for adults.
However, it is right that the House is allowed to scrutinise these changes in Committee, which is why we need to recommit a selection of clauses for a very short Committee stage. This will not, as the Opposition suggest, put the Bill at risk. I think it is really wrong to make such an assertion. As well as being deeply upsetting to the families who visited us this evening, it is a low blow by the Opposition to play politics with such an important Bill.
We will ensure the Bill completes all stages by the end of this Session, and we need to work together to ensure that children come first. We can then move the Bill forward, so that we can start holding tech companies to account for their actions and finally stop them putting profits before people and before our children.
Question put.
(1 year, 11 months ago)
Public Bill Committees
Of course, Sir Roger. Without addressing the other amendments, I would like us to move away from the overly content-focused approach that the Government seem intent on taking in the Bill more widely. I will leave my comments there on the SNP amendment, but we support our SNP colleagues on it.
It is a pleasure to serve under your chairmanship, Sir Roger.
Being online can be a hugely positive experience for children and young people, but we recognise the challenge of habit-forming behaviour or designed addiction to some digital services. The Bill as drafted, however, would already deliver the intent of the amendment from the hon. Member for Aberdeen North. If service providers identify in their risk assessment that habit-forming or addictive-behaviour risks cause significant harm to an appreciable number of children on a service, the Bill will require them to put in place measures to mitigate and manage that risk under clause 11(2)(a).
To meet the child safety risk assessment duties under clause 10, services must assess the risk of harm to children from the different ways in which the service is used; the impact of such use; the level of risk of harm to children; how the design and operation of the service may increase the risks identified; and the functionalities that facilitate the presence or dissemination of content of harm to children. The definition of “functionality” at clause 200 already includes an expression of a view on content, such as applying a “like” or “dislike” button, as at subsection (2)(f)(i).
I thank the Minister for giving way so early on. He mentioned an “appreciable number”. Will he clarify what that is? Is it one, 10, 100 or 1,000?
I do not think that a single number can be put on that, because it depends on the platform and the type of viewing. An “appreciable number” is essentially as identified by Ofcom, which will be the arbiter of all this. It comes back to the direction that, as the hon. Member for Aberdeen North rightly said, we want to give Ofcom. Ofcom has a range of powers already to help it assess whether companies are fulfilling their duties, including the power to require information about the operation of their algorithms. I would set the direction that the hon. Lady is looking for, to ensure that Ofcom uses those powers to the fullest and can look at the algorithms. We should bear in mind that social media platforms face criminal liability if they do not supply the information required by Ofcom to look under the bonnet.
If platforms do not recognise that they have an issue with habit-forming features, even though we know they have, will Ofcom say to them, “Your risk assessment is insufficient. We know that the habit-forming features are really causing a problem for children”?
We do not want to wait for the Bill’s implementation to start those conversations with the platforms. We expect companies to be transparent about their design practices that encourage extended engagement and to engage with researchers to understand the impact of those practices on their users.
The child safety duties in clause 11 apply across all areas of a service, including the way it is operated and used by children and the content present on the service. Subsection (4)(b) specifically requires services to consider the
“design of functionalities, algorithms and other features”
when complying with the child safety duties. Given the direction I have suggested that Ofcom has, and the range of powers that it will already have under the Bill, I am unable to accept the hon. Member’s amendment, and I hope she will therefore withdraw it.
I would have preferred it had the Minister been slightly more explicit that habit-forming features are harmful. That would have been slightly more helpful.
I thank the Minister. Absolutely—they are not always harmful. With that clarification, I am happy to beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
I beg to move amendment 1, in clause 11, page 10, line 22, leave out
“, or another means of age assurance”.
This amendment omits words which are no longer necessary in subsection (3)(a) of clause 11 because they are dealt with by the new subsection inserted by Amendment 3.
The Bill’s key objective, above everything else, is the safety of young people online. That is why the strongest protections in the Bill are for children. Providers of services that are likely to be accessed by children will need to provide safety measures to protect child users from harmful content, such as pornography, and from behaviour such as bullying. We expect companies to use age verification technologies to prevent children from accessing services that pose the highest risk of harm to them, and age assurance technologies and other measures to provide children with an age-appropriate experience.
The previous version of the Bill already focused on protecting children, but the Government are clear that the Bill must do more to achieve that and to ensure that the requirements on providers are as clear as possible. That is why we are strengthening the Bill and clarifying the responsibilities of providers to provide age-appropriate protections for children online. We are making it explicit that providers may need to use age assurance to identify the age of their users in order to meet the child safety duties for user-to-user services.
The Bill already set out that age assurance may be required to protect children from harmful content and activity, as part of meeting the duty in clause 11(3), but the Bill will now clarify that it may also be needed to meet the wider duty in subsection (2) to
“mitigate and manage the risks of harm to children”
and to manage
“the impact of harm to children”
on such services. That is important so that only children who are old enough are able to use functionalities on a service that poses a risk of harm to younger children. The changes will also ensure that children are signposted to support that is appropriate to their age if they have experienced harm. For those reasons, I recommend that the Committee accepts the amendments.
I have a few questions regarding amendments 1 to 3, which as I mentioned relate to the thorny issue of age verification and age assurance, and I hope the Minister can clarify some of them.
We are unclear about why, in subsection (3)(a), the Government have retained the phrase
“for example, by using age verification, or another means of age assurance”.
Can that difference in wording be taken as confirmation that the Government want harder forms of age verification for primary priority content? The Minister will be aware that many in the sector are unclear about what that harder form of age verification may look like, so some clarity would be useful for all of us in the room and for those watching.
In addition, we would like to clarify the Minister’s understanding of the distinction between age verification and age assurance. They are very different concepts in reality, so we would appreciate it if he could be clear, particularly when we consider the different types of harm that the Bill will address and protect people from, how that will be achieved and what technology will be required for different types of platform and content. I look forward to clarity from the Minister on that point.
That is a good point. In essence, age verification is the hard gate on access to a service; age assurance ensures that the person who uses the service is the same person whose age was verified. Someone could use their parent’s debit card or something like that, so it is not necessarily the same person using the service right the way through. If we are to protect children in particular, we have to ensure that we know there is a child at the other end whom we can protect from the harm that they may see.
On the different technologies, we are clear that our approach to age assurance and age verification is not technology-specific, because otherwise the Bill would be out of date within around six months, and certainly by the time the legislation was fully implemented. That is why it is incumbent on the companies to be clear about the technology and processes they use. That information will be kept up to date, and Ofcom can then look at it.
Thank you, Sir Roger. It was helpful to hear the Minister’s clarification of age assurance and age verification, and it was useful for him to put on the record the difference between the two.
I have a couple of points. In respect of Ofcom keeping up to date with the types of age verification and the processes, new ones will come through and excellent new methods will appear in coming years. I welcome the Minister’s suggestion that Ofcom will keep up to date with that, because it is incredibly important that we do not rely on, say, the one provider that there is currently, when really good methods could come out. We need the legislation to ensure that we get the best possible service and the best possible verification to keep children away from content that is inappropriate for them.
This is one of the most important parts of the Bill for ensuring that we can continue to have adult sections of the internet—places where there is content that would be disturbing for children, as well as for some adults—and that an age-verification system is in place to ensure that that content can continue to be there. Websites that require a subscription, such as OnlyFans, need to continue to have in place the age-verification systems that they currently have. By writing into legislation the requirement for them to continue to have such systems in place, we can ensure that children cannot access such services but adults can continue to do so. This is not about what is banned online or about trying to make sure that this content does not exist anywhere; it is specifically about gatekeeping to ensure that no child, as far as we can possibly manage, can access content that is inappropriate for kids.
There was a briefing recently on children’s access to pornography, and we heard horrendous stories. It is horrendous that a significant number of children have seen inappropriate content online, and the damage that that has caused to so many young people cannot be overstated. Blocking access to adult parts of the internet is so important for the next generation, not just so that children are not disturbed by the content they see, but so that they learn that it is not okay and normal and understand that the depictions of relationships in pornography are not the way reality works, not the way reality should work and not how women should be treated. Having a situation in which Ofcom or anybody else is better able to take action to ensure that adult content is specifically accessed only by adults is really important for the protection of children and for protecting the next generation and their attitudes, particularly towards sex and relationships.
I wish to add some brief words in support of the Government’s proposals and to build on the comments from Members of all parties.
We know that access to extreme and abusive pornography is a direct factor in violence against women and girls, and we see that played out in the court system every day. People on trial claim to have watched and become addicted to this type of pornography and to have sought to play it out in their relationships, which has resulted in the deaths of women. The platforms already have technology that allows them to work out the age of people on their platforms. The Bill seeks to ensure that they use it to a good end, so I thoroughly support it. I thank the Minister.
There are two very important and distinct issues here. One is age verification. The platforms ask adults who have identification to verify their age; if they cannot verify their age, they cannot access the service. Platforms have a choice within that. They can design their service so that it does not have adult content, in which case they may not need to build in verification systems—the platform polices itself. However, a platform such as Twitter, which allows adult content on an app that is open to children, has to build in those systems. As the hon. Member for Aberdeen North mentioned, people will also have to verify their identity to access a service such as OnlyFans, which is an adult-only service.
On these platforms, the age verification requirements are clear: they must age-gate the adult content or get rid of it. They must do one or the other. Rightly, the Bill does not specify technologies. Technologies are available. The point is that a company must demonstrate that it is using an existing and available technology or that it has some other policy in place to remedy the issue. It has a choice, but it cannot do nothing. It cannot say that it does not have a policy on it.
Age assurance is always more difficult for children, because they do not have the same sort of ID that adults have. However, technologies exist: for instance, Yoti uses facial scanning. Companies do not have to do that either; they have to demonstrate that they do something beyond self-certification at the point of signing up. That is right. Companies may also demonstrate what they do to take robust action to close the accounts of children they have identified on their platforms.
If a company’s terms of service state that people must be 13 or over to use the platform, the company is inherently stating that the platform is not safe for someone under 13. What does it do to identify people who sign up? What does it do to identify people once they are on the platform, and what action does it then take? The Bill gives Ofcom the powers to understand those things and to force a change of behaviour and action. That is why—to the point made by the hon. Member for Pontypridd—age assurance is a slightly broader term, but companies can still extract a lot of information to determine the likely age of a child and take the appropriate action.
I think we are all in agreement, and I hope that the Committee will accept the amendments.
Amendment 1 agreed to.
Amendments made: 2, in clause 11, page 10, line 25, leave out
“(for example, by using age assurance)”.
This amendment omits words which are no longer necessary in subsection (3)(b) of clause 11 because they are dealt with by the new subsection inserted by Amendment 3.
Amendment 3, in clause 11, page 10, line 26, at end insert—
“(3A) Age assurance to identify who is a child user or which age group a child user is in is an example of a measure which may be taken or used (among others) for the purpose of compliance with a duty set out in subsection (2) or (3).”—(Paul Scully.)
This amendment makes it clear that age assurance measures may be used to comply with duties in clause 11(2) as well as (3) (safety duties protecting children).
I beg to move amendment 99, in clause 11, page 10, line 34, leave out paragraph (d) and insert—
“(d) policies on user access to the service, parts of the service, or to particular content present on the service, including blocking users from accessing the service, parts of the service, or particular content,”.
This amendment is intended to make clear that if it is proportionate to do so, services should have policies that include blocking access to parts of a service, rather than just the entire service or particular content on the service.
If someone on a PlayStation wants to play online games, they must sign up to PlayStation Plus—that is how the model works. Once they pay that subscription, they can access online games and play Fortnite or Rocket League or whatever they want online. They then also have access to a suite of communication features; they can private message people. It would be disproportionate to ban somebody from playing any PlayStation game online in order to stop them from being able to private message inappropriate things. That would be a disproportionate step. I do not want PlayStation to be unable to act against somebody because it could not ban them, as that would be disproportionate, but was unable to switch off the direct messaging features because the clause does not allow it that flexibility. A person could continue to be in danger on the PlayStation platform as a result of private communications that they could receive. That is one example of how the provision would be key and important.
Again, the Government recognise the intent behind amendment 99, which, as the hon. Member for Aberdeen North said, would require providers to be able to block children’s access to parts of a service, rather than the entire service. I very much get that. We recognise the nature and scale of the harm that can be caused to children through livestreaming and private messaging, as has been outlined, but the Bill already delivers what is intended by these amendments. Clause 11(4) sets out examples of areas in which providers will need to take measures, if proportionate, to meet the child safety duties. It is not an exhaustive list of every measure that a provider might be required to take. It would not be feasible or practical to list every type of measure that a provider could take to protect children from harm, because such a list could become out of date quickly as new technologies emerge, as the hon. Lady outlined with her PlayStation example.
I have a concern. The Minister’s phrasing was “to block children’s access”. Surely some of the issues would be around blocking adults’ access, because they are the ones causing risk to the children. From my reading of the clause, it does not suggest that the action could be taken only against child users; it could be taken against any user in order to protect children.
I will come to that in a second. The hon. Member for Luton North talked about putting the onus on the victim. Any element of choice is there for adults; the children will be protected anyway, as I will outline in a second. We all agree that the primary purpose of the Bill is to be a children’s protection measure.
Ofcom will set out in codes of practice the specific steps that providers can take to protect children who are using their service, and the Government expect those to include steps relating to children’s access to high-risk features, such as livestreaming or private messaging. Clause 11(4)(d) sets out that providers may be required to take measures in the following areas:
“policies on user access to the service or to particular content present on the service, including blocking users from accessing the service or particular content”.
The other areas listed are intentionally broad categories that allow for providers to take specific measures. For example, a measure in the area of blocking user access to particular content could include specific measures that restrict children’s access to parts of a service, if that is a proportionate way to stop users accessing that type of content. It can also apply to any of the features of a service that enable children to access particular content, and could therefore include children’s access to livestreaming and private messaging features. In addition, the child safety duties make it clear that providers need to use proportionate systems and processes that prevent children from encountering primary priority content that is harmful to them, and protect children and age groups at risk of harm from other content that is harmful to them.
While Ofcom will set out in codes of practice the steps that providers can take to meet these duties, we expect those steps, as we have heard, to include the use of age verification to prevent children accessing content that poses the greatest risk of harm to them. To meet that duty, providers may use measures that restrict children from accessing parts of the service. The Bill therefore allows Ofcom to require providers to take that step where it is proportionate. I hope that that satisfies the hon. Member for Aberdeen North, and gives her the direction that she asked for—that is, a direction to be more specific that Ofcom does indeed have the powers that she seeks.
We are told that we can expect little impact on child protection before 2027-28, because of the enforcement road map and the timescale on which Ofcom plans to set it out. Does the Minister not think that in the meantime, that sort of ministerial direction would be helpful? It could make Ofcom’s job easier, and would mean that children could be protected online before 2027-28.
The ministerial direction that the various platforms are receiving from the Dispatch Box, from our conversations with them and from the Bill’s progress as it goes through the House of Lords will be helpful to them. We do not expect providers to wait until the very last minute to implement the measures. They are starting to do so now, but we want them to go further, quicker.
Government amendment 4 will require providers who already have a minimum age requirement for access to their service, or parts of it, to give details of the measures that they use to restrict access in their terms of service and apply them consistently. Providers will also need to provide age-appropriate protections for children using their service. That includes protecting children from harmful content and activity on their service, as well as reviewing children’s use of higher-risk features, as I have said.
To meet the child safety risk assessment duties in clause 10, providers must assess: the risk of harm to children from functionalities that facilitate the presence or dissemination of harmful content; the level of risk from different kinds of harmful content, giving separate consideration to children in different age groups; the different ways in which the service is used, and the impact of such use on the level of risk of harm; and how the design and operation of the service may increase the risks identified.
The child safety duties in clause 11 apply across all areas of the service, including the way it is operated and used by children, as well as the content present on the service. For the reasons I have set out, I am not able to accept the amendments, but I hope that the hon. Member for Aberdeen North will take on board my assurances.
That was quite helpful. I am slightly concerned about the Minister’s focus on reducing children’s access to the service or to parts of it. I appreciate that is part of what the clause is intended to do, but I would also expect platforms to be able to reduce the ability of adults to access parts of the service or content in order to protect children. Rather than just blocking children, blocking adults from accessing some features—whether that is certain adults or adults as a group—would indeed protect children. My reading of clause 11(4) was that users could be prevented from accessing some of this stuff, rather than just child users. Although the Minister has given me more questions, I do not intend to push the amendment to a vote.
May I ask a question of you, Sir Roger? I have not spoken about clause stand part. Are we still planning to have a clause stand part debate?
I beg to move amendment 4, in clause 11, page 11, line 9, at end insert—
“(6A) If a provider takes or uses a measure designed to prevent access to the whole of the service or a part of the service by children under a certain age, a duty to—
(a) include provisions in the terms of service specifying details about the operation of the measure, and
(b) apply those provisions consistently.”
This amendment requires providers to give details in their terms of service about any measures they use which prevent access to a service (or part of it) by children under a certain age, and to apply those terms consistently.
With this it will be convenient to discuss the following:
Government amendment 5.
Amendment 100, in clause 11, page 11, line 15, after “accessible” insert “for child users.”
This amendment makes clear that the provisions of the terms of service have to be clear and accessible for child users.
Although the previous version of the Bill already focused on protecting children, as I have said, the Government are clear that it must do more to achieve that and to ensure that requirements for providers are as clear as possible. That is why we are making changes to strengthen the Bill. Amendments 4 and 5 will require providers who already have a minimum age requirement for access to their service, or parts of it, to give details in their terms of service of the measures that they use to ensure that children below the minimum age are prevented access. Those terms must be applied consistently and be clear and accessible to users. The change will mean that providers can be held to account for what they say in their terms of service, and will no longer be able to do nothing to prevent underage access.
The Government recognise the intent behind amendment 100, which is to ensure that terms of service are clear and accessible for child users, but the Bill as drafted sets an appropriate standard for terms of service. The duty in clause 11(8) sets an objective standard for terms of service to be clear and accessible, rather than requiring them to be clear for particular users. Ofcom will produce codes of practice setting out how providers can meet that duty, which may include provisions about how to tailor the terms of service to the user base where appropriate.
The amendment would have the unintended consequence of limiting to children the current accessibility requirement for terms of service. As a result, any complicated and detailed information that would not be accessible for children—for example, how the provider uses proactive technology—would probably need to be left out of the terms of service, which would clearly conflict with the duty in clause 11(7) and other duties relating to the terms of service. It is more appropriate to have an objective standard of “clear and accessible” so that the terms of service can be tailored to provide the necessary level of information and be useful to other users such as parents and guardians, who are most likely to be able to engage with the more detailed information included in the terms of service and are involved in monitoring children’s online activities.
Ofcom will set out steps that providers can take to meet the duty and will have a tough suite of enforcement powers to take action against companies that do not meet their child safety duties, including if their terms of service are not clear and accessible. For the reasons I have set out, I am not able to accept the amendment tabled by the hon. Member for Aberdeen North and I hope she will withdraw it.
As I said, I will also talk about clause 11. I can understand why the Government are moving their amendments. It makes sense, particularly with things like complying with the provisions. I have had concerns all the way along—concerns that are particularly acute now that we are back in Committee with a slightly different Bill from the one that we were first presented with—about the reliance on terms of service. There is a major issue with choosing to go down that route, given that providers of services can choose what to put in their terms of service. They can choose to have very good terms of service that mean that they will take action on anything that is potentially an issue and that will be strong enough to allow them to take the actions they need to take to apply proportionate measures to ban users that are breaking the terms of service. Providers will have the ability to write terms of service like that, but not all providers will choose to do that. Not all providers will choose to write the gold standard terms of service that the Minister expects everybody will write.
We have to remember that these companies’ and organisations’ No. 1 aim is not to protect children. If their No. 1 aim was to protect children, we would not be here. We would not need an Online Safety Bill because they would be putting protection front and centre of every decision they make. Their No. 1 aim is to increase the number of users so that they can get more money. That is the aim. They are companies that have a duty to their shareholders. They are trying to make money. That is the intention. They will not therefore necessarily draw up the best possible terms of service.
I heard an argument on Report that market forces will mean that companies that do not have strong enough terms of service, companies that have inherent risks in their platforms, will just not be used by people. If that were true, we would not be in the current situation. Instead, the platforms that are damaging people and causing harm—4chan, KiwiFarms or any of those places that cause horrendous difficulties—would not be used by people because market forces would have intervened. That approach does not work; it does not happen that the market will regulate itself and people will stay away from places that cause them or others harm. That is not how it works. I am concerned about the reliance on terms of service and requiring companies to stick to their own terms of service. They might stick to their own terms of service, but those terms of service might be utterly rubbish and might not protect people. Companies might not have in place what we need to ensure that children and adults are protected online.
It is absolutely the case that those companies still have to do a risk assessment, and a child risk assessment if they meet the relevant criteria. The largest platforms, for example, will still have to do a significant amount of work on risk assessments. However, every time a Minister stands up and talks about what they are requiring platforms and companies to do, they say, “Companies must stick to their terms of service. They must ensure that they enforce things in line with their terms of service.” If a company is finding it too difficult, it will just take the tough things out of its terms of service. It will take out transphobia; it will take out abuse. Twitter does not ban anyone for abuse anyway, it seems, but it will be easier for Twitter to say, “Ofcom is going to try to hold us to account for the fact that we are not getting rid of people for abusive but not illegal messages, even though we say in our terms of service, ‘You must act with respect’, or ‘You must not abuse other users’. We will just take that out of our terms of service so that we are not held to account for the fact that we are not following our terms of service.” Then, because the abuse is not illegal—because it does not meet that bar—those places will end up being even less safe than they are right now.
For example, occasionally Twitter does act in line with its terms of service, which is quite nice: it does ban people who are behaving inappropriately, but not necessarily illegally, on its platform. However, if it is required to implement that across the board for everybody, it will be far easier for Twitter to say, “We’ve sacked all our moderators—we do not have enough people to be able to do this job—so we will just take it all out of the terms of service. The terms of service will say, ‘We will ban people for sharing illegal content, full stop.’” We will end up in a worse situation than we are currently in, so the reliance on terms of service causes me a big, big problem.
Turning to amendment 100, dealing specifically with the accessibility of this feature for child users, I appreciate the ministerial clarification, and agree that my amendment could have been better worded and potentially causes some problems. However, can the Minister talk more about the level of accessibility? I would like children to be able to see a version of the terms of service that is age-appropriate, so that they understand what is expected of them and others on the platform, and understand when and how they can make a report and how that report will be acted on. The kids who are using Discord, TikTok or YouTube are over 13—well, some of them are—so they are able to read and understand, and they want to know how to make reports and for the reporting functions to be there. One of the biggest complaints we hear from kids is that they do not know how to report things they see that are disturbing.
A requirement for children to have an understanding of how reporting functions work, particularly on social media platforms where people are interacting with each other, and of the behaviour that is expected of them, does not mean that there cannot be a more in-depth and detailed version of the terms of service, laying out potential punishments using language that children may not be able to understand. The amendment would specifically ensure that children have an understanding of that.
We want children to have a great time on the internet. There are so many ace things out there and wonderful places they can access. Lego has been in touch, for example; its website is really pretty cool. We want kids to be able to access that stuff and communicate with their friends, but we also want them to have access to features that allow them to make reports that will keep them safe. If children are making reports, then platforms will say, “Actually, there is real problem with this because we are getting loads of reports about it.” They will then be able to take action. They will be able to have proper risk assessments in place because they will be able to understand what is disturbing people and what is causing the problems.
I am glad to hear the Minister’s words. If he were even more clear about the fact that he would expect children to be able to understand and access information about keeping themselves safe on the platforms, then that would be even more helpful.
On terms and conditions, it is clearly best practice to have a different level of explanation that ensures children can fully understand what they are getting into. The hon. Lady talked about the fact that children do not know how to report harm. Frankly, judging by a lot of conversations we have had in our debates, we do not know how to report harm because it is not transparent. On a number of platforms, how to do that is very opaque.
A wider aim of the Bill is to make sure that platforms have better reporting mechanisms. I encourage platforms to do exactly what the hon. Member for Aberdeen North says: to engage children, and to engage parents. Parents are well placed to engage with reporting, and it is important that we do not forget the role of parents in how Government and platforms act. I hope that is clear to the hon. Lady. We are mainly relying on terms and conditions for adults, but the Bill imposes a wider set of protections for children on the platforms.
Amendment 4 agreed to.
Amendment made: 5, in clause 11, page 11, line 15, after “(5)” insert “, (6A)”.—(Paul Scully.)
This amendment ensures that the duty in clause 11(8) to have clear and accessible terms of service applies to the terms of service mentioned in the new subsection inserted by Amendment 4.
Clause 11, as amended, ordered to stand part of the Bill.
Clause 12
Adults’ risk assessment duties
Question proposed, That the clause stand part of the Bill.
With this it will be convenient to discuss the following:
Clause 13 stand part.
Government amendments 18, 23 to 25, 32, 33 and 39.
Clause 55 stand part.
Government amendments 42 to 45, 61 to 66, 68 to 70, 74, 80, 85, 92, 51 and 52, 54, 94 and 60.
To protect free speech and remove any possibility that the Bill could cause tech companies to censor legal content, I seek to remove the so-called “legal but harmful” duties from the Bill. These duties are currently set out in clauses 12 and 13 and apply to the largest in-scope services. They require services to undertake risk assessments for defined categories of harmful but legal content, before setting and enforcing clear terms of service for each category of content.
I share the concerns raised by Members of this House and more broadly that these provisions could have a detrimental effect on freedom of expression. It is not right that the Government define what legal content they consider harmful to adults and then require platforms to risk assess for that content. Doing so may encourage companies to remove legal speech, undermining this Government’s commitment to freedom of expression. That is why these provisions must be removed.
At the same time, I recognise the undue influence that the largest platforms have over our public discourse. These companies get to decide what we do and do not see online. They can arbitrarily remove a user’s content or ban them altogether without offering any real avenues of redress to users. On the flip side, even when companies have terms of service, these are often not enforced, as we have discussed. That was the case after the Euro 2020 final where footballers were subject to the most appalling abuse, despite most platforms clearly prohibiting that. That is why I am introducing duties to improve the transparency and accountability of platforms and to protect free speech through new clauses 3 and 4. Under these duties, category 1 platforms will only be allowed to remove or restrict access to content or ban or suspend users when this is in accordance with their terms of service or where they face another legal obligation. That protects against the arbitrary removal of content.
Companies must ensure that their terms of service are consistently enforced. If companies’ terms of service say that they will remove or restrict access to content, or will ban or suspend users in certain circumstances, they must put in place proper systems and processes to apply those terms. That will close the gap between what companies say they will do and what they do in practice. Services must ensure that their terms of service are easily understandable to users and that they operate effective reporting and redress mechanisms, enabling users to raise concerns about a company’s application of the terms of service. We will debate the substance of these changes later alongside clause 18.
Clause 55 currently defines
“content that is harmful to adults”,
including
“priority content that is harmful to adults”
for the purposes of this legislation. As this concept would be removed with the removal of the adult safety duties, this clause will also need to be removed.
My hon. Friend mentioned earlier that companies will not be able to remove content if it is not part of their safety duties or if it is not a breach of their terms of service. I want to be sure that I heard that correctly and to ask whether Ofcom will be able to risk assess that process to ensure that companies are not over-removing content.
Absolutely. I will come on to Ofcom in a second and respond directly to his question.
The removal of clauses 12, 13 and 55 from the Bill, if agreed by the Committee, will require a series of further amendments to remove references to the adult safety duties elsewhere in the Bill. These amendments are required to ensure that the legislation is consistent and, importantly, that platforms, Ofcom and the Secretary of State are not held to requirements relating to the adult safety duties that we intend to remove from the Bill. The amendments remove requirements on platforms and Ofcom relating to the adult safety duties. That includes references to the adult safety duties in the duties to provide content reporting and redress mechanisms and to keep records. They also remove references to content that is harmful to adults from the process for designating category 1, 2A and 2B companies. The amendments in this group relate mainly to the process for the category 2B companies.
I also seek to amend the process for designating category 1 services to ensure that they are identified based on their influence over public discourse, rather than with regard to the risk of harm posed by content that is harmful to adults. These changes will be discussed when we debate the relevant amendments alongside clause 82 and schedule 11. The amendments will remove powers that will no longer be required, such as the Secretary of State’s ability to designate priority content that is harmful to adults. As I have already indicated, we intend to remove the adult safety duties and introduce new duties on category 1 services relating to transparency, accountability and freedom of expression. While they will mostly be discussed alongside clause 18, amendments 61 to 66, 68 to 70 and 74 will add references to the transparency, accountability and freedom of expression duties to schedule 8. That will ensure that Ofcom can require providers of category 1 services to give details in their annual transparency reports about how they comply with the new duties. Those amendments define relevant content and consumer content for the purposes of the schedule.
We will discuss the proposed transparency and accountability duties that will replace the adult safety duties in more detail later in the Committee’s deliberations. For the reasons I have set out, I do not believe that the current adult safety duties, with their risks to freedom of expression, should be retained. I therefore urge the Committee to agree that clauses 12, 13 and 55 should not stand part of the Bill, and recommend that the Government amendments in this group be accepted.
Before we proceed, I emphasise that we are debating clause 13 stand part as well as the litany of Government amendments that I read out.
(1 year, 11 months ago)
Public Bill Committees
I thank the shadow Minister for that intervention. She is absolutely right. We have had a discussion about terms of reference and terms of service. Not only do most people not actually fully read them or understand them, but they are subject to change. The moment Elon Musk took over Twitter, everything changed. Not only have we got Donald Trump back, but Elon Musk also gave the keys to a mainstream social media platform to Kanye West. We have seen what happened then.
That is the situation the Government will now not shut the door on. That is regrettable. For all the reasons we have heard today, it is really damaging. It is really disappointing that we are not taking the opportunity to lead in this area.
It is a pleasure to serve under your chairmanship, Dame Angela.
A lot of the discussion has replayed the debate from day two on Report about the removal of “legal but harmful” measures. Some of the discussion this morning and this afternoon has covered really important issues, such as self-harm, on which, as we said on the Floor of the House, we will introduce measures at a later stage. I will not talk about those measures now, but I would just say that we have already said that if we agree that the promotion of things such as self-harm should be illegal, then it should be made illegal. Let us be very straight about how we deal with the promotion of self-harm.
The Bill will bring huge improvements for adult safety online. In addition to their duty to tackle illegal content, companies will have to provide adult users with tools to keep themselves safer. On some of the other clauses, we will talk about the triple shield that was mentioned earlier. If the content is illegal, it will still be illegal. If content does not adhere to the companies’ terms of service—that includes many of the issues that we have been debating for the last hour—it will have to be removed. We will come to user empowerment issues in further clauses.
The Minister mentions tools for adults to keep themselves safe. Does he not think that that puts the onus on the users—the victims—to keep themselves safe? The measures as they stand in the Bill put the onus on the companies to be more proactive about how they keep people safe.
The onus on adults is very much a safety net—very much a catch-all, after we have put the onus on the social media companies and the platforms to adhere to their own terms and conditions.
We have heard a lot about Twitter and the changes to Twitter. We can see the commercial imperative for mainstream platforms, certainly the category 1 platforms, to have a wide enough catch-all in their terms of service—anything that an advertiser, for example, would see as reasonably sensible—to be able to remain a viable platform in the first place. When Elon Musk first started making changes at Twitter, a comment did the rounds: “How do you build a multimillion-dollar company? Start with a multibillion-dollar one and sell it to Elon Musk for $44 billion.” He made that change. He has seen the bottom falling out of his market and has lost a lot of the cash he put into Twitter. That is the commercial impetus that underpins a lot of the changes we are making.
Is the Minister really suggesting that it is reasonable for people to say, “Right, I am going to have to walk away from Facebook because I don’t agree with their terms of service,” to hold the platform to account? How does he expect people to keep in touch with each other if they have to walk away from social media platforms in order to try to hold them to account?
I do not think the hon. Lady is seriously suggesting that people can communicate only via Facebook—via one platform. The point is that there are a variety of methods of communication, of which Facebook has been a major one, although it is not one of the biggest now, with its share value having dropped 71% in the last year. That is, again, another commercial impetus in terms of changing its platform in other, usability-related ways.
One of the examples I alluded to, which is particularly offensive for Jewish people, LGBT people and other people who were persecuted in the Nazi holocaust, is holocaust denial. Does the Minister seriously think that it is only Jewish people, LGBT people and other people who were persecuted in the holocaust who find holocaust denial offensive and objectionable and who do not want to see it as part of their online experience? Surely having these sorts of safety nets in place and saying that we do not think that certain kinds of content—although they may not be against the law—have a place online protects everyone’s experience, whether they are Jewish or not. Surely, no one wants to see holocaust denial online.
No, but there is freedom of expression to a point—when it starts to reach into illegality. We have to have the balance right: someone can say something in public—in any setting offline—but what the hon. Lady is suggesting is that, as soon as they hit a keyboard or a smartphone, there are two totally different regimes. That is not getting the balance right.
The Minister says that we should have freedom of speech up to a point. Does that point include holocaust denial? He has just suggested that if something is acceptable to say in person, which I do not think holocaust denial should be, it should be acceptable online. Surely holocaust denial is objectionable whenever it happens, in whatever context—online or offline.
I have been clear about where I set the line. [Interruption.] I have said that if something is illegal, it is illegal. The terms of service of the platforms largely cover the list that we are talking about. As my hon. Friend the Member for Folkestone and Hythe and I have both said, the terms of service of the vast majority of platforms—the big category 1 platforms—set a higher bar than was in our original Bill. The hon. Member for Luton North talked about whether we should have more evidence. I understand that the pre-legislative scrutiny committee heard evidence and came to a unanimous conclusion that the “legal but harmful” conditions should not be in the Bill.
A few moments ago, the Minister compared the online world to the real world. Does he agree that they are not the same? Sadly, the sort of thing that someone says in the pub on a Friday night to two or three of their friends is very different from someone saying something dangerously harmful online that can reach millions and billions of people in a very short space of time. The person who spoke in the pub might get up the following morning and regret what they said, but no harm was done. Once something is out there in the online world, very serious damage can be done very quickly.
The hon. Lady makes a good point. I talked about the offline world rather than the real world, but clearly that can happen. That is where the balance has to be struck, as we heard from my hon. Friend the Member for Don Valley. It is not black and white; it is a spectrum of greys. Any sensible person can soon see when they stray into areas that we have talked about such as holocaust denial and extremism, but we do not want to penalise people who invariably are testing their freedom of expression.
It is a fine balance, but I think that we have reached the right balance between protecting freedom of expression and protecting vulnerable adults by having three layers of checks. The first is illegality. The second is enforcing the terms of service, which provide a higher bar than we had in the original Bill for the vast majority of platforms, so that we can see right at the beginning how they will be enforced by the platforms. If they change them and do not adhere to them, Ofcom can step in. Ofcom can step in at any point to ensure that they are being enforced. The third is a safety net.
On illegal content, is the Minister proposing that the Government will introduce new legislation to make, for example, holocaust denial and eating disorder content illegal, whether it is online or offline? If he is saying that the bar in the online and offline worlds should be the same, will the Government introduce more hate crime legislation?
Hate crime legislation will always be considered by the Ministry of Justice, but I am not committing to any changes. That is beyond my reach, but the two shields that we talked about are underpinned by a safety net.
Does my hon. Friend agree that the risk assessments that will be done on the priority illegal offences are very wide ranging, in addition to the risk assessments that will be done on meeting the terms of service? They will include racially and religiously motivated harassment, and putting people in fear of violence. A lot of the offences that have been discussed in the debate would already be covered by the adult safety risk assessments in the Bill.
I absolutely agree. As I said in my opening remarks about the racial abuse picked up in relation to the Euro 2020 football championship, that would have been against the terms and conditions of all those platforms, but it still happened because the platforms were not enforcing those terms and conditions. Whether we put them on a list in the Bill or talk about them in the terms of service, they need to be enforced, but the terms of service are there.
On that point, does my hon. Friend also agree that the priority legal offences are important too? People were prosecuted for what they posted on Twitter and Instagram about the England footballers, so that shows that we understand what racially motivated offences are and that people are prosecuted for them. The Bill will require a minimum regulatory standard that meets that threshold and requires companies to act in cases such as that one, where we know what this content is, what people are posting and what is required. Not only will the companies have to act, but they will have to complete risk assessments to demonstrate how they will do that.
Indeed. I absolutely agree with my hon. Friend, and that is a good example of enforcement being used. People can be prosecuted if such abuse appears on social media, but a black footballer, who would otherwise have seen that racial abuse, can choose, through the user empowerment tools, to turn that off so that he does not see it. That does not mean that we cannot pursue a prosecution for racial abuse via a third-party complaint or via the platform.
Order. Could the Minister address his remarks through the Chair so that I do not have to look at his back?
I apologise, Dame Angela. I will bring my remarks to a close by saying that with those triple shields, we have the protections and the fine balance that we need.
Question put, That the clause, as amended, stand part of the Bill.
I beg to move amendment 8, in clause 14, page 14, line 3, leave out “harmful content” and insert—
“content to which this subsection applies”.
This amendment, and Amendments 9 to 17, amend clause 14 (user empowerment) as the adult safety duties are removed (see Amendments 6, 7 and 41). New subsections (8B) to (8D) describe the kinds of content which are now relevant to the duty in clause 14(2) - see Amendment 15.
With this it will be convenient to discuss the following:
Government amendments 9 to 14.
Government amendment 15, in clause 14, page 14, line 29, at end insert—
“(8A) Subsection (2) applies to content that—
(a) is regulated user-generated content in relation to the service in question, and
(b) is within subsection (8B), (8C) or (8D).
(8B) Content is within this subsection if it encourages, promotes or provides instructions for—
(a) suicide or an act of deliberate self-injury, or
(b) an eating disorder or behaviours associated with an eating disorder.
(8C) Content is within this subsection if it is abusive and the abuse targets any of the following characteristics—
(a) race,
(b) religion,
(c) sex,
(d) sexual orientation,
(e) disability, or
(f) gender reassignment.
(8D) Content is within this subsection if it incites hatred against people—
(a) of a particular race, religion, sex or sexual orientation,
(b) who have a disability, or
(c) who have the characteristic of gender reassignment.”
This amendment describes the content relevant to the duty in subsection (2) of clause 14. The effect is (broadly) that providers must offer users tools to reduce their exposure to these kinds of content.
Amendment (a), to Government amendment 15, at end insert—
“(8E) Content is within this subsection if it—
(a) incites hateful extremism,
(b) provides false information about climate change, or
(c) is harmful to health.”
Government amendment 16, in clause 14, page 14, line 30, leave out subsection (9) and insert—
“(9) In this section—
‘disability’ means any physical or mental impairment;
‘injury’ includes poisoning;
‘non-verified user’ means a user who has not verified their identity to the provider of a service (see section 58(1));
‘race’ includes colour, nationality, and ethnic or national origins.”
This amendment inserts definitions of terms now used in clause 14.
Amendment (a), to Government amendment 16, after “mental impairment;” insert—
“‘hateful extremism’ means activity or materials directed at an out-group who are perceived as a threat to an in-group motivated by or intended to advance a political, religious or racial supremacist ideology—
(a) to create a climate conducive to hate crime, terrorism or other violence, or
(b) to attempt to erode or destroy the rights and freedoms protected by article 17 (Prohibition of abuse of rights) of Schedule 1 of the Human Rights Act 1998.”
Government amendment 17.
The Government recognise the importance of giving adult users greater choice about what they see online and who they interact with, while upholding users’ rights to free expression online. That is why we have removed the “legal but harmful” provisions from the Bill in relation to adults and replaced them with a fairer, simpler approach: the triple shield.
As I said earlier, the first shield will require all companies in scope to take preventive measures to tackle illegal content or activity. The second shield will place new duties on category 1 services to improve transparency and accountability, and protect free speech, by requiring them to adhere to their terms of service when restricting access to content or suspending or banning users. As I said earlier, user empowerment is the key third shield, empowering adults with greater control over their exposure to legal forms of abuse or hatred, or content that encourages, promotes or provides instructions for suicide, self-harm or eating disorders. That has been done while upholding and protecting freedom of expression.
Amendments 9 and 12 will strengthen the user empowerment duty, so that the largest companies are required to ensure that those tools are effective in reducing the likelihood of encountering the listed content or alerting users to it, and are easy for users to access. That will provide adult users with greater control over their online experience.
We are also setting out the categories of content that those user empowerment tools apply to in the Bill, through amendment 15. Adult users will be given the choice of whether they want to take advantage of those tools to have greater control over content that encourages, promotes or provides instructions for suicide, self-harm and eating disorders, and content that is abusive or incites hate against people on the basis of race, religion, sex, sexual orientation, disability, or gender reassignment. This is a targeted approach, focused on areas where we know that adult users—particularly those who are vulnerable or disproportionately targeted by online hate and abuse—would benefit from having greater choice.
As I said, the Government remain committed to free speech, which is why we have made changes to the adult safety duties. By establishing high thresholds for inclusion in those content categories, we have ensured that legitimate debate online will not be affected by the user empowerment duties.
I want to emphasise that the user empowerment duties do not require companies to remove legal content from their services; they are about giving individual adult users the option to increase their control over those kinds of content. Platforms will still be required to provide users with the ability to filter out unverified users, if they so wish. That duty remains unchanged. For the reasons that I have set out, I hope that Members can support Government amendments 8 to 17.
I turn to the amendments in the name of the hon. Member for Pontypridd to Government amendments 15 and 16. As I have set out in relation to Government amendments 8 to 17, the Government recognise the intent behind the amendments—to apply the user empowerment tools in clause 14(2) to a greater range of content categories. As I have already set out, it is crucial that a tailored approach is taken, so that the user empowerment tools stay in balance with users’ rights to free expression online. I am sympathetic to the amendments, but they propose categories of content that risk being either unworkable for companies or duplicative to the approach already set out in amendment 15.
The category of
“content that is harmful to health”
sets an extremely broad scope. That risks requiring companies to apply the tools in clause 14(2) to an unfeasibly large volume of content. It is not a proportionate approach and would place an unreasonable burden on companies. It might also have concerning implications for freedom of expression, as it may capture important health advice. That risks, ultimately, undermining the intention behind the user empowerment tools in clause 14(2) by preventing users from accessing helpful content, and disincentivising users from using the features.
In addition, the category
“provides false information about climate change”
places a requirement on private companies to be the arbiters of truth on subjective and evolving issues. Those companies would be responsible for determining what types of legal content were considered false information, which poses a risk to freedom of expression and risks silencing genuine debate.
Did the Minister just say that climate change is subjective?
No—not about whether climate change is happening, but we are talking about a wide-ranging debate. “Provides false information”—how do the companies determine what is false? I am not talking about the binary question of whether climate change is happening, but climate change is a wide-ranging debate. “Provides false information” means that someone has to determine what is false and what is not. Basically, the amendment outsources that to the social media platforms. That is not appropriate.
Would that not also apply to vaccine efficacy? If we are talking about everything being up for debate and nothing being a hard fact, we are entering slightly strange worlds where we undo a huge amount of progress, in particular on health.
The amendment does not talk about vaccine efficacy; it talks about content that is harmful to health. That is a wide-ranging thing.
Order. I am getting increasingly confused. The Minister appears to be answering a debate on an amendment that has not yet been moved. It might be helpful to the Committee, for good debate, if the Minister were to come back with his arguments against the amendment not yet moved by the Opposition spokesperson, the hon. Member for Pontypridd, once she has actually moved it. We can then hear her reasons for it and he can reply.
It is a pleasure to serve under your chairship, Dame Angela. With your permission, I will take this opportunity to make some broad reflections on the Government’s approach to the new so-called triple-shield protection that we have heard so much about, before coming on to the amendment tabled in my name in the group.
Broadly, Labour is disappointed that the system-level approach to content that is harmful to adults is being stripped from the Bill and replaced with a duty that puts the onus on the user to keep themselves safe. As the Antisemitism Policy Trust among others has argued, the two should be able to work in tandem. The clause allows a user to manage what harmful material they see by requiring the largest or most risky service providers to provide tools to allow a person in effect to reduce their likelihood of encountering, or to alert them to, certain types of material. We have concerns about the overall approach of the Government, but Labour believes that important additions can be made to the list of content where user-empowerment tools must be in place, hence our amendment (a) to Government amendment 15.
In July, in a little-noticed written ministerial statement, the Government produced a prototype list of content that would be harmful to adults. The list included priority content that category 1 services need to address in their terms and conditions; online abuse and harassment—mere disagreement with another’s point of view would not reach the threshold for harmful content, and so would not be covered; circulation of real or manufactured intimate images without the subject’s consent; content promoting self-harm; content promoting eating disorders; legal suicide content; and harmful health content that is demonstrably false, such as urging people to drink bleach to cure cancer.
We have concerns about whether listing those harms in the Bill is the most effective mechanism, mostly because we feel that the list should be more flexible and able to change according to the issues of the day, but it is clear that the Government will continue to pursue this avenue despite some very worrying gaps. With that in mind, will the Minister clarify what exactly underpins that list if there have been no risk assessments? What was the basis for drawing up that specific list? Surely the Government should be implored to publish the research that determined the list, at the very least.
I recognise that the false communications offence has remained in the Bill, but the list in Government amendment 15 is not exhaustive. Without the additions outlined in our amendment (a) to amendment 15, the list will do little to tackle some of the most pressing harm of our time, some of which we have already heard about today.
I am pleased that the list from the written ministerial statement has more or less been reproduced in amendment 15, under subsection (2), but there is a key and unexplained omission that our amendment (a) to it seeks to correct: the absence of the last point, on harmful health content. Amendment (a) seeks to reinsert such important content into the Bill directly. It seems implausible that the Government failed to consider the dangerous harm that health misinformation can have online, especially given that back in July they seemed to have a grasp of its importance by including it in the original list.
We all know that health-related misinformation and disinformation can significantly undermine public health, as we have heard. We only have to cast our minds back to the height of the coronavirus pandemic to remind ourselves of how dangerous the online space was, with anti-vax scepticism being rife. Many groups were impacted, including pregnant women, who received mixed messages about the safety of covid vaccination, causing widespread confusion, fear and inaction. By tabling amendment (a) to amendment 15, we wanted to understand why the Government have dropped that from the list and on what exact grounds.
In addition to harmful health content, our amendment (a) to amendment 15 would also add to the list content that incites hateful extremism and provides false information about climate change, as we have heard. In early written evidence from Carnegie, it outlined how serious the threat of climate change disinformation is to the UK. Malicious actors spreading false information on social media could undermine collective action to combat the threats. At present, the Online Safety Bill is not designed to tackle those threats head on.
We all recognise that social media is an important source of news and information for many people, and evidence is emerging of its role in climate change disinformation. The Centre for Countering Digital Hate published a report in 2021 called “The Toxic Ten: How ten fringe publishers fuel 69% of digital climate change denial”, which explores the issue further. Further analysis of activity on Facebook around COP26 undertaken by the Institute for Strategic Dialogue demonstrates the scale of the challenge in dealing with climate change misinformation and disinformation. The research compared the levels of engagement generated by reliable, scientific organisations and climate-sceptic actors, and found that posts from the latter frequently received more traction and reach than the former, which is shocking. For example, in the fortnight in which COP26 took place, sceptic content garnered 12 times the level of engagement that authoritative sources did on the platform, and 60% of the sceptic posts analysed could be classified as actively and explicitly attacking efforts to curb climate change, which just goes to show the importance of ensuring that climate change disinformation is also included in the list in Government amendment 15.
Our two amendments—amendment (a) to amendment 15, and amendment (a) to amendment 16—seek to ensure that the long-standing omission from the Bill of hateful extremism is put right here as a priority. There is increasing concern about extremism leading to violence and death that does not meet the definition of terrorism. The internet and user-to-user services play a central role in the radicalisation process, yet the Online Safety Bill does not cover extremism.
Colleagues may be aware that Sara Khan, the former lead commissioner for countering extremism, provided a definition of extremism for the Government in February 2021, but there has been no response. The issue has been raised repeatedly by Members across the House, including by my hon. Friend the Member for Plymouth, Sutton and Devonport (Luke Pollard), following the tragic murders carried out by a radicalised incel in his constituency.
Amendment (a) to amendment 16 seeks to bring a formal definition of hateful extremism into the Bill and supports amendment (a) to amendment 15. The definition, as proposed by Sara Khan, who was appointed as Britain’s first countering extremism commissioner in 2018, is an important first step in addressing the gaps that social media platforms and providers have left open for harm and radicalisation.
Social media platforms have often been ineffective in removing other hateful extremist content. In November 2020, The Guardian reported that research from the Centre for Countering Digital Hate had uncovered how extremist merchandise had been sold on Facebook and Instagram to help fund neo-Nazi groups. That is just one of a huge number of instances, and it goes some way to suggest that a repeatedly inconsistent and ineffective approach to regulating extremist content is the one favoured by some social media platforms.
I hope that the Minister will seriously consider the amendments and will see the merits in expanding the list in Government amendment 15 to include these additional important harms.
I have talked a little already about these amendments, so let me sum up where I think we are. I talked about harmful health content and why it is not included. The Online Safety Bill will force social media companies to tackle health misinformation and disinformation online, where it constitutes a criminal offence. That includes the false communications offence, which would capture posts encouraging dangerous hoax cures, where the sender knows the information to be false and intends to cause harm, such as encouraging drinking bleach to cure cancer, which we heard about a little earlier.
The legislation is only one part of the wider Government approach to this issue. That approach includes the work of the counter-disinformation unit, which brings together cross-Government monitoring and analysis capabilities and engages with platforms directly to ensure that appropriate action is taken, in addition to the Government’s work to build users’ resilience to misinformation through media literacy.
Including harmful health content as a category risks requiring companies to apply the adult user empowerment tools to an unfeasibly large volume of content—way beyond just the vaccine efficacy that was mentioned. That has implications both for regulatory burden and for freedom of expression, as it may capture important health advice. Similarly, on climate change, the Online Safety Bill itself will introduce new transparency, accountability and free speech duties on category 1 services. If a platform says that certain types of content are not allowed, it will be held to account for their removal.
We recognised that there was a heightened risk of disinformation surrounding the COP26 summit. The counter-disinformation unit led by the Department for Digital, Culture, Media and Sport brought together monitoring and analysis capabilities across Government to understand disinformation that posed a risk to public safety or to delegates or that represented attempts at interference from malign actors. We are clear that free debate is essential to a democracy and that the counter-disinformation unit should not infringe upon political debate. Government already work closely with the major social media platforms to encourage them to collaborate at speed to remove disinformation as per their terms of service.
Amendment (a) to amendment 15 and amendment (a) to amendment 16 would create that new category of content that incites hateful extremism. That is closely aligned with the approach that the Government are already taking with amendment 15, specifically subsections (8C) and (8D), which create a category of content that is abusive or incites hate on the basis of race, religion, sex, sexual orientation, disability, or gender reassignment. Those conditions would likely capture the majority of the kinds of content that the hon. Members are seeking to capture through their hateful extremism category. For example, it would capture antisemitic abuse and conspiracy theories, racist abuse and promotion of racist ideologies.
Furthermore, where companies’ terms of service say that they prohibit or apply restrictions to the kind of content listed in the Opposition amendments, companies must ensure that those terms are consistently enforced. So much of this comes back to enforcement. They must also ensure that the terms of service are easily understandable.
If this is about companies enforcing what is in their terms of service for the use of their platforms, could it not create a perverse incentive for them to have very little in their terms of service? If they will be punished for not enforcing their terms of service, surely they will want them to be as lax as possible in order to limit their legal liability for enforcing them. Does the Minister follow?
I follow, but I do not agree. The categories of content in proposed new subsections (8C) and (8D), introduced by amendment 15, underpin a lot of this. I answered the question in an earlier debate when talking about the commercial impetus. I cannot imagine many mainstream advertisers wanting to advertise with a company that removed from its terms of service the exclusion of racial abuse, misogyny and general abuse. We have seen that commercial impetus really kicking in with certain platforms. For those reasons, I am unable to accept the amendments to the amendments, and I hope that the Opposition will not press them to a vote.
I am grateful for the opportunity to push the Minister further. I asked him whether he could outline where the list in amendment 15 came from. Will he publish the research that led him to compile that specific list of priority harms?
The definitions that we have taken are ones that strike the right balance and have a comparatively high threshold, so that they do not capture challenging and robust discussions on controversial topics.
Amendment 8 agreed to.
Amendments made: 9, in clause 14, page 14, line 5, after “to” insert “effectively”.
This amendment strengthens the duty in this clause by requiring that the systems or processes used to deal with the kinds of content described in subsections (8B) to (8D) (see Amendment 15) should be designed to effectively increase users’ control over such content.
Amendment 10, in clause 14, page 14, line 6, leave out from “encountering” to “the” in line 7 and insert
“content to which subsection (2) applies present on”.
This amendment inserts a reference to the kinds of content now relevant for this clause, instead of referring to priority content that is harmful to adults.
Amendment 11, in clause 14, page 14, line 9, leave out from “to” to end of line 10 and insert
“content present on the service that is a particular kind of content to which subsection (2) applies”.—(Paul Scully.)
This amendment inserts a reference to the kinds of content now relevant for this clause, instead of referring to priority content that is harmful to adults.
I beg to move amendment 102, in clause 14, page 14, line 12, leave out “made available to” and insert “in operation for”.
This amendment, and Amendment 103, relate to the tools proposed in Clause 14 which will be available for individuals to use on platforms to protect themselves from harm. This amendment specifically forces platforms to have these safety tools “on” by default.
I will speak briefly in favour of amendments 102 and 103. As I mentioned a few moments ago, legal but harmful content can act as the gateway to dangerous radicalisation and extremism. Such content, hosted by mainstream social media platforms, should not be permitted unchecked online. I appreciate that for children the content will be banned, but I strongly believe that such content should be hidden by default from all adult users, as the amendments would ensure.
The chain of events that leads to radicalisation, as I spelt out, relies on groups and individuals reaching people unaware that they are being radicalised. The content is posted in otherwise innocent Facebook groups, forums or Twitter threads. Adding a toggle, hidden somewhere in users’ settings, which few people know about or use, will do nothing to stop that. It will do nothing to stop the harmful content from reaching vulnerable and susceptible users.
We, as legislators, have an obligation to prevent, at root, that harmful content from reaching and drawing in those vulnerable and susceptible to the misinformation and conspiracy theories spouted by vile groups and individuals wishing to spread their harm. The only way that we can make meaningful progress is by putting the responsibility squarely on platforms, to ensure that by default users do not come across the content in the first place.
In the previous debate, I talked about amendment 15, which brought in a lot of protections against content that encourages and promotes, or provides instruction for, self-harm, suicide or eating disorders, and against content that is abusive or incites hate on the basis of race, religion, disability, sex, gender reassignment or sexual orientation. We have also placed a duty on the largest platforms to offer adults the option to filter out unverified users if they so wish. That is a targeted approach that reflects areas where vulnerable users in particular could benefit from having greater choice and control. I come back to the fact that that is the third shield and an extra safety net. A lot of the extremes we have heard about, which have been used as debating points, as important as they are, should very much be wrapped up by the first two shields.
We have a targeted approach, but it is based on choice. It is right that adult users have a choice about what they see online and who they interact with. It is right that this choice lies in the hands of those adults. The Government mandating that these tools be on by default goes against the central aim of users being empowered to choose for themselves whether they want to reduce their engagement with some kinds of legal content.
We have been clear right from the beginning that it is not the Government’s role to say what legal content adults should or should not view online or to incentivise the removal of legal content. That is why we removed the adult legal but harmful duties in the first place. I believe we are striking the right balance between empowering adult users online and protecting freedom of expression. For that reason, I am not able to accept the amendments from the hon. Member for Pontypridd.
It is disappointing that the Government are refusing to back these amendments to place the toggle as “on” by default. It is something that we see as a safety net, as the Minister described. Why would someone have to choose to have the safety net there? If someone does not want it, they can easily take it away. The choice should be that way around, because it is there to protect all of us.
My hon. Friend makes a very good point. It goes to show the nature of this as a protection for all of us, even MPs, from accessing content that could be harmful to our health or, indeed, profession. Given the nature of the amendment, we feel that this is a safety net that should be available to all. It should be on by default.
I should say that, in the spirit of choice, companies can also choose to default it to be switched on in the first place as well.
The Minister makes the point that companies can choose to have it on by default, but we would not need this Bill in the first place if companies did the right thing. Let us be clear: we would not have had to be here debating this for the past five years—for me it has been 12 months—if companies were going to do the right thing and protect people from harmful content online. On that basis, I will push the amendments to a vote.
Question put, That the amendment be made.
I beg to move amendment 101, in clause 14, page 14, line 17, at end insert—
“(6A) A duty to ensure features and provisions in subsections (2), (4) and (6) are accessible and understandable to adult users with learning disabilities.”
This amendment creates a duty that user empowerment functions must be accessible and understandable to adult users with learning disabilities.
This issue was originally brought to my attention by Mencap. It is incredibly important, and it has potentially not been covered adequately by either our previous discussions of the Bill or the Bill itself. The amendment is specifically about ensuring that available features are accessible to adult users with learning disabilities. An awful lot of people use the internet, and people should not be excluded from using it and having access to safety features because they have a learning disability. That should not be the case, for example, when someone is trying to find how to report something on a social media platform. I had an absolute nightmare trying to report a racist gif that was offered in the list of gifs that came up. There is no way to report that racist gif to Facebook, because Facebook does not take responsibility for it, and GIPHY does not take responsibility for it because it might not be a GIPHY gif.
It is difficult to find the ways to report some of this stuff and to find some of the privacy settings. Even when someone does find the privacy settings, on a significant number of these platforms they do not make much sense—they are not understandable. I am able to read fairly well, I would think, and I am able to speak in the House of Commons, but I still do not understand some of the stuff in the privacy features found on some social media sites. I cannot find how to toggle off things that I want to toggle off on the level of accessibility or privacy that I have, particularly on social media platforms; I will focus on those for the moment. The Bill will not achieve even its intended purpose if all people using these services cannot access or understand the safety features and user empowerment tools.
I am quite happy to talk about the difference between the real world and the online world. My online friends have no problem with me talking about the real world as if it is something different, because it is. In the real world, we have a situation where things such as cuckooing take place and people take advantage of vulnerable adults. Social services, the police and various organisations are on the lookout for that and try to do what they can to put protections in place. I am asking for more parity with the real world here. Let us ensure that we have the protections in place, and that people who are vulnerable and taken advantage of far too often have access to those tools in order to protect themselves. It is a particularly reasonable request.
Let us say that somebody with a learning disability particularly likes cats; the Committee may have worked out that I also particularly like cats. Let us say that they want to go on TikTok or YouTube and look at videos of cats. They have to sign up to watch videos of cats. They may not have the capacity or understanding to know that there might be extreme content on those sites. They may not be able to grasp that. It may never cross their minds that there could be extreme content on that site. When they are signing up to TikTok, they should not have to go and find the specific toggle to switch off eating disorder content. All they had thought about was that this is a cool place to look at videos of cats.
I am happy to do that. In the same way that we spoke this morning about children’s protection, I am very aware of the terms of service and what people are getting into by looking for cats or whatever they want to do.
The Bill requires providers to make all the user empowerment and protection tools available to all adults, including those with learning disabilities. Clause 14(4) makes it explicitly clear that features offered by providers, in compliance with the duty to give users greater control over the content that they see, must be made available to all adult users. Clause 14(5) further outlines that providers must have clear and accessible terms of service about what tools are offered in their service and how users may take advantage of them. We have also strengthened the accessibility of the user empowerment duties through Government amendment 12, to make sure that user empowerment tools and features are easy for users to access.
In addition, clause 58(1) says that providers must offer all adult users the option to verify themselves so that vulnerable users, including those with learning disabilities, are not at a disadvantage as a result of the user empowerment duties. Clause 59(2) and (3) further stipulate that in producing the guidance for providers about the user verification duty, Ofcom must have particular regard to the desirability of making identity verification available to vulnerable adult users, and must consult with persons who represent the interests of vulnerable adult users. That is about getting the thoughts of experts and advocates into their processes to make sure that they can enforce what is going on.
In addition, Ofcom is subject to the public sector equality duty, so it will have to take into account the ways in which people with disabilities may be impacted when performing its duties, such as writing its codes of practice for the user empowerment duty. I hope the hon. Member will appreciate the fact that, in a holistic way, that covers the essence of exactly what she is trying to do in her amendment, so I do not believe her amendment is necessary.
I beg to move amendment 19, in clause 18, page 19, line 32, leave out from “also” to second “section”.
This is a technical amendment relating to Amendment 20.
With this it will be convenient to discuss the following:
Government amendments 20 and 21, 26 and 27, 30, 34 and 35, 67, 71, 46 and 47, 50, 53, 55 to 57, and 95.
Government new clause 3—Duty not to act against users except in accordance with terms of service.
Government new clause 4—Further duties about terms of service.
Government new clause 5—OFCOM’s guidance about duties set out in sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service).
Government new clause 6—Interpretation of this Chapter.
I am seeking to impose new duties on category 1 services to ensure that they are held accountable to their terms of service and to protect free speech. Under the status quo, companies get to decide what we do and do not see online. They can arbitrarily ban users or remove their content without offering any form of due process and with very few avenues for users to achieve effective redress. On the other hand, companies’ terms of service are often poorly enforced, if at all.
I have mentioned before the horrendous abuse suffered by footballers around the Euro 2020 final, despite most platforms’ terms and conditions clearly not allowing that sort of content. There are countless similar instances—relating, for example, to antisemitic abuse, as we have heard, and other forms of hate speech—that fall below the criminal threshold.
This group of amendments relates to a series of new duties that will fundamentally reset the relationship between platforms and their users. The duties will prevent services from arbitrarily removing content or suspending users without offering users proper avenues to appeal. At the same time, they will stop companies making empty promises to their users about their terms of service. The duties will ensure that where companies say they will remove content or ban a user, they actually do.
Government new clause 3 is focused on protecting free speech. It would require providers of category 1 services to remove or restrict access to content, or ban or suspend users, only where this is consistent with their terms of service. Ofcom will oversee companies’ systems and processes for discharging those duties, rather than supervising individual decisions.
I am grateful for what the Minister has said, and glad that Ofcom will have a role in seeing that companies do not remove content that is not in breach of terms of service where there is no legal requirement to do so. In other areas of the Bill where these duties exist, risk assessments are to be conducted and codes of practice are in place. Will there similarly be risk assessments and codes of practice to ensure that companies comply with their freedom of speech obligations?
Absolutely. As I say, it is really important that people understand right at the beginning, through risk assessments, what they are signing up for and what they can expect. To come back to the point of whether someone is an adult or a child, it is really important that parents lean in when it comes to children’s protections; that is a very important tool in the armoury.
New clause 4 will require providers of category 1 services to ensure that what their terms of service say about their content moderation policies is clear and accessible. Those terms have to be easy for users to understand and should have sufficient detail so that users know what to expect in relation to moderation actions. Providers of category 1 services must apply their terms of service consistently, and they must have in place systems and processes that enable them to enforce their terms of service consistently.
These duties will give users the ability to report any content or account that they suspect does not meet a platform’s terms of service. They will also give users the ability to make complaints about platforms’ moderation actions, and raise concerns if their content is removed in error. Providers will be required to take appropriate action in response to complaints. That could include removing content that they prohibit, or reinstating content removed in error. These duties ensure that providers are made aware of issues to do with their services and require them to take action to resolve them, to keep users safe, and to uphold users’ rights to free speech.
The duties set out in new clauses 3 and 4 will not apply to illegal content, content that is harmful to children or consumer content. That is because illegal content and content that is harmful to children are covered by existing duties in the Bill, and consumer content is already regulated under consumer protection legislation. Companies will also be able to remove any content where they have a legal obligation to do so, or where the user is committing a criminal offence, even if that is not covered in their terms of service.
New clause 5 will require Ofcom to publish guidance to help providers of category 1 services to understand what they need to do to comply with their new duties. That could include guidance on how to make their terms of service clear and easy for users to understand, and how to operate an effective reporting and redress mechanism. The guidance will not prescribe what types of content companies should include in their terms of service, or how they should treat such content. That will be for companies to decide, based on their knowledge of their users, and their brand and commercial incentives, and subject to their other legal obligations.
New clause 6 clarifies terms used in new clauses 3 and 4. It also includes a definition of “Consumer content”, which is excluded from the main duties in new clauses 3 and 4. This covers content that is already regulated by the Competition and Markets Authority and other consumer protection bodies, such as content that breaches the Consumer Protection from Unfair Trading Regulations 2008. These definitions are needed to provide clarity to companies seeking to comply with the duties set out in new clauses 3 and 4.
The remaining amendments to other provisions in the Bill are consequential on the insertion of these new transparency, accountability and free speech duties. They insert references to the new duties in, for example, the provisions about content reporting, enforcement, transparency and reviewing compliance. That will ensure that the duties apply properly to the new measures.
Amendment 30 removes the duty on platforms to include clear and accessible provisions in their terms of service informing users that they have a right of action in court for breach of contract if a platform removes or restricts access to their content in violation of its terms of service. This is so that the duty can be moved to new clause 4, which focuses on ensuring that platforms comply with their terms of service. The replacement duty in new clause 4 will go further than the original duty, in that it will cover suspensions and bans of users as well as restrictions on content.
Amendments 46 and 47 impose a new duty on Ofcom to have regard to the need for it to be clear to providers of category 1 services what they must do to comply with their new duties. These amendments will also require Ofcom to have regard to the extent to which providers of category 1 services are demonstrating, in a transparent and accountable way, how they are complying with their new duties.
Lastly, amendment 95 temporarily exempts video-sharing platforms that are category 1 services from the new terms of service duties, as set out in new clauses 3 and 4, until the Secretary of State agrees that the Online Safety Bill is sufficiently implemented. This approach simultaneously maximises user protections, through the temporary continuation of the VSP regime, and minimises burdens for services and Ofcom. The changes are central to the Government’s intention to hold companies accountable for their promises. They will protect users in a way that is in line with companies’ terms of service. They are a critical part of the triple shield, which aims to protect adults online. The triple shield ensures that users are safe by requiring companies to remove illegal content, enforce their terms of service and provide users with tools to control their online experiences. Equally, these changes prevent arbitrary or random content removal, which helps to protect pluralistic and robust debate online. For those reasons, I hope that Members can support the amendments.
This is an extremely large grouping so, for the sake of the Committee, I will do my best to keep my comments focused and brief where possible. I begin by addressing Government new clauses 3 and 4 and the consequential amendments.
Government new clause 3 introduces new duties that aim to ensure that the largest or most risky online service providers design systems and processes that ensure they cannot take down or restrict content in a way that prevents a person from seeing it without further action by that user, or ban users, except in accordance with their own terms of service, or if the content breaks the law or contravenes the Online Safety Bill regime. This duty is referred to as the duty not to act against users except in accordance with terms of service. In reality, that will mean that the focus remains far too much on the banning, taking down and restriction of content, rather than on the systems and processes behind the platforms that perpetuate harm.
Labour has long held the view that the Government have gone down an unhelpful cul-de-sac on free speech. Instead of focusing on defining exactly which content is or is not harmful, the Bill should be focused on the processes by which harmful content is amplified on social media. We must recognise that a person posting a racist slur online that nobody notices, shares or reads is significantly less harmful than a post that can quickly go viral, and can within hours gain millions of views or shares. We have talked a lot in this place about Kanye West and the comments he has made on Twitter in the past few weeks. It is safe to say that a comment by Joe Bloggs in Hackney that glorifies Hitler does not have the same reach or produce the same harm as Kanye West saying exactly the same thing to his 30 million Twitter followers.
Our approach has the benefit of addressing the things that social media companies can control—for example, how content spreads—rather than the things they cannot control, such as what people say online. It reduces the risk to freedom of speech because it tackles how content is shared, rather than relying entirely on taking down harmful content. Government new clause 4 aims to improve the effectiveness of platforms’ terms of service in conjunction with the Government’s new triple shield, which the Committee has heard a lot about, but the reality is that they are ultimately seeking to place too much of the burden of protection on extremely flexible and changeable terms of service.
If a provider’s terms of service say that certain types of content are to be taken down or restricted, then providers must run systems and processes to ensure that that can happen. Moreover, people must be able to report breaches easily, through a complaints service that delivers appropriate action, including when the service receives complaints about the provider. This “effectiveness” duty is important but somewhat misguided.
The Government, having dropped some of the “harmful but legal” provisions, seem to expect that if large and risky services—the category 1 platforms—claim to be tackling such material, they must deliver on that promise to the customer and user. This reflects a widespread view that companies may pick and choose how to apply their terms of service, or implement them loosely and interchangeably, as we have heard. Those failings will lead to harm when people encounter things that they would not have thought would be there when they signed up. All the while, service providers that do not fall within category 1 need not enforce their terms of service, or may do so erratically or discriminatorily. That includes search engines, no matter how big.
This large bundle of amendments seems to do little to actually keep people safe online. I have already made my concerns about the Government’s so-called triple shield approach to internet safety clear, so I will not repeat myself. We fundamentally believe that the Government’s approach, which places too much of the onus on the user rather than the platform, is wrong. We therefore cannot support the approach that is taken in the amendments. That being said, the Minister can take some solace from knowing that we see the merits of Government new clause 5, which
“requires OFCOM to give guidance to providers about complying with the duties imposed by NC3 and NC4”.
If this is the avenue that the Government insist on going down, it is absolutely vital that providers are advised by Ofcom on the processes they will be required to take to comply with these new duties.
Amendment 19 agreed to.
Amendment made: 20, in clause 18, page 19, line 33, at end insert
“, and
(b) section (Further duties about terms of service)(5)(a) (reporting of content that terms of service allow to be taken down or restricted).”—(Paul Scully.)
This amendment inserts a signpost to the new provision about content reporting inserted by NC4.
Clause 18, as amended, ordered to stand part of the Bill.
Clause 19
Duties about complaints procedures
Amendment made: 21, in clause 19, page 20, line 15, leave out “, (3) or (4)” and insert “or (3)”.—(Paul Scully.)
This amendment removes a reference to clause 20(4), as that provision is moved to NC4.
I beg to move amendment 22, in clause 19, page 20, line 27, leave out from “down” to “and” in line 28 and insert
“or access to it being restricted, or given a lower priority or otherwise becoming less likely to be encountered by other users,”.
NC2 states what is meant by restricting users’ access to content, and this amendment makes a change in line with that, to avoid any implication that downranking is a form of restriction on access to content.
With this it will be convenient to discuss the following:
Government amendment 59.
Government new clause 2—Restricting users’ access to content.
These amendments clarify the meaning of “restricting access to content” and “access to content” for the purposes of the Bill. Restricting access to content is an expression that is used in various provisions across the Bill, such as in new clause 2, under which providers of category 1 services will have a duty to ensure that they remove or restrict access to users’ content only where that is in accordance with their terms of service or another legal obligation. There are other such references in clauses 15, 16 and 17.
The amendments make it clear that the expression
“restricting users’ access to content”
covers cases where a provider prevents a user from accessing content without that user taking a prior step, or where content is temporarily hidden from a user. They also make it clear that this expression does not cover any restrictions that the provider puts in place to enable users to apply user empowerment tools to limit the content that they encounter, or cases where access to content is controlled by another user, rather than by the provider.
The amendments are largely technical, but they do cover things such as down-ranking. Amendment 22 is necessary because the previous wording of this provision wrongly suggested that down-ranking was covered by the expression “restricting access to content”. Down-ranking is the practice of giving content a lower priority on a user’s feed. The Government intend that users should be able to complain if they feel that their content has been inappropriately down-ranked as a result of the use of proactive technology. This amendment ensures consistency.
I hope that the amendments provide clarity as to the meaning of restricting access to content for those affected by the Bill, and assist providers with complying with their duties.
Again, I will keep my comments on clause 19 brief, as we broadly support the intentions behind the clause and the associated measures in the grouping. My hon. Friend the Member for Worsley and Eccles South (Barbara Keeley) spoke at length about this important clause, which relates to the all-important complaints procedures available around social media platforms and companies, in the previous Bill Committee.
During the previous Committee, Labour tabled amendments that would have empowered more individuals to make a complaint about search content in the event of non-compliance. In addition, we wanted an external complaints option for individuals seeking redress. Sadly, all those amendments were voted down by the last Committee, but I must once again press the Minister on those points, particularly in the context of the new amendments that have been tabled.
Without redress for individual complaints, once internal mechanisms have been exhausted, victims of online abuse could be left with no further options. Consumer protections could be compromised and freedom of expression, with which the Government seem to be borderline obsessed, could be infringed for people who feel that their content has been unfairly removed.
Government new clause 2 deals with the meaning of references to
“restricting users’ access to content”,
in particular by excluding restrictions resulting from the use of user empowerment tools as described in clause 14. We see amendments 22 and 59 as important components of new clause 2, and are therefore more than happy to support them. However, I reiterate to the Minister and place on the record once again the importance of introducing an online safety ombudsman, which we feel is crucial to new clause 2. The Joint Committee recommended the introduction of such an ombudsman, who would consider complaints when internal routes of redress had not resulted in resolution, had failed to address risk and had led to significant and demonstrable harm. As new clause 2 relates to restricting users’ access to content, we must also ensure that there is an appropriate channel for complaints if there is an issue that users wish to take up around restrictions in accessing content.
By now, the Minister will be well versed in my thoughts on the Government’s approach, and on the reliance on the user empowerment tool approach more broadly. It is fundamentally an error to pursue a regime that is so content-focused. Despite those points, we see the merits in Government amendments 22 and 59, and in new clause 2, so have not sought to table any further amendments at this stage.
I am slightly confused, and would appreciate a little clarification from the Minister. I understand what new clause 2 means; if the hon. Member for Pontypridd says that she does not want to see content of a certain nature, and I put something of that nature online, I am not being unfairly discriminated against in any way because she has chosen to opt out of receiving that content. I am slightly confused about the downgrading bit.
I know that an awful lot of platforms use downgrading when there is content that they find problematic, or something that they feel is an issue. Rather than taking that content off the platform completely, they may just no longer put it in users’ feeds, for example; they may move it down the priority list, and that may be part of what they already do to keep people safe. I am not trying to criticise what the Government are doing, but I genuinely do not understand whether that downgrading would still be allowed, whether it would be an issue, and whether people could complain about their content being downgraded because the platform was a bit concerned about it, and needed to check it out and work out what was going on, or if it was taken off users’ feeds.
Some companies, if they think that videos have been uploaded by people who are too young to use the platform, or by a registered child user of the platform, will not serve that content to everybody’s feeds. I will not be able to see something in my TikTok feed that was published by a user who is 13, for example, because there are restrictions on how TikTok deals with and serves that content, in order to provide increased protection and the safety that they want on their services.
Will it still be acceptable for companies to have their own internal downgrading system, in order to keep people safe, when content does not necessarily meet an illegality bar or child safety duty bar? The Minister has not used the phrase “market forces”; I think he said “commercial imperative”, and he has talked a lot about that. Some companies and organisations use downgrading to improve the systems on their site and to improve the user experience on the platform. I would very much appreciate it if the Minister explained whether that will still be the case. If not, will we all have a worse online experience as a result?
I will have a go at that, but I am happy to write to the hon. Lady if I do not respond as fully as she wants. Down-ranking content is a moderation action, as she says, but it is not always done just to restrict access to content; there are many reasons why people might want to do it. Through these changes, we are saying that the content is not actually being restricted; it can still be seen if it is searched for or otherwise encountered. That is consistent with the clarification.
This is quite an important point. The hon. Member for Aberdeen North was talking about recommendation systems. If a platform chooses not to amplify content, that is presumably not covered. As long as the content is accessible, someone could search and find it. That does not inhibit a platform’s decision, for policy reasons or whatever, not to actively promote it.
Absolutely. There are plenty of reasons why platforms will rank users’ content, including down-ranking it; providing personalised content recommendations will involve that same process. It is not practical to specify that restricting access includes down-ranking. That is why we made that change.
Amendment 22 agreed to.
Amendments made: 23, in clause 19, page 21, line 7, leave out from “The” to “complaints” in line 10 and insert
“relevant kind of complaint for Category 1 services is”.
This amendment is consequential on the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 24, in clause 19, page 21, line 12, leave out sub-paragraph (i).
This amendment is consequential on Amendment 7 (removal of clause 13).
Amendment 25, in clause 19, page 21, line 18, leave out paragraphs (c) and (d).
This amendment is consequential on the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 26, in clause 19, page 21, line 33, leave out from “also” to second “section”.
This is a technical amendment relating to Amendment 27.
Amendment 27, in clause 19, page 21, line 34, at end insert
“, and
(b) section (Further duties about terms of service)(6) (complaints procedure relating to content that terms of service allow to be taken down or restricted).”—(Paul Scully.)
This amendment inserts a signpost to the new provision about complaints procedures inserted by NC4.
Clause 19, as amended, ordered to stand part of the Bill.
Clause 20
Duties about freedom of expression and privacy
I beg to move amendment 28, in clause 20, page 21, line 42, after “have” insert “particular”.
This amendment has the result that providers of regulated user-to-user services must have particular regard to freedom of expression when deciding on and implementing safety measures and policies.
With this it will be convenient to discuss Government amendments 29, 31, 36 to 38 and 40.
I will be brief. The rights to freedom of expression and privacy are essential to our democracy. We have long been clear that the Bill must not interfere with those rights. The amendments will further strengthen protections for freedom of expression and privacy and ensure consistency in the Bill. They require regulated user-to-user and search services to have particular regard to freedom of expression and privacy when deciding on and implementing their safety measures and policies.
Amendments 28, 29 and 31 mean that service providers will need to thoroughly consider the impact that their safety and user empowerment measures have on users’ freedom of expression and privacy. That could mean, for example, providing detailed guidance and training for human reviewers about content that is particularly difficult to assess. Amendments 36 and 37 apply that to search services in relation to their safety duties. Ofcom can take enforcement action against services that fail to comply with those duties and will set out steps that platforms can take to safeguard freedom of expression and privacy in their codes of practice.
Those changes will not detract from platforms’ illegal content and child protection duties. Companies must tackle illegal content and ensure that children are protected on their services, but the amendments will protect against platforms taking an over-zealous approach to removing content or undermining users’ privacy when complying with their duties. Amendments 38 and 40 ensure that the rest of the Bill is consistent with those changes. The new duties will therefore ensure that companies give proper consideration to users’ rights when complying with them, and that that is reflected in Ofcom’s codes, providing greater clarity to companies.
Amendment 28 agreed to.
Amendments made: 29, in clause 20, page 22, line 2, after “have” insert “particular”.
This amendment has the result that providers of regulated user-to-user services must have particular regard to users’ privacy when deciding on and implementing safety measures and policies.
Amendment 30, in clause 20, page 22, line 6, leave out subsection (4).
This amendment removes clause 20(4), as that provision is moved to NC4.
Amendment 31, in clause 20, page 22, line 37, leave out paragraph (c) and insert—
“(c) section 14 (user empowerment),”.—(Paul Scully.)
The main effect of this amendment is that providers must consider freedom of expression and privacy issues when deciding on measures and policies to comply with clause 14 (user empowerment). The reference to clause 14 replaces the previous reference to clause 13 (adults’ safety duties), which is now removed (see Amendment 7).
Question proposed, That the clause, as amended, stand part of the Bill.
I will speak broadly to clause 20, as it is an extremely important clause, before making remarks about the group of Government amendments we have just voted on.
Clause 20 is designed to provide a set of balancing provisions that will require companies to have regard to freedom of expression and privacy when they implement their safety duties. However, as Labour has repeatedly argued, it is important that companies cannot use privacy and free expression as a basis to argue that they can comply with regulations in less substantive ways. That is a genuine fear here.
We all want to see a Bill in place that protects free speech, but that cannot come at the expense of safety online. The situation with regards to content that is harmful to adults has become even murkier with the Government’s attempts to water down the Bill and remove adult risk assessments entirely.
The Minister must acknowledge that there is a balance to be achieved. We all recognise that. The truth is—and this is something that his predecessor, or should I say his predecessor’s predecessor, touched on when we considered this clause in the previous Bill Committee—that at the moment platforms are extremely inconsistent in their approach to getting the balance right. Although Labour is broadly supportive of this clause and the group of amendments, we feel that now is an appropriate time to put on record our concerns over the important balance between safety, transparency and freedom of expression.
Labour has genuine concerns over the future of platforms’ commitment to retaining that balance, particularly if the behaviours following the recent takeover of Twitter by Elon Musk are anything to go by. Since Elon Musk took over ownership of the platform, he has repeatedly used Twitter polls, posted from his personal account, as metrics to determine public opinion on platform policy. The general amnesty policy and the reinstatement of Donald Trump both emerged from such polls.
According to former employees, those polls are not only inaccurate representations of the platform’s user base, but are actually
“designed to be spammed and gamed”.
The polls are magnets for bots and other inauthentic accounts. This approach and the reliance on polls have allowed Elon Musk to enact and dictate his platform’s policy on moderation and freedom of expression. Even if he is genuinely trusting the results of these polls and not gaming them, they neither accurately represent the user base nor reflect best practice for confronting disinformation and harm online.
Elon Musk uses the results to claim that “the people have spoken”, but they have not. Research from leading anti-hate organisation the Anti-Defamation League shows that far-right extremists and neo-Nazis encouraged supporters to actively re-join Twitter to vote in these polls. The impacts of platforming neo-Nazis on Twitter do not need to be stated. Such users are explicitly trying to promote violent and hateful agendas, and they were banned initially for that exact reason. The bottom line is that those people were banned in line with Twitter’s terms of service at the time, and they should not be re-platformed just because of the findings of one Twitter poll.
These issues are at the very heart of Labour’s concerns in relation to the Bill—that the duties around freedom of expression and privacy will be different for those at the top of the platforms. We support the clause and the group of amendments, but I hope the Minister will be able to address those concerns in his remarks.
I endorse the general approach set out by the hon. Member for Pontypridd. We do not want to define freedom of speech based on a personal poll carried out on one platform. That is exactly why we are enshrining it in this ground-breaking Bill.
We want to get the balance right. I have talked about the protections for children. We also want to protect adults and give them the power to understand the platforms they are on and the risks involved, while having regard for freedom of expression and privacy. That is a wider approach than one man’s Twitter feed. These clauses are important to ensure that the service providers interpret and implement their safety duties in a proportionate way that limits negative impact on users’ rights to freedom of expression. However, they also have to have regard to the wider definition of freedom of expression, while protecting users, which the rest of the Bill covers in a proportionate way.
This goes to the heart of more than just one person’s Twitter feed, although we could say that that person is an incredibly powerful and influential figure on the platform. In the past 24 hours, Twitter has disbanded its trust and safety council. Members of that council included expert groups working to tackle harassment and child sexual exploitation, and to promote human rights. Does the Minister not feel that the council being disbanded goes to the heart of what we have been debating? It shows how a platform can remove its terms of service or change them at whim in order to prevent harm from being perpetrated on that platform.
I will come back to some of the earlier points. At the end of the day, when platforms change their terms and conditions, which they are free to do, they will be judged by their users and indeed the advertisers from whom they make their money. There are market forces—I will use that phrase as well as “commercial imperative”, to get that one in there—that will drive behaviour. It may be the usability of Facebook, or Twitter’s terms and conditions and the approach of its new owner, that drives users away from those platforms to alternatives. I am old enough to remember Myspace, CompuServe and AOL, which tried to box people into their walled gardens. What happened to them? Only yesterday, someone from Google was saying that the new artificial intelligence chatbot—ChatGPT—may well disrupt Google. These companies, as big as they are, do not have a right to exist. They have to keep innovating. If they get it wrong, then they get it wrong.
Does my hon. Friend agree that this is why the Bill is structured in the way it is? We have a wide range of priority illegal offences that companies have to meet, so it is not down to Elon Musk to determine whether he has a policy on race hate. They have to meet the legal standards set, and that is why it is so important to have that wide range of priority illegal offences. If companies go beyond that and have higher safety standards in their terms of service, that is checked as well. However, a company cannot avoid its obligations simply by changing its terms of service.
My hon. Friend is absolutely right. We are putting in those protections, but we want companies to have due regard to freedom of speech.
I want to clarify a point that my hon. Friend made earlier about guidance on the new accountability, transparency and free speech duties. Companies will be free to set any terms of service that they want to, subject to their other legal obligations. That is related to the conversations that we have just been having. Those duties are there to ensure that companies properly enforce their terms of service, and do not remove content or ban users except in accordance with those terms. There will be no platform risk assessments or codes of practice associated with those new duties. Instead, Ofcom will issue guidance on how companies can comply with their duties, rather than codes of practice. That guidance will focus on how companies set their terms of service, but companies will not be required to set terms directly for specific types of content or cover risks. I hope that is clear.
To answer the point made by the hon. Member for Pontypridd, I agree with the overall sentiment about how we need to protect freedom of expression.
I want to be clear on my point. My question was not related to how platforms set their terms of service, which is a matter for them and they are held to account for that. If we are now bringing in requirements to say that companies cannot go beyond terms of service or their duties in the Bill if they are going to moderate content, who will oversee that? Will Ofcom have a role in checking whether platforms are over-moderating, as the Minister referred to earlier? In that case, where those duties exist elsewhere in the Bill, we have codes of practice in place to make sure it is clear what companies should and should not do. We do not seem to be doing that with this issue.
Absolutely. We have captured that in other parts of the Bill, but I wanted to make that specific bit clear because I am not sure whether I understood or answered my hon. Friend’s question correctly at the time.
Question put and agreed to.
Clause 20, as amended, accordingly ordered to stand part of the Bill.
Clause 21
Record-keeping and review duties
Amendments made: 32, in clause 21, page 23, line 5, leave out “, 10 or 12” and insert “or 10”.
This amendment is consequential on Amendment 6 (removal of clause 12).
Amendment 33, in clause 21, page 23, line 45, leave out paragraph (c).
This amendment is consequential on Amendment 7 (removal of clause 13).
Amendment 34, in clause 21, page 24, line 6, leave out “section” and insert “sections”.
This amendment is consequential on Amendment 35.
Amendment 35, in clause 21, page 24, line 6, at end insert—
“, (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service) (duties about terms of service).”—(Paul Scully.)
This amendment ensures that providers have a duty to review compliance with the duties set out in NC3 and NC4 regularly, and after making any significant change to the design or operation of the service.
Question proposed, That the clause, as amended, stand part of the Bill.
Given that there are few changes to this clause from when the Bill was amended in the previous Public Bill Committee, I will be brief. We in the Opposition are clear that record-keeping and review duties on in-scope services make up an important function of the regulatory regime and sit at the very heart of the Online Safety Bill. We must push platforms to transparently report all harms identified and the action taken in response, in line with regulation.
Specifically on the issue that was just raised, there were two written ministerial statements on the Online Safety Bill. The first specifically said that an amendment would
“require the largest platforms to publish summaries of their risk assessments for illegal content and material that is harmful to children, to allow users and empower parents to clearly understand the risks presented by these services and the approach platforms are taking to children’s safety”.—[Official Report, 29 November 2022; Vol. 723, c. 31WS.]
Unless I have completely missed an amendment that has been tabled for this Committee, my impression is that that amendment will be tabled in the Lords and that details will be made available about how exactly the publishing will work and which platforms will be required to publish.
I would appreciate it if the Minister could provide more clarity about what that might look like, and about which platforms might have to publish their assessments. I appreciate that that will be scrutinised in the Lords but, to be fair, this is the second time that the Bill has been in Committee in the Commons. It would be helpful if we could be a bit more sighted on what exactly the Government intend to do—meaning more than the handful of lines in a written ministerial statement—because then we would know whether the proposal is adequate, or whether we would have to ask further questions in order to draw it out and ensure that it is published in a certain form. The more information the Minister can provide, the better.
I think we all agree that written records are hugely important. They are important as evidence in cases where Ofcom is considering enforcement action, and a company’s compliance review should be done regularly, especially before they make changes to their service.
The Bill does not intend to place excessive burdens on small and low-risk businesses. As such, clause 21 provides Ofcom with the power to exempt certain types of service from the record-keeping and review duties. However, the details of any exemptions must be published.
To half-answer the point made by the hon. Member for Aberdeen North, the measures will be brought to the Lords, but I will endeavour to keep her up to date as best we can so that we can continue the conversation. We have served together on several Bill Committees, including on technical Bills that required us to spend several days in Committee—although they did not come back for re-committal—so I will endeavour to keep her and, indeed, the hon. Member for Pontypridd, up to date with developments.
Question put and agreed to.
Clause 21, as amended, accordingly ordered to stand part of the Bill.
Clause 30
Duties about freedom of expression and privacy
Amendments made: 36, in clause 30, page 31, line 31, after “have” insert “particular”.
This amendment has the result that providers of regulated search services must have particular regard to freedom of expression when deciding on and implementing safety measures and policies.
Amendment 37, in clause 30, page 31, line 34, after “have” insert “particular”.—(Paul Scully.)
This amendment has the result that providers of regulated search services must have particular regard to users’ privacy when deciding on and implementing safety measures and policies.
Clause 30, as amended, ordered to stand part of the Bill.
Clause 46
Relationship between duties and codes of practice
Amendments made: 38, in clause 46, page 44, line 27, after “have” insert “particular”.
This amendment has the result that providers of services who take measures other than those recommended in codes of practice in order to comply with safety duties must have particular regard to freedom of expression and users’ privacy.
Amendment 39, in clause 46, page 45, line 12, leave out paragraph (c).
This amendment is consequential on Amendment 7 (removal of clause 13).
Amendment 40, in clause 46, page 45, line 31, at end insert “, or
(ii) a duty set out in section 14 (user empowerment);”.—(Paul Scully.)
This amendment has the effect that measures recommended in codes of practice to comply with the duty in clause 14 are relevant to the question of whether a provider is complying with the duties in clause 20(2) and (3) (having regard to freedom of expression and users’ privacy).
Question proposed, That the clause, as amended, stand part of the Bill.
I do not wish to repeat myself and test the Committee’s patience, so I will keep my comments brief. As it stands, service providers would be treated as complying with their duties if they had followed the recommended measures set out in the relevant codes of practice. However, providers could take alternative measures to comply; as I said in previous Committee sittings, Labour remains concerned that the definition of “alternative measures” is far too broad. I would be grateful if the Minister elaborated on his assessment of the instances in which a service provider may seek to comply via alternative measures.
The codes of practice should be, for want of a better phrase, best practice. Labour is concerned that, to avoid the duties, providers may choose to take the “alternative measures” route as an easy way out. We agree that it is important to ensure that providers have a duty with regard to protecting users’ freedom of expression and personal privacy. As we have repeatedly said, the entire Online Safety Bill regime relies on that careful balance being at the forefront. We want to see safety at the forefront, but recognise the importance of freedom of expression and personal privacy, and it is right that those duties are central to the clause. For those reasons, Labour has not sought to amend this part of the Bill, but I want to press the Minister on exactly how he sees this route being used.
It is important that service providers have flexibility, so that the Bill does not disincentivise innovation or force service providers to use measures that might not work for all business models or technological contexts. The tech sector is diverse and dynamic, and it is appropriate that companies can take innovative approaches to fulfilling their duties. In most circumstances, we expect companies to take the measures outlined in Ofcom’s code of practice as the easiest route to compliance. However, where a service provider takes alternative measures, Ofcom must consider whether those measures safeguard users’ privacy and freedom of expression appropriately. Ofcom must also consider whether they extend across all relevant areas of a service mentioned in the illegal content and children’s online safety duties, such as content moderation, staff policies and practices, design of functionalities, algorithms and other features. Ultimately, it will be for Ofcom to determine a company’s compliance with the duties, which are there to ensure users’ safety.
Question put and agreed to.
Clause 46, as amended, accordingly ordered to stand part of the Bill.
Clause 55 disagreed to.
Clause 56
Regulations under sections 54 and 55
Amendments made: 42, in clause 56, page 54, line 40, leave out subsection (3).
This amendment is consequential on Amendment 41 (removal of clause 55).
Amendment 43, in clause 56, page 54, line 46, leave out “or 55”.
This amendment is consequential on Amendment 41 (removal of clause 55).
Amendment 44, in clause 56, page 55, line 8, leave out “or 55”.
This amendment is consequential on Amendment 41 (removal of clause 55).
Amendment 45, in clause 56, page 55, line 9, leave out
“or adults are to children or adults”
and insert “are to children”.—(Paul Scully.)
This amendment is consequential on Amendment 41 (removal of clause 55).
Question proposed, That the clause, as amended, stand part of the Bill.
As we know, the clause makes provision in relation to the making of regulations designating primary priority and priority content that is harmful to children, and priority content that is harmful to adults. The Secretary of State may specify a description of content in regulations only if they consider that there is a material risk of significant harm to an appreciable number of children or adults in the United Kingdom presented by user-generated or search content of that description, and must consult Ofcom before making such regulations.
In the last Bill Committee, Labour raised concerns that there were no duties that required the Secretary of State to consult others, including expert stakeholders, ahead of making these regulations. That decision cannot be for one person alone. When it comes to managing harmful content, unlike illegal content, we can all agree that it is about implementing systems that prevent people from encountering it, rather than removing it entirely.
I completely agree: we are now on our third Secretary of State, our third Minister and our third Prime Minister since we began considering this iteration of the Bill. It is vital that this does not come down to one person’s ideological beliefs. We have spoken at length about this issue; the hon. Member for Don Valley has spoken about his concerns that Parliament should be sovereign, and should make these decisions. It should not be for one individual or one stakeholder to make these determinations.
We also have issues with the Government’s chosen toggle approach—we see that as problematic. We have debated it at length, but our concerns regarding clause 56 are about the lack of consultation that the Secretary of State of the day, whoever that may be and whatever political party they belong to, will be forced to make before making widespread changes to a regime. I am afraid that those concerns still exist, and are not just held by us, but by stakeholders and by Members of all political persuasions across the House. However, since our proposed amendment was voted down in the previous Bill Committee, nothing has changed. I will spare colleagues from once again hearing my pleas about the importance of consultation when it comes to determining all things related to online safety, but while Labour Members do not formally oppose the clause, we hope that the Minister will address our widespread concerns about the powers of the Secretary of State in his remarks.
I appreciate the hon. Lady’s remarks. We have tried to ensure that the Bill is proportionate, inasmuch as the Secretary of State can designate content if there is material risk of significant harm to an appreciable number of children in the United Kingdom. The Bill also requires the Secretary of State to consult Ofcom before making regulations on the priority categories of harm.
I appreciate that this point has been made about the same wording earlier today, but I really feel that the ambiguity of “appreciable number” is something that could do with being ironed out. The ambiguity and vagueness of that wording make it very difficult to enforce the provision. Does the Minister agree that “appreciable number” is too vague to be of real use in legislation such as this?
The different platforms, approaches and conditions will necessitate different numbers; it would be hard to pin a number down. The wording is vague and wide-ranging because it is trying to capture any number of scenarios, many as yet unknown. However, the regulations designating priority harms will be made under the draft affirmative resolution procedure.
On that point, which we discussed earlier—my hon. Friend the Member for Warrington North discussed it—I am struggling to understand what is an acceptable level of harm, and what is the acceptable number of people to be harmed, before a platform has to act.
It totally depends on the scenario. It is very difficult for me to stand here now and give a wide range of examples, but the Secretary of State will be reacting to a given situation, rather than trying to predict it.
The Minister has just outlined exactly what our concerns are. He is unable to give an exact number, figure or issue, but that is what the Secretary of State will have to do, without having to consult any stakeholders regarding that issue. There are many eyes on us around the world, with other legislatures looking at us and following suit, so we want the Bill to be world-leading. Many Governments across the world may deem that homosexuality, for example, is of harm to children. Because this piece of legislation creates precedent, a Secretary of State in such a Government could determine that any platform in that country should take down all that content. Does the Minister not see our concerns in that scenario?
I was about to come on to the fact that the Secretary of State would be required to consult Ofcom before making regulations on the priority categories of harm. Indeed Ofcom, just like the Secretary of State, speaks to and engages with a number of stakeholders on this issue to gain a deeper understanding. Regulations designating priority harms would be made under the draft affirmative resolution procedure, but there is also provision for the Secretary of State to use the made affirmative resolution procedure in urgent scenarios, should one arise. It is about getting the balance right.
Such a concern could be triggered by Ofcom discovering things as a consequence of user complaints. Although Ofcom is not a complaint resolution body, users can complain to it. Independent academics and researchers may produce studies and reports highlighting problems at any time, so Ofcom does not have to wait through an annual cycle of transparency reporting. At any time, Ofcom can say, “We want to have a deeper look at this problem.” It could be something Ofcom or someone else has discovered, and Ofcom can either research that itself or appoint an outside expert.
As the hon. Member for Warrington North mentioned, very sensitive information might become apparent through the transparency reporting that one might not necessarily wish to make public because it requires further investigation and could highlight a particular flaw that could be exploited by bad actors. I would hope and expect, as I think we all would, that we would have the routine publication of transparency reporting to give people assurance that the platforms are meeting their obligations. Indeed, if Ofcom were to intervene against a platform, it would probably use information gathered and received to provide the rationale for why a fine has been issued or another intervention has been made. I am sure that Ofcom will draw all the time on information gathered through transparency reporting and, where relevant, share it.
This has been a helpful debate. Everyone was right that transparency must be and is at the heart of the Bill. From when we were talking earlier today about how risk assessments and terms of service must be accessible to all, through to this transparency reporting section, it is important that we hold companies to account and that the reports play a key role in allowing users, Ofcom and civil society, including those in academia, to understand the steps that companies are taking to protect users.
Under clause 65, category 1 services, category 2A search services and category 2B user-to-user services need to publish transparency reports annually in accordance with the transparency report notice from Ofcom. That relates to the points about commerciality that my hon. Friend the Member for Folkestone and Hythe talked about. Ofcom will set out what information is required from companies in their notice, which will also specify the format, manner and deadline for the information to be provided to Ofcom. Clearly, it would not be proportionate to require every service provider within the scope of the overall regulatory framework to produce a transparency report—it is also important that we deal with capacity and proportionality—but those category threshold conditions will ensure that the framework is flexible and future-proofed.
I note what the Minister said about the commercial implications of some of these things, and some of those commercial implications might act as levers to push companies to do better on some things. By that same token, should this information not be more transparent and publicly available to give the user the choice he referred to earlier? That would mean that if a user’s data were not being properly protected, and these companies were not taking the safety measures that the public would expect, users could vote with their feet and go to a different platform. Surely that underpins a lot of what we have been talking about.
Yes, and that is why Ofcom will be the one that decides which information should be published, and from whom, to ensure that it is proportionate. At the end of the day, I have talked about the fact that transparency is at the heart of the Bill and that the transparency reports are important. To go back to the original point raised by the hon. Member for Pontypridd about when these reports will be published, they will indeed be published in accordance with subsection (3)(d) of the clause.
Question put and agreed to.
Clause 65 accordingly ordered to stand part of the Bill.
Schedule 8
Transparency reports by providers of Category 1 services, Category 2A services and Category 2B services
Amendments made: 61, in schedule 8, page 203, line 13, leave out
“priority content that is harmful to adults”
and insert “relevant content”.
This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 62, in schedule 8, page 203, line 15, leave out
“priority content that is harmful to adults”
and insert “relevant content”.
This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 63, in schedule 8, page 203, line 17, leave out
“priority content that is harmful to adults”
and insert “relevant content”.
This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 64, in schedule 8, page 203, line 21, leave out from “or” to end of line 23 and insert “relevant content”.
This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about user reporting of content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 65, in schedule 8, page 203, line 25, leave out
“priority content that is harmful to adults”
and insert “relevant content”.
This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 66, in schedule 8, page 203, line 29, leave out
“priority content that is harmful to adults”
and insert “relevant content”.
This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 67, in schedule 8, page 203, line 41, at end insert—
“11A Measures taken or in use by a provider to comply with any duty set out in section (Duty not to act against users except in accordance with terms of service) or (Further duties about terms of service) (terms of service).”
This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about measures taken to comply with the new duties imposed by NC3 and NC4.
Amendment 68, in schedule 8, page 204, line 2, leave out from “illegal content” to end of line 3 and insert
“or content that is harmful to children—”.
This amendment removes the reference to content that is harmful to adults, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 69, in schedule 8, page 204, line 10, leave out from “illegal content” to “, and” in line 12 and insert
“and content that is harmful to children”.
This amendment removes the reference to content that is harmful to adults, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 70, in schedule 8, page 204, line 14, leave out from “illegal content” to “present” in line 15 and insert
“and content that is harmful to children”.
This amendment removes the reference to content that is harmful to adults, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 71, in schedule 8, page 205, line 38, after “Part 3” insert
“or Chapters 1 to 2A of Part 4”.—(Paul Scully.)
This amendment requires OFCOM, in considering which information to require from a provider in a transparency report, to consider whether the provider is subject to the duties imposed by Chapter 2A, which is the new Chapter expected to be formed by NC3 to NC6 (and Chapter 1 of Part 4).
I beg to move amendment 72, in schedule 8, page 206, line 5, at end insert—
“35A (1) For the purposes of this Schedule, content of a particular kind is ‘relevant content’ if—
(a) a term of service, other than a term of service within sub-paragraph (2), states that a provider may or will take down content of that kind from the service or restrict users’ access to content of that kind, and
(b) it is regulated user-generated content.
(2) The terms of service within this sub-paragraph are as follows—
(a) terms of service which make provision of the kind mentioned in section 9(5) (protecting individuals from illegal content) or 11(5) (protecting children from content that is harmful to children);
(b) terms of service which deal with the treatment of consumer content.
(3) References in this Schedule to relevant content are to content that is relevant content in relation to the service in question.”
This amendment defines “relevant content” for the purposes of Schedule 8.
The amendments to schedule 8 confirm that references to relevant content, consumer content and regulated user-generated content have the same meaning as established by other provisions of the Bill. Again, that ensures consistency, which will, in turn, support Ofcom in requiring providers of category 1 services to give details in their annual transparency reports of their compliance with the new transparency, accountability and freedom of expression duties.
I will keep my comments on this grouping brief, because I have already raised our concerns and our overarching priority on transparency reports in the previous debate, which was a good one, with all Members highlighting the need for transparency and reporting in the Bill. With the Chair’s permission, I will make some brief comments on Government amendment 72 before addressing Government amendments 73 and 75.
It will come as no surprise to the Minister that amendment 72, which defines relevant content for the purposes of schedule 8, has a key omission—specifying priority content harmful to adults. For reasons we have covered at length, we think that it is a gross mistake on the Government’s side to attempt to water down the Bill in this way. If the Minister is serious about keeping adults safe online, he must reconsider this approach. However, we are happy to see amendments 73 and 75, which define consumer content and regulated user-generated content. It is important for all of us—whether we are politicians, researchers, academics, civil society, stakeholders, platforms, users or anyone else—that these definitions are in the Bill so that, when it is passed, it can be applied properly and at pace. That is why we have not sought to amend this grouping.
I must press the Minister to respond on the issues around relevant content as outlined in amendment 72. We feel strongly that more needs to be done to address this type of content and its harm to adults, so I would be grateful to hear the Minister’s assessment of how exactly these transparency reports will capture this type of harm, given its absence from this group of amendments and the lack of a definition.
I am pleased to see the list included and the number of things that Ofcom can ask for more information on. I have a specific question about amendment 75. Amendment 75 talks about regulated user-generated content and says it has the same meaning as it does in the interpretation of part 3 under clause 50. The Minister may or may not know that there are concerns about clause 50(5), which relates to
“One-to-one live aural communications”.
One-to-one live aural communications are exempted. I understand that that is because the Government do not believe that telephony services, for example, should be part of the Online Safety Bill—that is a pretty reasonable position for them to take. However, allowing one-to-one live aural communications not to be regulated means that if someone is using voice chat in Fortnite, for example, and there are only two people on the team that they are on, or if someone is using voice chat in Discord and there are only two people online on the channel at that time, that is completely unregulated and not taken into account by the Bill.
I know that that is not the intention of the Bill, which is intended to cover user-generated content online. The exemption is purely in place for telephony services, but it is far wider than the Government intend it to be. With the advent of more and more people using virtual reality technology, for example, we will have more and more aural communication between just two people, and that needs to be regulated by the Bill. We cannot just allow a free-for-all.
If we have child protection duties, for example, they need to apply to all user-generated content and not exempt it specifically because it is a live, one-to-one aural communication. Children are still at significant risk from this type of communication. The Government have put this exemption in because they consider such communication to be analogous to telephony services, but it is not. It is analogous to telephony services if we are talking about a voice call on Skype, WhatsApp or Signal—those are voice calls, just like telephone services—but we are talking about a voice chat that people can have with people who they do not know, whose phone number they do not know and who they have no sort of relationship with.
Some of the Discord servers are pretty horrendous, and some of the channels are created by social media influencers or people who have pretty extreme views in some cases. We could end up with a case where the Discord server and its chat functions are regulated, but if aural communication or a voice chat is happening on that server, and there are only two people online because it is 3 o’clock in the morning where most of the people live and lots of them are asleep, that would be exempted. That is not the intention of the Bill, but the Government have not yet fixed this. So I will make one more plea to the Government: will they please fix this unintended loophole, so that it does not exist? It is difficult to do, but it needs to be done, and I would appreciate it if the Minister could take that into consideration.
I do not believe that Ofcom’s transparency powers have been watered down. It is really important that the Bill’s protections for adults strike the right balance with its protections for free speech, which is why we have replaced the “legal but harmful” clause. I know we will not agree on that, but there are new duties that will make platforms more accountable. Ofcom’s transparency powers will enable it to assess compliance with the new safety duties and hold platforms accountable for enforcing their terms of service to keep users safe. Companies will also have to report on the measures that they have in place to tackle illegal content or activity and content that is harmful to children, including proactive steps to address offences such as child sexual exploitation and abuse.
The legislation will set out high-level categories of information that companies may be required to include in their transparency reports, and Ofcom will then specify the information that service providers will need to include in those reports, in the form of a notice. Ofcom will consider companies’ resources and capacity, service type and audience in determining what information they will need to include. It is likely that the information that is most useful to the regulator and to users will vary between different services. To ensure that the transparency framework is proportionate and reflects the diversity of services in scope, the transparency reporting requirements set out in the Ofcom notice are likely to differ between those services, and the Secretary of State will have powers to update the list of information that Ofcom may require to reflect any changes of approach.
The in-game chat that children use is overwhelmingly voice chat. Children do not type if they can possibly avoid it. I am sure that that is not the case for all children, but it is for most children. Aural communication is used if someone is playing Fortnite duos, for example, with somebody they do not know. That is why that needs to be included.
I very much get that point. It is not something that I do, but I have certainly seen it myself. I am happy to chat to the hon. Lady to ensure that we get it right.
Amendment 72 agreed to.
Amendments made: 73, in schedule 8, page 206, line 6, at end insert—
“‘consumer content’ has the same meaning as in Chapter 2A of Part 4 (see section (Interpretation of this Chapter)(3));”.
This amendment defines “consumer content” for the purposes of Schedule 8.
Amendment 74, in schedule 8, page 206, leave out lines 7 and 8.
This amendment is consequential on Amendment 41 (removal of clause 55).
Amendment 75, in schedule 8, page 206, line 12, at end insert—
“‘regulated user-generated content’ has the same meaning as in Part 3 (see section 50), and references to such content are to content that is regulated user-generated content in relation to the service in question;”.—(Paul Scully.)
This amendment defines “regulated user-generated content” for the purposes of Schedule 8.
Schedule 8, as amended, agreed to.
Ordered, That further consideration be now adjourned. —(Mike Wood.)
(1 year, 11 months ago)
Public Bill Committees
I beg to move amendment 48, in clause 82, page 72, line 21, at end insert—
“(ca) a regulated user-to-user service meets the conditions in section (List of emerging Category 1 services)(2) if those conditions are met in relation to the user-to-user part of the service;”.
This is a technical amendment ensuring that references to user-to-user services in the new clause inserted by NC7 relate to the user-to-user part of the service.
With this it will be convenient to discuss the following:
Government amendment 49.
Government new clause 7—List of emerging Category 1 services.
These Government amendments confer a duty on Ofcom to create and publish a list of companies that are approaching the category 1 threshold, to ensure that it proactively identifies emerging high-reach, high-influence companies and is ready to add them to the category 1 register without delay. That is being done in recognition of the rapid pace of change in the tech industry, in which companies can grow quickly. The changes mean that Ofcom can designate companies as category 1 at pace, which responds to concerns that a platform could become unexpectedly popular, grow rapidly and only belatedly be captured as category 1. Amendments 48 and 49 are consequential on new clause 7, which confers that duty on Ofcom. For those reasons, I recommend that the amendments be accepted.
It will come as no surprise to Members to hear that we have serious concerns about the system of categorisation and the threshold conditions for platforms and service providers, given our long-standing view that the approach taken is far too inflexible.
In previous sittings, we raised the concern that the Government have not provided enough clarity about what will happen if a service is required to shift from one category to another, and how long that will take. We remain unclear about that, about how shifting categories will work in practice, and about how long Ofcom will have to preside over such changes and decisions.
I have been following this Bill closely for just over a year, and I recognise that the online space is constantly changing and evolving. New technologies are popping up that will make this categorisation process even more difficult. The Government must know that their approach does not capture smaller, high-harm platforms, which we know—we have debated this several times—can be at the root of some of the most dangerous and harmful content out there. Will the Minister clarify whether the Government amendments will allow Ofcom to consider adding such small, high-harm platforms to category 1, given the risk of harm?
More broadly, we are pleased that the Government tabled new clause 7, which will require Ofcom to prepare and update a list of regulated user-to-user services that have 75% of the number of users of a category 1 service, and at least one functionality of a category 1 service, or one required combination of a functionality and another characteristic or factor of a category 1 service. It is absolutely vital that Ofcom, as the regulator, is sufficiently prepared, and that there is monitoring of regulated user-to-user services so that this regime is as flexible as possible and able to cope with the rapid changes in the online space. That is why the Opposition support new clause 7 and have not sought to amend it. Moreover, we also support Government amendments 48 and 49, which are technical amendments to ensure that new clause 7 references user-to-user services and assessments of those services appropriately. I want to press the Minister on how he thinks these categories will work, and on Ofcom’s role in that.
I agree with everything that the hon. Lady said. New clause 7 is important. It was missing from the earlier iterations of the Bill, and it makes sense to have it here, but it raises further concerns about the number of people who are required to use a service before it is classed as category 1. We will come later to our amendment 104 to schedule 11, which is about adding high-risk platforms to the categorisation.
I am still concerned that user numbers are a pretty blunt instrument for categorising something as category 1. The threshold may end up being particularly high, and it would be very easy for it to be wrong—too high or too low, and probably too high rather than too low.
If Twitter were to disappear, which, given the changing nature of the online world, is not outside the realms of possibility, we could see a significant number of other platforms picking up the slack. A lot of them might have fewer users, but the same level of risk as platforms such as Twitter and Facebook. I am still concerned that choosing a number is a very difficult thing to get right, and I am not totally convinced that the Government’s way of going about this is right.
Ofcom will assess services that are close to meeting the threshold conditions for category 1 services and will publish a publicly available list of those emerging high-risk services. A service would have to meet two conditions to be added to the emerging services list: it would need at least 75% of the number of users specified in any category 1 threshold condition, and at least one functionality of a category 1 threshold condition, or one specified combination of a functionality and a characteristic or factor of a category 1 threshold condition.
Ofcom will monitor the emergence of new services. If it becomes apparent that a service has grown sufficiently to meet the threshold of becoming a category 1 service, Ofcom will be required to add that service to the register. The new clause and the consequential amendments take into account the possibility of quick growth.
Following the removal of the “legal but harmful” duties, category 1 services will be subject to new transparency, accountability and free speech duties, as well as duties relating to the protection of journalistic content and content of democratic importance. Requiring all companies to comply with the full range of category 1 duties would impose a disproportionate regulatory burden on smaller companies that do not exert the same influence on public discourse, and would possibly divert those companies’ resources away from vital tasks.
Will my hon. Friend confirm that the risk assessments for illegal content—the priority illegal offences; the worst kind of content—apply to all services, whether or not they are category 1?
My hon. Friend is absolutely right. All companies will still have to carry out the risk assessments and remove illegal content. We are talking about the extra duties, which could take a disproportionate amount of resource away from the core functions that we all want to see around child protection.
I would push the Minister further. He mentioned that there will not be an onus on companies to tackle the “legal but harmful” duty now that it has been stripped from the Bill, but we know that disinformation, particularly around elections in this country, is widespread on these high-harm platforms, and they will not be in scope of category 2. We have debated that at length. We have debated the time it could take Ofcom to act and put those platforms into category 1. Given the potential risk of harm to our democracy as a result, will the Minister press Ofcom to act swiftly in that regard? We cannot put that in the Bill now, but time is of the essence.
Absolutely. The Department has techniques for dealing with misinformation and disinformation as well, but we will absolutely push Ofcom to work as quickly as possible. As my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), the former Secretary of State, has said, once an election is done, it is done and it cannot be undone.
Could the Minister also confirm that the provisions of the National Security Bill read across to the Online Safety Bill? Where disinformation is disseminated by networks operated by hostile foreign states, particularly Russia, as has often been the case, that is still in scope. That will still require a risk assessment for all platforms, whether or not they are category 1.
Indeed. We need to take a wide-ranging, holistic view of disinformation and misinformation, especially around election times. There is a suite of measures available to us, but it is still worth pushing Ofcom to make sure that it works as quickly as possible.
Amendment 48 agreed to.
Amendment made: 49, in clause 82, page 72, line 23, after “conditions” insert
“or the conditions in section (List of emerging Category 1 services)(2)”.—(Paul Scully.)
This is a technical amendment ensuring that references to assessments of user-to-user services in the new clause inserted by NC7 relate to the user-to-user part of the service.
Clause 82, as amended, ordered to stand part of the Bill.
Schedule 11
Categories of regulated user-to-user services and regulated search services: regulations
I beg to move amendment 76, in schedule 11, page 213, line 11, at end insert
“, and
(c) any other characteristics of that part of the service or factors relating to that part of the service that the Secretary of State considers relevant.”
This amendment provides that regulations specifying Category 1 threshold conditions for the user-to-user part of regulated user-to-user services must also include conditions relating to any other characteristics of that part of the service or factors relating to that part of the service that the Secretary of State considers relevant.
With this, it will be convenient to discuss Government amendments 77 to 79, 81 to 84, 86 to 91 and 93.
These Government amendments change the approach to category 1 designation, following the removal from the Bill of the adult safety duties and the concept of “legal but harmful” content. Through the proposed new duties on category 1 services, we aim to hold companies accountable to their terms of service, as we have said. We are removing all requirements on category 1 services relating to harmful content, so it is no longer appropriate to designate those services with reference to harm. Consequently, the amendments in this group change the designation process to ensure that only the largest companies with the greatest influence over public discourse are designated as category 1 services.
Specifically, the amendments will ensure that services are designated category 1 where they have functionalities that enable the easy, quick and wide dissemination of user-generated content. The requirement for category 1 services to meet a number-of-users threshold remains unchanged.
The amendments also give the Secretary of State the flexibility to consider other characteristics of services, as well as other relevant factors. Those characteristics might include a service’s functionalities, the user base, the business model, governance, and other systems and processes. That gives the designation process greater flexibility to ensure that services are designated category 1 services only when they have significant influence over public discourse.
The amendments also seek to remove the use of criteria for content that is harmful to adults from category 2B, and we have made a series of consequential amendments to the designation process for categories 2A and 2B to ensure consistency.
I have commented extensively on the flaws in the categorisation process in this and previous Committees, so I will not retread old ground. I accept the amendments in this grouping. They show that the Government are prepared to broaden the criteria for selecting which companies are likely to be in category 1. That is a welcome, albeit subtle, shift in the right direction.
The amendments bring the characteristics of a company’s service into consideration, which will be a slight improvement on the previous focus on size and functionality, so we welcome them. The distinction is important, because size and functionality alone are obviously very vague indicators of harm, or the threat of harm.
We are pleased to see that the Government have allowed for a list to be drawn up of companies that are close to the margins of category 1, or that are emerging as category 1 companies. This is a positive step for regulatory certainty, and I hope that the Minister will elaborate on exactly how the assessment will be made.
However, I draw the Minister’s attention to Labour’s long-held concern about the Bill’s over-reliance on powers afforded to the Secretary of State of the day. We debated this concern in a previous sitting. I press the Minister again on why these amendments, and the regulations around the threshold conditions, are ultimately only for the Secretary of State to consider, depending on characteristics or factors that only he or she, whoever they may be, deems relevant.
We appreciate that the regulations need some flexibility, but we have genuine concerns—indeed, colleagues from all parties have expressed such concerns—that the Bill will give the Secretary of State far too much power to determine how the entire online safety regime is imposed. I ask the Minister to give the Committee an example of a situation in which it would be appropriate for the Secretary of State to make such changes without any consultation with stakeholders or the House.
It is absolutely key for all of us that transparency should lie at the heart of the Bill. Once again, we fear that the amendments are a subtle attempt by the Government to impose on what is supposed to be an independent regulatory process the whim of one person. I would appreciate assurance on that point. The Minister knows that these concerns have long been held by me and colleagues from all parties, and we are not alone in holding them. Civil society groups are also calling for clarity on exactly how decisions will be made, and particularly on what information will be used to determine a threshold. For example, do the Government plan to quantify a user base, and will the Minister explain how the regime will work in practice, when we know that a platform’s user base can fluctuate rapidly? We have seen that already with Mastodon, whose user numbers have grown dramatically as a result of Elon Musk’s takeover of Twitter. I hope that the Minister can reassure me on those concerns. He will know that this is a point of contention for colleagues from across the House, and we want to get the Bill right before we progress to Report.
My understanding is that only a very small number of platforms will reach the category 1 threshold. We are talking about the platforms that everybody has heard of—Facebook, Twitter and so on—and not about the slightly smaller platforms that lots of people have heard of and use. We are probably not talking about platforms such as Twitch, which has a much smaller user base than Facebook and Twitter but has a massive reach. My concern continues to be that the number threshold does not take into account the significant risks of harm from some of those platforms.
I have a specific question about amendment 76. I agree with my Labour Front-Bench colleague, the hon. Member for Pontypridd, that it shows that the Government are willing to take into account other factors. However, I am concerned that the Secretary of State is somehow being seen as the arbiter of knowledge—the person who is best placed to make the decisions—when much more flexibility could have been given to Ofcom instead. From all the evidence I have heard and all the people I have spoken to, Ofcom seems much more expert in dealing with what is happening today than any Secretary of State could ever hope to be. There is no suggestion about how the Secretary of State will consult, get information and make decisions on how to change the threshold conditions.
It is important that other characteristics that may not relate to functionalities are included if we discover that there is an issue with them. For example, I have mentioned livestreaming on a number of occasions in Committee, and we know that livestreaming is inherently incredibly risky. The Secretary of State could designate livestreaming as a high-risk functionality, and it could be included, for example, in category 1. I do not know whether it will be, but we know that there are risks there. How will the Secretary of State get that information?
There is no agreement to set up a user advocacy board. The requirement for Ofcom to consult the Children’s Commissioner will be brought in later, but organisations such as the National Society for the Prevention of Cruelty to Children, which deals with phone calls from children asking for help, are most aware of emerging threats. My concern is that the Secretary of State cannot possibly be close enough to the issue to make decisions, unless they are required to consult and listen to organisations that are at the coal face and that regularly support people. I shall go into more detail about high-harm platforms when we come to amendment 104.
The amendments give the Secretary of State the flexibility to consider other characteristics of services as well as other relevant factors, which include functionalities, user base, business model, governance, and other systems and processes. They effectively introduce greater flexibility into the designation process, so that category 1 services are designated only if they have significant influence over public discourse. Although the Secretary of State will make the regulations, Ofcom will carry out the objective and evidence-based process, which will be subject to parliamentary scrutiny via statutory instruments. The Secretary of State will have due consultation with Ofcom at every stage, but to ensure flexibility and the ability to move fast, it is important that the Secretary of State has those powers.
Amendment 76 agreed to.
I beg to move amendment 104, in schedule 11, page 213, line 11, at end insert—
“(1A) Regulations made under sub-paragraph (1) must provide for any regulated user-to-user service which OFCOM assesses as posing a very high risk of harm to be included within Category 1, regardless of the number of users.”
This amendment allows Ofcom to impose Category 1 duties on user-to-user services which pose a very high risk of harm.
I would say this, but I think that this is the most important amendment. The key area that the Government are getting wrong is the way in which platforms, providers or services will be categorised. The threshold is based on the number of users. It is the number of users “and” one of those other things, not the number of users “or” one of those other things; even that would make a significant difference.
The Secretary of State talked about the places that have a significant influence over public discourse. It is perfectly possible to have a significant influence over public discourse with a small user base—one that does not run into the millions. We have seen the spread of conspiracy theories that have originated and been perpetuated on very small platforms—very small, shady corners of the internet that none of us has experienced or even heard of. Those are the places that can have a massive effect.
We know that one person can have a significant impact on the world and on people’s lives. We have heard about the physical harm that people can be incited to cause by the platforms they access, and the radicalisation and extremism they find themselves subject to. That can do massive damage to anybody they choose to take physical action against, and to some of the most marginalised communities and groups in society. We are seeing an increase in the amount of hate crime and in the number of people who believe conspiracy theories, and not all of that is because of the spread of those things on Facebook and Twitter. It is because of the breadcrumbing and the spread that there can be on smaller platforms.
The most extreme views do not necessarily tip over into “illegal” or “incitement”; they do not actually say, “Please go out and kill everybody in this particular group.” They say, “This particular group is responsible for all of the ills you feel and for every negative thing that is happening in your life”, and people are thereby driven to take extremist, terrorist action. That is a significant issue.
I want to talk about a couple of platforms. Kiwi Farms, which is no longer in existence and has been taken down, was a very small platform that dramatically damaged the lives of trans people in particular. It was a platform where people went to incite hatred and give out the addresses of folk who they knew were members of the trans community. Some of those people had to move to another continent to get away from the physical violence and attacks they faced as a result of the behaviour on that incredibly small platform, which very few people will have heard about.
Kiwi Farms has been taken down because the internet service providers decided that it was too extreme and they could not possibly host it any more. That was eventually recognised and change was made, but the influence that that small place had on lives—the difficulties and harm it caused—is untold. Some of that did tip over into illegality, but some did not.
I also want to talk about the places where there is a significant amount of pornography. I am not going to say that I have a problem with pornography online; the internet will always have pornography on it. It attracts a chunk of people to spend time online, and some of that pornography is on large mainstream sites. Searches for incest, underage girls or black women being abused all get massive numbers of hits. There is a significant amount of pornography on these sites that is illegal, that pretends to be illegal or that acts against people with protected characteristics. Research has found that a significant proportion—well over half—of the pornography on mainstream sites that involves black women also involves violence. That is completely unacceptable, and it has a massive negative impact on society: it reinforces negativity and discrimination against groups that are already struggling with being discriminated against and that do not experience the privilege of a cis white man.
It is really grim that we are requiring a number of users to be specified, when we know the harm caused by platforms that do not have 10 million or 20 million United Kingdom users. I do not know what the threshold will be, but I know it will be too high to include a lot of platforms that have a massive effect. The amendment is designed specifically to give Ofcom the power to designate as category 1 any service that it thinks poses a very high risk of harm; I have not set the bar particularly low. Now that the Minister has increased the levels of transparency that will be required of category 1 platforms, it is even more important that we subject extremist sites and platforms—the radicalising ones, which are perpetuating discrimination—to that higher bar and require of them the transparency expected of a category 1 service. This is a place where the Bill could really make a difference and change lives, and I am really concerned that it is massively failing to do so.
I have said that it should be Ofcom’s responsibility to designate category 1 services because Ofcom has the experts who will be looking at all the risk assessments, dealing with companies on a day-to-day basis, and seeing the harms and transparencies that the rest of us will not be able to see. The reporting mechanisms will be public for only some of the category 1 platforms, and we will not be able to find out the level of information that Ofcom has, so it is right that Ofcom should be responsible for designating sites as having a very high risk of harm. That is why I tabled the amendment, which would make a massive difference to the people who are already the most discriminated against and who are most at risk of harm from extremism. I urge the Minister to think again.
As debated earlier, we are removing the adult safety duties from the Bill, which means that no company will face any duties related to legal but harmful content. In their place, the Government are introducing new transparency, accountability and free speech duties on category 1 services, which were discussed in detail earlier this session.
It would not be proportionate to apply those new duties to smaller services but, as we have heard from my hon. Friend the Member for Folkestone and Hythe, they will still have to comply with the illegal content duties and, if they are accessed by children, the child safety duties. Those services have limited resources, and applying blanket additional duties to them would divert those resources away from complying with the illegal content and child safety duties. That would be likely to weaken the duties’ impact on tackling criminal activity and protecting children.
The new duties are about user choice and accountability on the largest platforms—if users do not want to use smaller harmful sites, they can choose not to—but, in recognition of the rapid pace with which companies can grow, I introduced an amendment earlier to create a watchlist of companies that are approaching the category 1 threshold, which will ensure that Ofcom can monitor rapidly scaling companies, reduce any delay in designating companies as category 1 services, and apply additional obligations on them.
The hon. Member for Aberdeen North talked about ISPs acting with respect to Kiwi Farms. I talked on Tuesday about the need for a holistic approach; there is no one silver bullet. It is important to look at Government, the platforms, parenting and ISPs, because together they make up a holistic view of how the internet works—a multi-stakeholder framework for governing the internet in its entirety, rather than the Government trying to do absolutely everything. We have talked a lot about illegality, and I think that a lot of the behaviour in that case was illegal; the hon. Lady described some very distasteful things. None the less, with the introduction of the watchlist, I do not believe amendment 104 is required.
The hon. Member for Folkestone and Hythe made a good point. I do not disagree that Ofcom will have a significant role in policing platforms that are below the category 1 threshold. I am sure it will be very hands on, particularly with platforms that have the highest risk and are causing the most harm.
I still do not think that is enough. I do not think that the Minister’s change with regard to emerging platforms should be based on user numbers. It is reasonable for us to require platforms that encourage extremism, spread conspiracy theories and have the most horrific pornography on them to meet a higher bar of transparency. I do not really care if they only have a handful of people working there. I am not fussed if they say, “Sorry, we can’t do this.” If they cannot keep people safe on their platform, they should have to meet a higher transparency bar, provide more information on how they are meeting their terms of service and provide toggles—all those things. It does not matter how small these platforms are. What matters is that they have massive risks and cause massive amounts of harm. It is completely reasonable that we hold them to a higher regulatory bar. On that basis, I will push the amendment to a vote.
Thank you, Dame Angela—take 2.
Clause 115 focuses on the enforcement action that may be taken and will be triggered if a platform fails to comply. Given that the enforceable requirements may include, for example, duties to carry out and report on risk assessments and general safety duties, it is a shame that the Government have not seen the merits of going further with these provisions. I point the Minister to the previous Public Bill Committee, where Labour made some sensible suggestions for how to remedy the situation. Throughout the passage of the Bill, we have made it abundantly clear that more access to, and availability of, data and information about systems and processes would improve understanding of the online environment.
We cannot and should not rely solely on Ofcom to act as problems arise when they could be spotted earlier by experts somewhere else. We have already heard the Minister outline the immense task that Ofcom has ahead of it to monitor risk assessments and platforms, ensuring that platforms comply and taking action where there is illegal content and a risk to children. It is important that Ofcom has at its disposal all the help it needs.
It would be helpful if there were more transparency about how the enforcement provisions work in practice. We have repeatedly heard that without independent researchers accessing data on relevant harm, platforms will have no real accountability for how they tackle online harm. I hope that the Minister can clarify why, once again, the Government have not seen the merit of encouraging transparency in their approach. It would be extremely valuable and helpful to both the online safety regime and the regulator as a whole, and it would add merit to the clause.
We have talked about the fact that Ofcom will have robust enforcement powers. It can direct companies to take specific steps to come into compliance or to remedy failure to comply, as well as issue fines and apply to the courts for business disruption measures. Indeed, Ofcom can institute criminal proceedings against senior managers who are responsible for compliance with an information notice, when they have failed to take all reasonable steps to ensure the company’s compliance with that notice. That criminal offence will commence two months after Royal Assent.
Ofcom will be required to produce enforcement guidelines, as it does in other areas that it regulates, explaining how it proposes to use its enforcement powers. It is important that Ofcom is open and transparent, and that companies and people using the services understand exactly how to comply. Ofcom will provide those guidelines, and people will be able to see who the users of the services are. The pre-emptive work will come from the risk assessments that the platforms themselves will need to produce.
We will take a phased approach to bringing the duties under the Bill into effect. Ofcom’s initial focus will be on illegal content, so that the most serious harms can be addressed as soon as possible. When those codes of practice and guidelines come into effect, the hon. Member for Pontypridd will see some of the transparency and openness that she is looking for.
Question put and agreed to.
Clause 115, as amended, accordingly ordered to stand part of the Bill.
Clause 155
Review
Amendment made: 56, in clause 155, page 133, line 27, after “Chapter 1” insert “or 2A”.—(Paul Scully.)
Clause 155 is about a review by the Secretary of State of the regulatory framework established by this Bill. This amendment inserts a reference to Chapter 2A, which is the new Chapter expected to be formed by NC3 to NC6.
I am glad that there is a review function in the Bill. I have been a member of a lot of Bill Committees and Delegated Legislation Committees that have considered legislation with no review function—legislation that says, “This will be looked at in the normal course of departmental reviews.” We know that not all Departments always do such reviews. In fact, some Departments do under 50% of the reviews that they are supposed to do, and whether reviews take place is not checked. We therefore do not find out whether a piece of legislation has had the intended effect. I am sure some will have done, but some definitely will not.
If the Government do not internally review whether a Bill or piece of delegated legislation has had the effect it was supposed to have, they cannot say whether it has been a success and cannot make informed decisions about future legislation, so having a review function in this Bill is really good. However, that function is insufficient: it is not enough for the Secretary of State alone to do the review, and we will not see enough outputs from Ofcom.
The Bill has dominated the lives of a significant number of parliamentarians for the past year—longer, in some cases—because it is so important and because it has required so much scrutiny, thinking and information gathering to get to this stage. That work will not go away once the Bill is enacted. Things will not change or move at once, and parts of the legislation will not work as effectively as they could, as is the case for any legislation, whether moved by my Government or somebody else’s. In every piece of legislation there will be things that do not pan out as intended, but a review by the Secretary of State and information from Ofcom about how things are working do not seem to be enough.
Committee members, including those on the Government Benches, have suggested having a committee to undertake the review or adding that function to the responsibilities of the Digital, Culture, Media and Sport Committee. We know that the DCMS Committee is busy and will be looking into a significant number of wide-ranging topics, so it would be difficult for it to keep a watching brief on the Online Safety Bill.
The previous Minister said that there would be some sort of reviewing mechanism, but I would like further commitment from the Government that the Bill will be kept under review and that the review process as set out will not be the only type of review that happens as things move and change and the internet develops. Many people talk about more widespread use of virtual reality, for example, but there could be other things that we have not even heard of yet. After the legislation is implemented, it will be years before every part of the Bill is in action and every requirement in the legislation is working. By the time we get to 2027-28—or whenever every part of the legislation is working—things could have changed again and be drastically different to today. Indeed, the legislation may not be fit for purpose when it first starts to work, so will the Minister provide more information about what the review process will look like on an ongoing basis? The Government say this is world-leading legislation, but how will we ensure that that is the case and that it makes a difference to the safety and experience of both children and adults online?
I am glad that we are all in agreement on the need for a review. It is important that we have a comprehensive and timely review of the regulatory regime, and that that review is built into the legislation. It is important that we understand whether the legislation has the impact that we intend.
The legislation clearly sets out what the review must consider: how Ofcom is carrying out its role, and whether the legislation is effective in dealing with child protection, which, as the hon. Lady rightly says, is its core purpose. We have struck a balance by specifying two to five years after the regime comes into force, because that provides a degree of flexibility for future Ministers to judge when the review should happen. None the less, I take the hon. Lady’s point that technology is developing. This legislation is a front-footed first move, with other countries looking at what we are doing; because of its less prescriptive approach to technologies, it can remain flexible and adapt to emerging technologies. Inevitably, this will not be the last word. Some of the things in the Digital Economy Act 2017, for example, are already out of date, as is some of the other legislation that was put in place in the early 2000s. We will inevitably come back to this, but I think we have the right balance at the moment in terms of timing.
I do not think we need to set out in the Bill whom we will consult, but wider consultation will none the less be necessary to ascertain the effectiveness of the legislation.
I am following carefully what the Minister says, but I would say briefly that a lot of the debate we have had at all stages of the Bill has rested on how we believe Ofcom will use the powers it has been given, and we need to make sure that it does that. We need to ensure that it is effective and that it has the resources it needs. The hon. Member for Aberdeen North (Kirsty Blackman) makes an important point that it may not be enough to rely on a Select Committee of the Lords or the Commons having the time to do that in the detail we would want. We might need to consider either a post-legislative scrutiny Committee or some other mechanism to ensure that there is the necessary level of oversight.
My hon. Friend is absolutely right. As it stands, the report has to be laid before Parliament and will form part of the package of parliamentary scrutiny. But, yes, we will consider how we can utilise the expertise of both Houses in post-legislative scrutiny, and we will come back on that.
Question put and agreed to.
Clause 155, as amended, accordingly ordered to stand part of the Bill.
Clause 169
Individuals providing regulated services: liability
Amendment made: 57, in clause 169, page 143, line 15, at end insert—
“(fa) Chapter 2A of Part 4 (terms of service: transparency, accountability and freedom of expression);”.—(Paul Scully.)
Clause 169 is about liability of providers who are individuals. This amendment inserts a reference to Chapter 2A, which is the new Chapter expected to be formed by NC3 to NC6, so that individuals may be jointly and severally liable for the duties imposed by that Chapter.
Clause 169, as amended, ordered to stand part of the Bill.
Clause 183 ordered to stand part of the Bill.
Schedule 17
Video-sharing platform services: transitional provision etc
Amendments made: 94, in schedule 17, page 235, line 43, leave out paragraph (c).
This amendment is consequential on Amendment 6 (removal of clause 12).
Amendment 95, in schedule 17, page 236, line 27, at end insert—
“(da) the duties set out in sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service) (terms of service);”.—(Paul Scully.)
This amendment ensures that services already regulated under Part 4B of the Communications Act 2003 (video-sharing platform services) are not required to comply with the new duties imposed by NC3 and NC4 during the transitional period.
Question proposed, That the schedule, as amended, be the Seventeenth schedule to the Bill.
Labour welcomes schedule 17, which the Government introduced on Report. We see the schedule as clarifying exactly how the existing video-sharing platform regime will be repealed and the transitional provisions that will apply to those providers as they transition to the online safety framework. The schedule is fundamentally important for both providers and users, as it establishes the formal requirements on these platforms as they move into the new legislation.
We welcome the clarification in paragraph 1(1) of the definition of a qualifying video-sharing service. On that point, I would be grateful if the Minister clarified the situation around livestreaming video platforms and whether this schedule would also apply to them. Throughout this Bill Committee, we have heard just how dangerous and harmful live video-sharing platforms can be, so this is an important point to clarify.
I have spoken at length about the importance of capturing the harms on these platforms, particularly in the context of child sexual exploitation being livestreamed online, which, thanks to the brilliant work of International Justice Mission, we know is a significant and widespread issue. I must make reference to the IJM’s findings from its recent White Paper, which highlighted the extent of the issue in the Philippines, which is widely recognised as a source country for livestreamed sexual exploitation of children. It found that traffickers often use cheap Android smartphones with pre-paid cellular data services to communicate with customers and produce and distribute explicit material. To reach the largest possible customer base, they often connect with sexually motivated offenders through everyday technology—the same platforms that the rest of us use to communicate with friends, family and co-workers.
One key issue in assessing the extent of online sexual exploitation of children is that we are entirely dependent on detection of the crime, but the reality is that most of the current technologies widely used to detect the various forms of such exploitation are not designed to recognise livestreamed video. This is an important and prolific issue, so I hope the Minister can assure me that the provisions in the schedule will apply to those platforms too.
Schedule 17 sets out how the existing video-sharing platform regime will be repealed and the transitional provisions that will apply to those providers as they transition to the online safety framework. My understanding is that it does include livestreaming, but I will of course write to the hon. Lady if I have got that wrong; I am not sure that there is a significant legal effect here. To protect children and treat services fairly while avoiding unnecessary burdens on business, we are maintaining the current user protections in the VSP regime while the online safety framework is being implemented. That approach to transition avoids the duplication of regulation.
Question put and agreed to.
Schedule 17, as amended, accordingly agreed to.
Clause 203
Interpretation: general
I beg to move amendment 105, in clause 203, page 167, line 8, after “including” insert “but not limited to”.
This amendment makes clear that the definition provided for content is not exhaustive.
I am delighted that we have a new Minister, because I can make exactly the same speech as I made previously in Committee—don’t worry, I won’t—and he will not know.
I still have concerns about the definition of “content”. I appreciate that the Government have tried to include a number of things in the definition. It currently states:
“‘content’ means anything communicated by means of an internet service, whether publicly or privately, including written material or messages, oral communications, photographs, videos, visual images, music and data of any description”.
That is pretty wide-ranging, but I do not think it takes everything into account. I know that it uses the word “including”; it does not say “limited to” or anything like that. But my view of how the Bill should be drafted is that if there is to be a list, it should be exhaustive.
I have suggested in amendment 105 that we add “not limited to” after “including” in order to be absolutely clear that the content that we are talking about includes anything. It may or may not be on this list. Something that is missing from the list is VR technology. If someone is using VR or immersive technology and is a character on the screen, they can see what the character is doing and move their body around as that character, and whatever they do is user-generated content. It is not explicitly included in the Bill, even though there is a list of things. I do not even know how that would be written down in any way that would make sense.
I have suggested adding “not limited to” to make it absolutely clear that this is not an exhaustive list of the things that could be considered to be user-generated content or content for the purposes of the Bill. It could be absolutely anything that is user-generated. If the Minister is able to make it absolutely clear that this is not an exhaustive list and that “content” could be anything that is user-generated, I will not press the amendment to a vote. I would be happy enough with that commitment.
Indeed I can give that commitment. This is an indicative list, not an exhaustive list, for the reasons that the hon. Lady set out. Earlier, we discussed the fact that technology moves on, and she has come up with an interesting example. It is important to note that adding unnecessary words in legislation could lead to unforeseen outcomes when it is interpreted by courts, which is why we have taken this approach, but we think it does achieve the same thing.
On that basis, I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Amendment proposed: 58, in clause 203, page 167, leave out lines 26 to 31. —(Paul Scully.)
This amendment removes the definition of the “maximum summary term for either-way offences”, as that term has been replaced by references to the general limit in a magistrates’ court.
I would like to ask the Minister why this amendment has been tabled. I am not entirely clear. Could he give us some explanation of the intention behind the amendment? I am pretty sure it will be fine but, if he could just let us know what it is for, that would be helpful.
I am happy to do so. Clause 203 sets out the interpretation of the terms used throughout the Bill. Amendment 58 removes a definition that is no longer required because the term is no longer in the Bill. It is as simple as that. The definition of relevant criminal penalties under the Bill now uses a definition that has been updated in the light of changes to sentencing powers in magistrates’ courts set out in the Judicial Review and Courts Act 2022. The new definition of
“general limit in a magistrates’ court”
is now included in the Interpretation Act 1978, so no definition is required in this Bill.
Question put and agreed to.
Amendment 58 accordingly agreed to.
Amendment made: 59, in clause 203, page 168, line 48, at end insert—
“and references to restrictions on access to a service or to content are to be read accordingly.” —(Paul Scully.)
NC2 states what is meant by restricting users’ access to content, and this amendment makes it clear that the propositions in clause 203 about access read across to references about restricting access.
Question proposed, That the clause, as amended, stand part of the Bill.
Once again, I will abuse the privilege of having a different Minister at the Dispatch Box and mention the fact that “oral communications” appears in line 9 of the definitions, in the definition of “content” that we have already discussed. It is “oral communications” in this part of the Bill but “aural communications” in an earlier part of the Bill. I am still baffled as to why there is a difference. Perhaps both words should be included in both sections, or perhaps there should be some level of consistency throughout the Bill.
The “aural communications” provision that I mentioned earlier, in clause 50, is one of the parts that I am particularly concerned about, because it could create a loophole; it also uses a different spelling of the word. I asked about this last time, and I am not convinced that the answer I got gave me any more clarity than I had previously. I would be keen to understand why there is a difference, whether the difference is intentional and, if so, what the difference is between “oral” and “aural” communications for the purposes of the Bill. My understanding is that oral communications are ones that are said, and aural communications are ones that are heard. For the purposes of the Bill, though, those two things are really the same, unless we are to include user-generated oral communication that no one can possibly hear. That surely does not fit the definitions, because user-generated content counts only if it is user to user: something that other people can see. Surely oral communication would also be aural communication, and in pretty much every instance to which the Bill could possibly apply, the two definitions would mean the same thing. I understand that the Minister may not have the answer at his fingertips, and I would be happy to hear from him later if that would suit him better.
The clause provides legal certainty about the meaning of those terms as used in the Bill: things such as “content”, “encounter”, “taking down” and “terms of service”. That is what the clause is intended to do. It is intentional and is for the reasons the hon. Lady said. Oral means speech and speech only. Aural is speech and other sounds, which is what can be heard on voice calls. That includes music as well. One is speech. The other is the whole gamut.
I am intrigued, because the hon. Member for Aberdeen North makes an interesting point; it is not one I have heard made before. Does the Minister think there is a distinction between oral and aural, where oral is live speech and aural is pre-recorded material that might be played back? Are those two considered distinct?
My knowledge is being tested, so I will write to the hon. Member for Aberdeen North and make that letter available to the Committee. Coming back to the point about oral and aural communications that she made on Tuesday, in relation to another clause on the exclusions: as I said, we have a narrow exemption to ensure that traditional phone calls are not subject to regulation. But that does mean that if a service such as Fortnite, which she spoke about previously, enables adults and children to have one-to-one oral calls, companies will still need to address the surrounding functionality of how that happens, because enabling it might cause harm, for example if an adult can contact an unknown child. That is still captured within the Bill.
Platforms will have to address, for example, the ways in which users can communicate with people who are not on their friends list. Things like that and other ways in which communication can be set up will have to be looked at in the risk assessment. With Discord, for instance, where two people can speak to each other, Discord will have to look at the way those people got into contact with each other and the risks associated with that, rather than the conversation itself, even though the conversation might be the only bit that involves illegality.
It is the functionalities around it that enable the voice conversation to happen.
Question put and agreed to.
Clause 203, as amended, accordingly ordered to stand part of the Bill.
Clause 206
Extent
Question proposed, That the clause stand part of the Bill.
I would like to welcome the Government’s clarification, particularly as an MP representing a devolved nation within the UK. It is important to clarify the distinction between the jurisdictions, and I welcome that this clause does that.
Question put and agreed to.
Clause 206 accordingly ordered to stand part of the Bill.
Clause 207
Commencement and transitional provision
Amendment made: 60, in clause 207, page 173, line 15, leave out “to” and insert “and”.—(Paul Scully.)
This amendment is consequential on amendment 41 (removal of clause 55).
Question proposed, That the clause, as amended, stand part of the Bill.
Exactly. My hon. Friend makes an incredibly important point that goes to the heart of why we are here in the first place. If the platforms were not motivated by commercial interest, and we could trust them to do the right thing on keeping children safe and reducing harm on their platforms, we would not require this legislation in the first place. But sadly, we are where we are, which is why it is even more imperative that we get on with the job, that Ofcom is given the tools to act swiftly, that the time before the provisions come into effect is kept to a minimum, and that this legislation is enacted so that it actually makes a lasting difference.
Ofcom has already been responsible for regulating video-sharing platforms for two years, yet still, despite being in year three, it is only asking websites to provide a plan for how they will become compliant. The reality, then, is that we can expect little on child protection before 2027-28, which creates a massive gap compared with public expectations of when the Bill will be passed. We raised these concerns last time and I took little assurance from the Minister then in post, so I wonder whether the current Minister can improve on his predecessor by setting out a short timeline for exactly when the Bill will be implemented and Ofcom will be able to act.
We all understand the need for the Bill, as my hon. Friend the Member for Warrington North has just pointed out. That is why we have been supportive in Committee and throughout the passage of the Bill. But the measures that the Bill introduces must come into force as soon as is reasonably possible. Put simply, the industry is ready, and users want to be protected online and are ready too. Sadly, it is only the Government and the regulator that could be holding up implementation of the legislation.
The Minister has failed to concede on any of the issues that we have raised in Committee, despite being sympathetic and supportive. His predecessor was also incredibly supportive and sympathetic about everything we raised in Committee, yet failed to take on board a single amendment or issue. I therefore make a plea to this Minister at least to recognise the need to press ahead, and the timescale that is required. We have not sought formally to amend this clause, so I seek the Minister’s assurance that this legislation will be dealt with swiftly. I urge him to work with Labour and SNP colleagues, and with colleagues across the House, to ensure that the legislation and its provisions are enacted without further unnecessary delay.
Our intention is absolutely to get this regime operational as soon as possible after Royal Assent. We have to get to Royal Assent first, so I look forward to working with all parties in the other House to get the legislation to that point. After that, we have to ensure that the necessary preparations are completed effectively and that service providers understand exactly what is expected of them. To answer the point made by the hon. Member for Warrington North about service providers, the key difference from the years that made this legislation necessary is that providers will now know exactly what is expected of them—and it literally is expected of them, with legislation and penalties coming down the line. They should not need to wait for the day one switch-on; they can be testing and working through things months earlier, to ensure that the system works on day one.
The legislation does require some activity that can be carried out only after Royal Assent, such as public consultation and the laying of secondary legislation. The secondary legislation is important. We could have put more in primary legislation, but that would have run counter to our aim of making this as flexible as possible, for the reasons that we have talked about, so that we do not have to keep coming back time and again for fear of the legislation being out of date almost before we get to implementation in the first place.
However, we are doing things in the meantime. Since November 2020, Ofcom has regulated harmful content online through the video-sharing platform regulatory regime. In December 2020, the Government published interim codes of practice on terrorist content and activity and on sexual exploitation and abuse online; those will help to bridge the gap until the regulator becomes operational. In June 2021, we published “safety by design” guidance, together with information in a one-stop shop for companies on protecting children online. In July 2021, we published the first Government online media literacy strategy. We encourage stakeholders, users and families to engage with and help to promote that wealth of material, to minimise online harms and the threat of misinformation and disinformation. But clearly, we all want this measure to be on the statute book and implemented as soon as possible. We have talked a lot about child protection, and that is at the core of what we are trying to do here.
Question put and agreed to.
Clause 207, as amended, accordingly ordered to stand part of the Bill.
New Clause 1
OFCOM’s guidance: content that is harmful to children and user empowerment
“(1) OFCOM must produce guidance for providers of Part 3 services which contains examples of content or kinds of content that OFCOM consider to be, or consider not to be—
(a) primary priority content that is harmful to children, or
(b) priority content that is harmful to children.
(2) OFCOM must produce guidance for providers of Category 1 services which contains examples of content or kinds of content that OFCOM consider to be, or consider not to be, content to which section 14(2) applies (see section 14(8A)).
(3) Before producing any guidance under this section (including revised or replacement guidance), OFCOM must consult such persons as they consider appropriate.
(4) OFCOM must publish guidance under this section (and any revised or replacement guidance).”—(Paul Scully.)
This new clause requires OFCOM to give guidance to providers in relation to the kinds of content that OFCOM consider to be content that is harmful to children and content relevant to the duty in clause 14(2) (user empowerment).
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.
The Government are committed to empowering adults to have greater control over their online experience, and to protecting children from seeing harmful content online. New clause 1 places a new duty on Ofcom to produce and publish guidance for providers of regulated user-to-user services, in relation to the crucial aims of empowering adults and of ensuring that providers have effective systems and processes in place. The guidance will provide further clarity, including through
“examples of content or kinds of content that OFCOM consider to be…primary priority”
or
“priority content that is harmful to children.”
Ofcom will also have to produce guidance that sets out examples of content that it considers to be relevant to the user empowerment duties, as set out in amendment 15 to clause 14.
It is really important that expert opinion is considered in the development of this guidance, and the new clause places a duty on Ofcom to consult with relevant persons when producing sets of guidance. That will ensure that the views of subject matter experts are reflected appropriately.
Labour is pleased to see the introduction of the new clause, which clarifies the role of Ofcom in delivering guidance to providers about their duties. Specifically, the new clause will require Ofcom to give guidance to providers on the kind of content that Ofcom considers to be harmful to children, or relevant to the user empowerment duty in clause 14. That is a very welcome addition indeed.
Labour remains concerned about exactly how these so-called user empowerment tools will work in practice—we have discussed that at length—and let us face it: we have had little assurance from the Minister on that point. We welcome the new clause, as it clarifies what guidance providers can expect to receive from Ofcom once the Bill is finally enacted. We can all recognise that Ofcom has a colossal task ahead of it—the Minister said so himself—so it is particularly welcome that the guidance will be subject to consultation with those that it deems appropriate. I can hope only that that will include the experts, and the many groups that provided expertise, support and guidance on internet regulation long before the Bill even received its First Reading, a long time ago. There are far too many of those experts and groups to list, but it is fundamental that the experts who often spot online harms before they properly emerge be consulted and included in this process if we are to truly capture the priority harms to children, as the new clause intends.
We also welcome the clarification in subsection (2) that Ofcom will be required to provide “examples of content” that would be considered to be—or not be—harmful. These examples will be key to ensuring that the platforms have nowhere to hide when it comes to deciding what is harmful; there will be no grey area. Ofcom will have the power to show them exact examples of what could be deemed harmful.
We recognise, however, that there is subjectivity to the work that Ofcom will have to do once the Bill passes. On priority content, it is most important that providers are clear about what is and is not acceptable; that is why we welcome the new clause, but we do of course wish that the Government applied the same logic to harm pertaining to adults online.
I am also happy to support new clause 1, but I have a couple of questions. It mentions that “replacement guidance” may be provided, which is important because, as we have said a number of times, things will change, and we will end up with a different online experience; that can happen quickly. I am glad that Ofcom has the ability to refresh and update the guidance.
My question is about timelines. There do not seem to be any timelines in the new clause for when the guidance is required to be published. It is key that the guidance be published before companies and organisations have to comply with it. My preference would be for it to be published as early as possible. There may well need to be more work, and updated versions of the guidance may therefore need to be published, but I would rather companies had an idea of the direction of travel, and what they must comply with, as soon as possible, knowing that it might be tweaked. That would be better than waiting until the guidance was absolutely perfect and definitely the final version, but releasing it just before people had to start complying with it. I would like an assurance that Ofcom will make publishing the guidance a priority, so that there is enough time to ensure compliance. We want the Bill to work; it will not work if people do not know what they have to comply with. Assurance on that would be helpful.
I absolutely give that assurance to the hon. Lady; that is important. We all want the measures to be implemented, and the guidance to be out there, as soon as possible. Just now I talked about the platforms bringing in measures as soon as possible, without waiting for the implementation period. They can do that far better if they have the guidance. We are already working with Ofcom to ensure that the implementation period is as short as possible, and we will continue to do so.
Question put and agreed to.
New clause 1 accordingly read a Second time, and added to the Bill.
New Clause 2
Restricting users’ access to content
“(1) This section applies for the purposes of this Part.
(2) References to restricting users’ access to content, and related references, include any case where a provider takes or uses a measure which has the effect that—
(a) a user is unable to access content without taking a prior step (whether or not taking that step might result in access being denied), or
(b) content is temporarily hidden from a user.
(3) But such references do not include any case where—
(a) the effect mentioned in subsection (2) results from the use or application by a user of features, functionalities or settings which a provider includes in a service in compliance with the duty set out in section 14(2) (user empowerment), or
(b) access to content is controlled by another user, rather than the provider.
(4) See also section 203(5).”—(Paul Scully.)
This new clause deals with the meaning of references to restricting users’ access to content, in particular by excluding restrictions resulting from the use of user empowerment tools as described in clause 14.
Brought up, read the First and Second time, and added to the Bill.
New Clause 3
Duty not to act against users except in accordance with terms of service
“(1) A provider of a Category 1 service must operate the service using proportionate systems and processes designed to ensure that the provider does not—
(a) take down regulated user-generated content from the service,
(b) restrict users’ access to regulated user-generated content, or
(c) suspend or ban users from using the service,
except in accordance with the terms of service.
(2) Nothing in subsection (1) is to be read as preventing a provider from taking down content from a service or restricting users’ access to it, or suspending or banning a user, if such an action is taken—
(a) to comply with the duties set out in—
(i) section 9(2) or (3) (protecting individuals from illegal content), or
(ii) section 11(2) or (3) (protecting children from content that is harmful to children), or
(b) to avoid criminal or civil liability on the part of the provider that might reasonably be expected to arise if such an action were not taken.
(3) In addition, nothing in subsection (1) is to be read as preventing a provider from—
(a) taking down content from a service or restricting users’ access to it on the basis that a user has committed an offence in generating, uploading or sharing it on the service, or
(b) suspending or banning a user on the basis that—
(i) the user has committed an offence in generating, uploading or sharing content on the service, or
(ii) the user is responsible for, or has facilitated, the presence or attempted placement of a fraudulent advertisement on the service.
(4) The duty set out in subsection (1) does not apply in relation to—
(a) consumer content (see section (Interpretation of this Chapter));
(b) terms of service which deal with the treatment of consumer content.
(5) If a person is the provider of more than one Category 1 service, the duty set out in subsection (1) applies in relation to each such service.
(6) The duty set out in subsection (1) extends only to the design, operation and use of a service in the United Kingdom, and references in this section to users are to United Kingdom users of a service.
(7) In this section—
‘criminal or civil liability’ includes such a liability under the law of a country outside the United Kingdom;
‘fraudulent advertisement’ has the meaning given by section 35;
‘offence’ includes an offence under the law of a country outside the United Kingdom.
(8) See also section 16 (duties to protect news publisher content).”—(Paul Scully.)
This new clause imposes a duty on providers of Category 1 services to ensure that they do not take down content or restrict users’ access to it, or suspend or ban users, except in accordance with the terms of service.
Brought up, read the First and Second time, and added to the Bill.
New Clause 4
Further duties about terms of service
All services
“(1) A provider of a regulated user-to-user service must include clear and accessible provisions in the terms of service informing users about their right to bring a claim for breach of contract if—
(a) regulated user-generated content which they generate, upload or share is taken down, or access to it is restricted, in breach of the terms of service, or
(b) they are suspended or banned from using the service in breach of the terms of service.
Category 1 services
(2) The duties set out in subsections (3) to (7) apply in relation to a Category 1 service, and references in subsections (3) to (9) to ‘provider’ and ‘service’ are to be read accordingly.
(3) A provider must operate a service using proportionate systems and processes designed to ensure that—
(a) if the terms of service state that the provider will take down a particular kind of regulated user-generated content from the service, the provider does take down such content;
(b) if the terms of service state that the provider will restrict users’ access to a particular kind of regulated user-generated content in a specified way, the provider does restrict users’ access to such content in that way;
(c) if the terms of service state cases in which the provider will suspend or ban a user from using the service, the provider does suspend or ban the user in those cases.
(4) A provider must ensure that—
(a) terms of service which make provision about the provider taking down regulated user-generated content from the service or restricting users’ access to such content, or suspending or banning a user from using the service, are—
(i) clear and accessible, and
(ii) written in sufficient detail to enable users to be reasonably certain whether the provider would be justified in taking the specified action in a particular case, and
(b) those terms of service are applied consistently.
(5) A provider must operate a service using systems and processes that allow users and affected persons to easily report—
(a) content which they consider to be relevant content (see section (Interpretation of this Chapter));
(b) a user who they consider should be suspended or banned from using the service in accordance with the terms of service.
(6) A provider must operate a complaints procedure in relation to a service that—
(a) allows for complaints of a kind mentioned in subsection (8) to be made,
(b) provides for appropriate action to be taken by the provider of the service in response to complaints of those kinds, and
(c) is easy to access, easy to use (including by children) and transparent.
(7) A provider must include in the terms of service provisions which are easily accessible (including to children) specifying the policies and processes that govern the handling and resolution of complaints of a kind mentioned in subsection (8).
(8) The kinds of complaints referred to in subsections (6) and (7) are—
(a) complaints by users and affected persons about content present on a service which they consider to be relevant content;
(b) complaints by users and affected persons if they consider that the provider is not complying with a duty set out in any of subsections (1) or (3) to (5);
(c) complaints by a user who has generated, uploaded or shared content on a service if that content is taken down, or access to it is restricted, on the basis that it is relevant content;
(d) complaints by users who have been suspended or banned from using a service.
(9) The duties set out in subsections (3) and (4) do not apply in relation to terms of service which—
(a) make provision of the kind mentioned in section 9(5) (protecting individuals from illegal content) or 11(5) (protecting children from content that is harmful to children), or
(b) deal with the treatment of consumer content.
Further provision
(10) If a person is the provider of more than one regulated user-to-user service or Category 1 service, the duties set out in this section apply in relation to each such service.
(11) The duties set out in this section extend only to the design, operation and use of a service in the United Kingdom, and references to users are to United Kingdom users of a service.
(12) See also section 16 (duties to protect news publisher content).”—(Paul Scully.)
Subsections (3) to (8) of this new clause impose new duties on providers of Category 1 services in relation to terms of service that allow a provider to take down content or restrict users’ access to it, or to suspend or ban users. Such terms of service must be clear and applied consistently. Subsection (1) of the clause contains a duty which, in part, was previously in clause 20 of the Bill.
Brought up, read the First and Second time, and added to the Bill.
New Clause 5
OFCOM’s guidance about duties set out in sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service)
“(1) OFCOM must produce guidance for providers of Category 1 services to assist them in complying with their duties set out in sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service)(3) to (7).
(2) OFCOM must publish the guidance (and any revised or replacement guidance).”—(Paul Scully.)
This new clause requires OFCOM to give guidance to providers about complying with the duties imposed by NC3 and NC4.
Brought up, read the First and Second time, and added to the Bill.
New Clause 6
Interpretation of this Chapter
“(1) This section applies for the purposes of this Chapter.
(2) “Regulated user-generated content” has the same meaning as in Part 3 (see section 50), and references to such content are to content that is regulated user-generated content in relation to the service in question.
(3) “Consumer content” means—
(a) regulated user-generated content that constitutes, or is directly connected with content that constitutes, an offer to sell goods or to supply services,
(b) regulated user-generated content that amounts to an offence under the Consumer Protection from Unfair Trading Regulations 2008 (S.I. 2008/1277) (construed in accordance with section 53: see subsections (3), (11) and (12) of that section), or
(c) any other regulated user-generated content in relation to which an enforcement authority has functions under those Regulations (see regulation 19 of those Regulations).
(4) References to restricting users’ access to content, and related references, are to be construed in accordance with sections (Restricting users’ access to content) and 203(5).
(5) Content of a particular kind is “relevant content” if—
(a) a term of service, other than a term of service mentioned in section (Further duties about terms of service)(9), states that a provider may or will take down content of that kind from the service or restrict users’ access to content of that kind, and
(b) it is regulated user-generated content.
References to relevant content are to content that is relevant content in relation to the service in question.
(6) “Affected person” means a person, other than a user of the service in question, who is in the United Kingdom and who is—
(a) the subject of the content,
(b) a member of a class or group of people with a certain characteristic targeted by the content,
(c) a parent of, or other adult with responsibility for, a child who is a user of the service or is the subject of the content, or
(d) an adult providing assistance in using the service to another adult who requires such assistance, where that other adult is a user of the service or is the subject of the content.
(7) In determining what is proportionate for the purposes of sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service), the size and capacity of the provider of a service is, in particular, relevant.
(8) For the meaning of “Category 1 service”, see section 83 (register of categories of services).”—(Paul Scully.)
This new clause gives the meaning of terms used in NC3 and NC4.
Brought up, read the First and Second time, and added to the Bill.
New Clause 7
List of emerging Category 1 services
“(1) As soon as reasonably practicable after the first regulations under paragraph 1(1) of Schedule 11 come into force (regulations specifying Category 1 threshold conditions), OFCOM must comply with subsections (2) and (3).
(2) OFCOM must assess each regulated user-to-user service which they consider is likely to meet each of the following conditions, to determine whether the service does, or does not, meet them—
(a) the first condition is that the number of United Kingdom users of the user-to-user part of the service is at least 75% of the figure specified in any of the Category 1 threshold conditions relating to number of users (calculating the number of users in accordance with the threshold condition in question);
(b) the second condition is that—
(i) at least one of the Category 1 threshold conditions relating to functionalities of the user-to-user part of the service is met, or
(ii) if the regulations under paragraph 1(1) of Schedule 11 specify that a Category 1 threshold condition relating to a functionality of the user-to-user part of the service must be met in combination with a Category 1 threshold condition relating to another characteristic of that part of the service or a factor relating to that part of the service (see paragraph 1(4) of Schedule 11), at least one of those combinations of conditions is met.
(3) OFCOM must prepare a list of regulated user-to-user services which meet the conditions in subsection (2).
(4) The list must contain the following details about a service included in it—
(a) the name of the service,
(b) a description of the service,
(c) the name of the provider of the service, and
(d) a description of the Category 1 threshold conditions by reference to which the conditions in subsection (2) are met.
(5) OFCOM must take appropriate steps to keep the list up to date, including by carrying out further assessments of regulated user-to-user services.
(6) OFCOM must publish the list when it is first prepared and each time it is revised.
(7) When assessing whether a service does, or does not, meet the conditions in subsection (2), OFCOM must take such steps as are reasonably practicable to obtain or generate information or evidence for the purposes of the assessment.
(8) An assessment for the purposes of this section may be included in an assessment under section 83 or 84 (as the case may be) or carried out separately.”—(Paul Scully.)
This new clause requires OFCOM to prepare and keep up to date a list of regulated user-to-user services that have 75% of the number of users of a Category 1 service, and at least one functionality of a Category 1 service or one required combination of a functionality and another characteristic or factor of a Category 1 service.
Brought up, read the First and Second time, and added to the Bill.
New Clause 8
Child user empowerment duties
“(1) This section sets out the duties to empower child users which apply in relation to Category 1 services.
(2) A duty to include in a service, to the extent that it is proportionate to do so, features which child users may use or apply if they wish to increase their control over harmful content.
(3) The features referred to in subsection (2) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—
(a) reduce the likelihood of the user encountering priority content that is harmful, or particular kinds of such content, by means of the service, or
(b) alert the user to the harmful nature of priority content that is harmful that the user may encounter by means of the service.
(4) A duty to ensure that all features included in a service in compliance with the duty set out in subsection (2) are made available to all child users.
(5) A duty to include clear and accessible provisions in the terms of service specifying which features are offered in compliance with the duty set out in subsection (2), and how users may take advantage of them.
(6) A duty to include in a service features which child users may use or apply if they wish to filter out non-verified users.
(7) The features referred to in subsection (6) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—
(a) prevent non-verified users from interacting with content which that user generates, uploads or shares on the service, and
(b) reduce the likelihood of that user encountering content which non-verified users generate, upload or share on the service.
(8) A duty to include in a service features which child users may use or apply if they wish to only encounter content by users they have approved.
(9) A duty to include in a service features which child users may use or apply if they wish to filter out private messages from—
(a) non-verified users, or
(b) adult users, or
(c) any user other than those on a list approved by the child user.
(10) In determining what is proportionate for the purposes of subsection (2), the following factors, in particular, are relevant—
(a) all the findings of the most recent child risk assessment (including as to levels of risk and as to nature, and severity, of potential harm), and
(b) the size and capacity of the provider of a service.
(11) In this section “non-verified user” means a user who has not verified their identity to the provider of a service (see section 58(1)).
(12) In this section references to features include references to functionalities and settings.”—(Kirsty Blackman.)
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.
That was some stretch of procedure, Dame Angela, but we got there in the end. This new clause is about child user empowerment duties. I am really pleased that the Government have user empowerment duties in the Bill—they are a good thing—but I am confused as to why they apply only to adult users, and why children do not deserve the same empowerment rights over what they access online.
In writing the new clause, I pretty much copied clause 14, before there were any amendments to it, and added a couple of extra bits: subsections (8) and (9). In subsection (8), I have included:
“A duty to include in a service features which child users may use or apply if they wish to only encounter content by users they have approved.”
That would go a step further than the verification process and allow users to approve only people who are in their class at school, people with whom they are friends, or even certain people in their class at school, and to not have others on that list. I know that young people playing Fortnite—I have mentioned Fortnite a lot because people play it a lot—or Roblox are contacted by users whom they do not know, and there is no ability for young people to switch off some of the features while still being able to contact their friends. Users can either have no contact from anyone, or they can have a free-for-all. That is not the case for all platforms, but a chunk of them do not let users speak only to people on their friends list, or receive messages only from people on the list.
My proposed subsection (8) would ensure that children could have a “white list” of people who they believe are acceptable, and who they want to be contacted by, and could leave others off the list. That would help tackle not just online child exploitation, but the significant online bullying that teachers and children report. Children have spoken of the harms they experience as a result of people bullying them and causing trouble online; the perpetrators are mainly other children. Children would be able to remove such people from the list and so would not receive any content, messages or comments from those who make their lives more negative.
Subsection (9) is related to subsection (8); it would require a service to include
“features which child users may use or apply if they wish to filter out private messages from—
(a) non-verified users, or
(b) adult users, or
(c) any user other than those on a list approved by the child user.”
Adults looking to exploit children will use private messaging on platforms such as Instagram. Instagram has to know how old its users are, so anybody who is signed up to it will have had to provide it with their date of birth. It is completely reasonable for a child to say, “I want to filter out everything from an adult.” When we talk about children online, we are talking about anybody from zero to 18, which is a very wide age range. Some of those people will be working and paying bills, but will not have access to the empowerment features that adults have access to, because they have not yet reached that magical threshold. Some services may decide to give children access to user empowerment tools, but there is no requirement to. The only requirement in the Bill on user empowerment tools is for adults. That is not fair.
Children should have more control over the online environment. We know how many children feel sad as a result of their interactions online, and how many encounter content online that they wish they had never seen and cannot unsee. We should give them more power over that, and more power to say, “No, I don’t want to see that. I don’t want people I don’t know contacting me. I don’t want to get unsolicited messages. I don’t want somebody messaging me, pretending that they are my friend or that they go to another school, when they are in fact an adult, and I won’t realise until it is far too late.”
The Bill applies to people of all ages. All of us make pretty crappy decisions sometimes. That includes teenagers, but they also make great decisions. If there were a requirement for platforms to provide these tools, teenagers could choose to make their online experience better. I do not think that this omission was intentional, or that the Government set out to disadvantage children when they wrote the adult user empowerment clauses. I think they thought that it would be really good to have those clauses in the Bill, in order to give users a measure of autonomy over their time and interactions online. However, they have failed to include the same thing for children. It is a gap.
I appreciate that there are child safety duties, and that there is a much higher bar for platforms that have child users, but children are allowed a level of autonomy; look at the UN convention on the rights of the child. We give children choices and flexibilities; we do not force them to do every single thing they do, all day every day. We recognise that children should be empowered to make decisions where they can.
I know the Government will not accept the provision—I am not an idiot. I have never moved a new clause in Committee that has been accepted, and I am pretty sure that it will not happen today. However, if the Government were to say that they would consider, or even look at the possibility of, adding child user empowerment duties to the Bill, the internet would be a more pleasant place for children. They are going to use it anyway; let us try to improve their online experience even more than the Bill does already.
The hon. Member for Aberdeen North has outlined the case for the new clause eloquently and powerfully. She may not press it to a Division, if the Minister can give her assurances, but if she did, she would have the wholehearted support of the Opposition.
We see new clause 8 as complementing the child safety duties in the legislation. We fully welcome provisions that provide children with greater power and autonomy in choosing to avoid exposure to certain types of content. We have concerns about how the provisions would work in practice, but that issue has more to do with the Government’s triple-shield protections than the new clause.
The Opposition support new clause 8 because it aims to provide further protections, in addition to the child safety duties, to fully protect children from harmful content and to empower them. It would empower and enable them to filter out private messages from adults or non-verified users. We also welcome the measures in the new clause that require platforms and service providers to design accessible terms of service. That is absolutely vital to best protect children online, which is why we are all here, and what the legislation was designed for.
The aim of the user empowerment duties is to give adults more control over certain categories of legal content that some users will welcome greater choice over. Those duties also give adult users greater control over who they interact with online, but these provisions are not appropriate for children. As the hon. Member for Aberdeen North acknowledged, there are already separate duties on services likely to be accessed by children, in scope of part 3, to undertake comprehensive risk assessments and to comply with safety duties to protect children from harm. That includes requirements to assess how specific functionalities may facilitate the spread of harmful content, as outlined in clause 10(6)(e), and to protect children from harmful content, including content that has been designated as priority harmful content, by putting in place age-appropriate protections.
As such, children will not need to be provided with tools to control any harmful content they see, as the platform will need to put in place age-appropriate protections. We do not want to give children an option to choose to see content that is harmful to them. The Bill also outlines in clause 11(4)(f) that, where it is proportionate to do so, service providers will be required to take measures in certain areas to meet the child-safety duties. That includes functionalities allowing for control over content that is encountered. It would not be appropriate to require providers to offer children the option to verify their identity, due to the safeguarding and data protection risks that that would pose. Although we expect companies to use technologies such as age assurance to protect children on their service, they would only be used to establish age, not identity.
The new clause would create provisions to enable children to filter out private messages from adults and from users who are not on an approved list, but the Bill already contains provisions that address the risks of adults contacting children. There are also requirements on service providers to consider how their service could be used for grooming or child sexual exploitation and abuse, and to apply proportionate measures to mitigate those risks. Service providers already have to assess and mitigate the risks: they have to produce the risk assessment, and within it they could choose to mitigate risk by preventing unknown users from contacting children.
For the reasons I have set out, the Bill already provides strong protections for children on services that they are likely to access. I am therefore not able to accept the new clause, and I hope that the hon. Member for Aberdeen North will withdraw it.
That was one of the more disappointing responses from the Minister, I am afraid. I would appreciate it if he could write to me to explain which part of the Bill provides protection to children from private messaging. I would be interested to have another look at that, so it would be helpful if he could provide details.
We do not want children to choose to see unsafe stuff, but the Bill is not strong enough on things such as private messaging or the ability of unsolicited users to contact children, because it relies on providers noticing those risks in their risk assessments and putting in place mitigations only once they have recognised the problem. It relies on providers being willing to act to keep children safe in a way that they have not yet done.
When I am assisting my children online, and making rules about how they behave online, the thing I worry most about is unsolicited contact: what people might say to them online, and what they might hear from adults online. I am happy enough for them to talk to their friends online—I think that is grand—but I worry about what adults will say to them, whether by private text or voice messages, or when they are playing a game online in which a group of people working together as a team can broadcast their voices to the others and say whatever they want.
Lastly, one issue we have seen on Roblox, which is marketed as a children’s platform, is people creating games within it—people creating sex dungeons within a child’s game, or having conversations with children and asking the child to have their character take off their clothes. Those things have happened on that platform, and I am concerned that there is not enough protection in place, particularly to address that unsolicited contact. Given the disappointing response from the Minister, I am keen to push this clause to a vote.
Question put, That the clause be read a Second time.
I rise to recognise the spirit and principle behind new clause 9, while, of course, listening carefully to the comments made by my hon. Friend the Member for Folkestone and Hythe. He is right to raise those concerns, but my question is: is there an industry-specific way in which the same responsibility and liability could be delivered?
I recognise too that the Bill is hugely important. It is a good Bill that has child protection at its heart. It also contains far more significant financial penalties than we have previously seen—as I understand it, up to £18 million or 10% of qualifying worldwide revenue, whichever is greater. This will drive some change, but it comes against the backdrop of multibillion-pound technology companies.
I would be interested to understand whether a double lock around board-level responsibility might further protect children from some of the harrowing and harmful content we see online. What we need is nothing short of transformation and significant culture change. Just today, The Guardian published an article about TikTok and a study by the Centre for Countering Digital Hate, which found that teenagers who demonstrated an interest in self-harm and eating disorders had that content pushed on to them by algorithms within minutes. That is most troubling.
We need significant, serious and sustained culture change. There is precedent in other sectors, as has been mentioned, and there was a previous recommendation to this effect, so clearly there is merit in the idea. My understanding is that there is strong public support, because the public recognise that this new responsibility can be strengthened only by liability. If there is board-level liability, that will drive priorities and resources, which will bring about the kind of change we are looking for. I look forward to what the Minister might share today; this has been a good opportunity to bring these issues into further consideration, so that they might be carried over into subsequent stages of this excellent Bill.
I would like to build on the excellent comments from my colleagues and to speak about child sexual abuse material. I thank my hon. Friends the Members for Penistone and Stocksbridge (Miriam Cates) and for Stone for tabling the amendment. I am very interested in how we can use the excellent provisions in the Bill to keep children safe from child sexual abuse material online. I am sure the Committee is aware of the devastating impact of such material.
Sexual abuse imagery—of girls in particular—is increasingly prevalent. We know that 97% of this material in 2021 showed female children. The Internet Watch Foundation took down a record-breaking 252,000 URLs that had images of children being raped, and seven in 10 of those images were of children aged 11 to 13. Unfortunately, the National Crime Agency estimates that between 550,000 and 850,000 people in the UK are searching for such material on the internet. They are actively looking for it, and at the moment they are able to find it.
My concern is with how we use what is in the Bill already to instil a top-down culture in companies, because this is about culture change in the boardroom, so that safety is considered with every decision. I have read the proceedings from previous sittings, and I recognise that the Government and Ministers have said that we have sufficient provisions to protect children, but I think there is a little bit of a grey area with tech companies.
I want to mention Apple and the update that it spent quite a few years planning: an update that would have automatically scanned for child sexual abuse material. Apple withdrew it following a backlash from encryption and privacy experts, who claimed that it would undermine the privacy and security of iCloud users and make people less safe on the internet. Having previously said that it would pause the update to improve it, Apple now says that it has stopped it altogether and is vastly expanding its end-to-end encryption, even though law enforcement agencies around the world, including our own UK agencies, have expressed serious concerns, because it makes investigation and prosecution more challenging. None of us is a technical expert, and I do not believe that we are in a position to judge how legitimate Apple’s pause is. What we do know is that while the pause continues, the risks to children are still there, proliferating online.
We understand completely that countering this material involves a complicated balance and that the tech giants need to walk a fine line between keeping users safe and keeping their data safe. But the question is this: if Apple and others continue to delay or backtrack, will merely failing to comply with an information request, which is what is in the Bill now, be enough to protect children from harm? Could they delay indefinitely and still be compliant with the Bill? That is what I am keen to hear from the Minister. I would be grateful if he could set out why he thinks that individuals who have the power to prevent the harmful content that has torn apart the lives of so many young people and their families should not face criminal consequences if they fail to do so. Can he reassure us as to how he thinks that the Bill can protect so many children—it is far too many children—from this material online?
Labour supports new clause 9, as liability is an issue that we have repeatedly raised throughout the passage of the Bill—most recently, on Report. As colleagues will be aware, the new clause would introduce criminal liability for directors who failed to comply with their duties. This would be an appropriate first step in establishing a direct relationship between the senior management of platforms and companies and their responsibilities to protect children from significant harm. As we have heard, this measure would drive a more effective culture of awareness and accountability in relation to online safety at the top of, and throughout, the entire regulated firm. It would go some way towards ensuring that online safety sits at the heart of internal governance structures. The Bill must go further actively to promote cultural change and put online safety at the forefront of business models; it must ensure that these people understand that keeping people safe comes before any profit. A robust corporate and senior management liability scheme is needed, and it needs to be one that imposes personal liability on directors when they put children at risk.
The Minister knows as well as I do that the benefits of doing so would be strong. We have only to turn to the coroner’s comments in the tragic case of Molly Russell’s death—which I know we are all mindful of as we debate this Bill—to fully understand the damaging impact of viewing harmful content online. I therefore urge the Minister to accept new clause 9, which we wholeheartedly support.
The Government recognise that the intent behind the new clause is to create new criminal offences of non-compliance with selected duties: it would establish a framework of personal criminal offences, punishable by fines or imprisonment, under which providers would commit an offence if they did not comply with certain duties.
We all want this Bill to be effective. We want it to be on the statute book. It is a question of getting that fine balance right, so that we can properly hold companies to account for the safety of their users. The existing approach to enforcement and senior manager liability strikes the right balance between robust enforcement and deterrent, and ensuring that the UK remains an attractive place to do business. We are confident that the Bill as a whole will bring about the change necessary to ensure that users, especially younger users, are kept safe online.
This new clause tries to criminalise non-compliance with the Bill's duties. Exactly what activity would be criminalised is not obvious from the new clause, so it could be difficult for individuals to foresee exactly what type of conduct would constitute an offence. That could lead to unintended consequences, with tech executives driving an over-zealous approach to content take-down for fear of imprisonment, potentially removing large volumes of innocuous content and so curtailing open debate.
Does the Minister not think that the freedom of speech provisions and the requirement to stick to terms of service, which he has put in as safeguards, are strong enough, then?
I come back to this point: I think that if people were threatened with personal legal liability, that would stifle innovation and make them over-cautious in their approach. That would disturb the balance that we have tried to achieve in this iteration of the Bill. Trying to keep internet users, particularly children, safe has to be achieved alongside free speech and not at its expense.
Further, the threat of criminal prosecution for failing to comply with numerous duties also runs a real risk of damaging the attractiveness of the UK as a place to start up and grow a digital business. I want internet users in the future to be able to access all the benefits of the internet safely, but we cannot achieve that if businesses avoid the UK because our enforcement regime is so far out of kilter with international comparators. Instead, the most effective way to ensure that services act to protect people online is through the existing framework and the civil enforcement options that are already provided for in the Bill, overseen by an expert regulator.
I appreciate the Minister’s comments, but from what my hon. Friends the Members for Folkestone and Hythe, for Eastbourne, and for Redditch said this morning about TikTok—these sorts of images get to children within two and a half minutes—it seems that there is a cultural issue, which the hon. Member for Pontypridd mentioned. Including new clause 9 in the Bill would really ram home the message that we are taking this seriously, that the culture needs to change, and that we need to do all that we can. I hope that the Minister will speak to his colleagues in the Ministry of Justice to see what, if anything, can be done.
I forgot to respond to my hon. Friend’s question about whether I would meet him. I will happily meet him.
I appreciate that. We will come back to this issue on Report, but I beg to ask leave to withdraw the motion.
Clause, by leave, withdrawn.
Question proposed, That the Chair do report the Bill, as amended, to the House.
It is usual at this juncture for there to be a few thanks and niceties, if people wish to give them.
I apologise, Dame Angela; I did not realise that I had that formal role, but you are absolutely right.
Dame Angela, you know that I love niceties. It is Christmas—the festive season! It is a little bit warmer today because we changed room, but we remember the coldness; it reminds us that it is Christmas.
I thank you, Dame Angela, and thank all the Clerks in the House for bringing this unusual recommittal to us all, and schooling us in the recommittal process. I thank Members from all parts of the House for the constructive way in which the Bill has been debated over the two days of recommittal. I also thank the Doorkeepers and my team, many of whom are on the Benches here or in the Public Gallery. They are watching and WhatsApping—ironically, using end-to-end encryption.
I thank you, too, Dame Angela. I echo the Minister’s sentiments, and thank all the Clerks, the Doorkeepers, the team, and all the stakeholders who have massively contributed, with very short turnarounds, to the scrutiny of this legislation. I have so appreciated all that assistance and expertise, which has helped me, as shadow Minister, to compile our comments on the Bill following the Government’s recommittal of it to Committee, which is an unusual step. Huge thanks to my colleagues who joined us today and in previous sittings, and to colleagues from across the House, and particularly from the SNP, a number of whose amendments we have supported. We look forward to scrutinising the Bill further when it comes back to the House in the new year.
(1 year, 10 months ago)
Commons Chamber
It is a pleasure to follow my right hon. Friend the Member for Chelmsford (Vicky Ford), who made a very powerful speech, and I completely agree with her about the importance of treating eating disorders as being of the same scale of harm as other things in the Bill.
I was the media analyst for Merrill Lynch about 22 years ago, and I made a speech about the future of media in which I mentioned the landscape changing towards one of self-generated media. However, I never thought we would get to where we are now, or foresaw its effect. I was in the Pizza Express on Gloucester Road the other day at birthday party time, and an 11-year-old boy standing in the queue was doom-scrolling TikTok videos rather than talking to his friends, which I thought was a really tragic indication of where we have got to.
Digital platforms are also critical sources of information and of our public discourse. Across the country, people gather up to 80% of their information from such sources, but we should not place our trust in them. Their algorithms, which promote and depromote, and their interfaces, which engage, are designed, as we have heard, to make people addicted to the peer validation and augmentation of particular points of view. They are driving people down tribal rabbit holes to the point where they cannot talk to each other or even listen to another point of view. It is no wonder that 50% of young people are unhappy or anxious when they use social media; these algorithmic models are the problem. Trust in these platforms is wrong: their promotion or depromotion of messages and ideas is opaque, often subjective and subject to inappropriate influence.
It is right that we tackle illegal activity and that harms to children and the vulnerable are addressed, and I support the attempt to do that in the Bill. Those responsible for the big platforms must be held to account for how they operate them, but trusting in those platforms is wrong, and I worry that compliance with their terms of service might become a tick-box absolution of their responsibility for unhappiness, anxiety and harm.
What about harm to our public sphere, our discourse, and our processes of debate, policymaking and science? To trust the platforms in all that would be wrong. We know they have enabled censorship. Elon Musk’s release of the Twitter files has shown incontrovertibly that the big digital platforms actively censor people and ideas, and not always according to reasonable moderation. They censor people according to their company biases, by political request, or with and on behalf of the three-letter Government agencies. They censor them at the behest of private companies, or to control information on their products and the public policy debate around them. Censorship itself creates mistrust in our discourse. To trust the big platforms always to do the right thing is wrong. It is not right that they should be able to hide behind their terms of service, bury issues in the Ofcom processes in the Bill, or potentially pay lip service to a tick-box exercise of merely “having regard” to the importance of freedom of expression. They might think they can just write a report, hire a few overseers, and then get away scot-free with the cynical accumulation and sale of their addicted users’ data and the manipulation of their views.
The Government have rightly acknowledged that addressing such issues of online safety is a work in progress, but we must not think that the big platforms are that interested in helping. They and their misery models are the problem. I hope that the Government, and those in the other place, will include in the Bill stronger duties to stop things that are harmful, to promote freedom of expression properly, to ensure that people have ready and full access to the full range of ideas and opinions, and to be fully transparent in public and real time about the way that content is promoted or depromoted on their platforms. Just to trust in them is insufficient. I am afraid the precedent has been set that digital platforms can be used to censor ideas. That is not the future; that is happening right now, and when artificial intelligence comes, it will get even worse. I trust that my colleagues on the Front Bench and in the other place will work hard to improve the Bill as I know it can be improved.
I strongly support the Bill. This landmark piece of legislation promises to put the UK at the front of the pack, and I am proud to see it there. We must tackle online abuse while protecting free speech, and I believe the Bill gets that balance right. I was pleased to serve on the Bill Committee in the last Session, and I am delighted to see it returning to the Chamber. The quicker it can get on to the statute book, the more children we can protect from devastating harm.
I particularly welcome the strengthened protections for children, which require platforms to clearly articulate in their terms of service what they are doing to enforce age requirements on their site. That will go some way to reassuring parents that their children’s developing brains will not be harmed by early exposure to toxic, degrading, and demeaning extreme forms of pornography. Evidence is clear that early exposure over time warps young girls’ views of what is normal in a relationship, with the result that they struggle to form healthy equal relationships. For boys, that type of sexual activity is how they learn about sex, and it normalises abusive, non-consensual and violent acts. Boys grow up into men whose neural circuits become habituated to that type of imagery. They actually require it, regardless of the boundaries of consent that they learn about in their sex education classes—I know this is a difficult and troubling subject, but we must not be afraid to tackle it, which is what we are doing with the Bill. It is well established that the rise of that type of pornography on the internet over time has driven the troubling and pernicious rise in violence against women and girls, perpetrated by men, as well as peer-on-peer child sexual abuse and exploitation.
During Committee we had a good debate about the need for greater criminal sanctions to hold directors individually to account and drive a more effective safety culture in the boardroom. I am proud to serve in the Chamber with my hon. Friends the Members for Stone (Sir William Cash) and for Penistone and Stocksbridge (Miriam Cates). I have heard about all their work on new clause 2 and commend them heartily for it. I listened carefully to the Minister’s remarks in Committee and thank him and the Secretary of State for their detailed engagement.
The evidence we have received is that it is parents who need the powers. I want to normalise the ability to turn off anonymised accounts. I think we will see children do that very naturally. We should also try to persuade their parents to take those stances and to have those conversations in the home. I obviously need to take up the matter with the hon. Lady and think carefully about it as matters proceed through the other place.
We know that parents are very scared about what their children see online. I welcome what the Minister is trying to do with the Bill and I welcome the legislation and the openness to change it. These days, we are all called rebels whenever we do anything to improve legislation, but the reality is that that is our job. We are sending this legislation to the other House in a better shape.
There is a lot to cover in the short time I have, but first let me thank Members for their contributions to the debate. We had great contributions from the hon. Member for Pontypridd (Alex Davies-Jones), my right hon. Friend the Member for Witham (Priti Patel) and the right hon. Member for Barking (Dame Margaret Hodge)—I have to put that right, having not mentioned her last time—as well as from my hon. Friend the Member for Gosport (Dame Caroline Dinenage); the hon. Member for Aberdeen North (Kirsty Blackman); the former Secretary of State, my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright); and the hon. Members for Plymouth, Sutton and Devonport (Luke Pollard), for Reading East (Matt Rodda) and for Leeds East (Richard Burgon).
I would happily meet the hon. Member for Plymouth, Sutton and Devonport to talk about incel content, as he requested, and the hon. Members for Reading East and for Leeds East to talk about Olly Stephens and Joe Nihill. Those are two really tragic examples and it was good to hear the tributes to them and their being mentioned in this place in respect of the changes in the legislation.
We had great contributions from my right hon. Friend the Member for South Northamptonshire (Dame Andrea Leadsom), the hon. Member for Strangford (Jim Shannon) and my hon. Friend the Member for Dover (Mrs Elphicke). I am glad that my hon. Friend the Member for Stone (Sir William Cash) gave a three-Weetabix speech—I will have to look in the Tea Room for the Weetabix he has been eating.
There were great contributions from my hon. Friends the Members for Penistone and Stocksbridge (Miriam Cates) and for Great Grimsby (Lia Nici), from my right hon. Friend the Member for Chelmsford (Vicky Ford) and from my hon. Friend the Member for Yeovil (Mr Fysh). The latter talked about doom-scrolling; I recommend that he speaks to my right hon. Friend the Member for South Holland and The Deepings (Sir John Hayes), whose quoting of G. K. Chesterton shows the advantages of reading books rather than scrolling through a phone. I also thank my hon. Friends the Members for Redditch (Rachel Maclean), for Watford (Dean Russell) and for Stroud (Siobhan Baillie).
I am also grateful for the contributions during the recommittal process. The changes made to the Bill during that process have strengthened the protections that it can offer.
We reviewed new clause 2 carefully, and I am sympathetic to its aims. We have demonstrated our commitment to strengthening protections for children elsewhere in the Bill by tabling a series of amendments at previous stages, and the Bill already includes provisions to make senior managers liable for failing to prevent a provider from committing an offence and for failing to comply with information notices. We are committed to ensuring that children are safe online, so we will work with those Members and others to bring to the other place an effective amendment that delivers our shared aims of holding people accountable for their actions in a way that is effective and targeted at child safety, while ensuring that the UK remains an attractive place for technology companies to invest and grow.
We need to take time to get this right. We intend to base our amendments on the Irish Online Safety and Media Regulation Act 2022, which, ironically, was largely based on our work here, and which introduces individual criminal liability for failure to comply with the notice to end contravention. In line with that approach, the final Government amendment, at the end of the ping-pong between the other place and this place, will be carefully designed to capture instances in which senior managers, or those purporting to act in that capacity, have consented or connived in ignoring enforceable requirements, risking serious harm to children. The criminal penalties, including imprisonment or fines, will be commensurate with those applying to similar offences. While the amendment will not affect those who have acted in good faith to comply in a proportionate way, it will give the Act additional teeth—as we have heard—to deliver the change that we all want, and ensure that people are held to account if they fail to protect children properly.
As was made clear by my right hon. Friend the Member for Witham, child protection and strong implementation are at the heart of the Bill. Its strongest protections are for children, and companies will be held accountable for their safety. I cannot guarantee the timings for which my right hon. Friend asked, but we will not dilute our commitment. We have already started to speak to companies in this sphere, and I will also continue to work with her and others.
My hon. Friend has rightly prioritised the protection of children. He will recall that throughout the debate, a number of Members have asked the Government to consider the amendment that will be tabled by Baroness Kidron, which will require coroners to have access to data in cases in which the tragic death of a child may be related to social media and other online activities. Is my hon. Friend able to give a commitment from the Dispatch Box that the Government will look favourably on that amendment?
Coroners already have some powers in this area, but we are aware of instances raised by my right hon. Friend and others in which that has not been the case. We will happily work with Baroness Kidron, and others, and look favourably on changes where they are necessary.
I entirely agree that our focus has been on protecting children, but is the Minister as concerned as I am about misinformation and disinformation, and about the societal impacts on our democracy, not just in this country but elsewhere? The hon. Member for Watford suggested a Committee that could monitor such impacts. Is that something the Minister will reconsider?
For the purpose of future-proofing, we have tried to make the Bill as flexible and as technologically neutral as possible so that it can adapt to changes. I think we will need to review it, and indeed I am sure that, as technology changes, we will come back with new legislation in the future to ensure that we continue to be world-beating—but let us see where we end up with that.
May I follow up my hon. Friend’s response to our right hon. Friend the Member for Bromsgrove (Sajid Javid)? If it is the case that coroners cannot access data and information that they need in order to go about their duties—which was the frustrating element in the Molly Russell case—will the Government be prepared to close that loophole in the House of Lords?
We will certainly work with others to address that, and if there is a loophole, we will seek to act, because we want to ensure—
I am grateful to the Minister for giving way. He was commenting on my earlier remarks about new clause 2 and the specifics around a timetable. I completely recognise that much of this work is under development. In my remarks, I asked for a timetable on engagement with the tech firms as well as transparency to this House on the progress being made on developing the regulations around criminal liability. It is important that this House sees that, and that we follow every single stage of that process.
I thank my right hon. Friend for that intervention. We want to have as many conversations as possible in this area with Members on all sides, and I hope we can be as transparent as possible in that operation. We have already started the conversation. The Secretary of State and I met some of the big tech companies just yesterday to talk about exactly this area.
My hon. Friend the Member for Dover, my right hon. Friends the Members for South Holland and The Deepings and for Maidenhead (Mrs May) and others are absolutely right to highlight concerns about illegal small boat crossings and the harm that can be caused to people crossing in dangerous situations. The use of highly dangerous methods to enter this country, including unseaworthy, small or overcrowded boats and refrigerated lorries, presents a huge challenge to us all. Like other forms of serious and organised crime, organised immigration crime endangers lives, has a corrosive effect on society, puts pressure on border security resources and diverts money from our economy.
As the Prime Minister has said, stopping these crossings is one of the Government’s top priorities for the next year. The situation needs to be resolved and we will not hesitate to take action wherever that can have the most effect, including through this Bill. Organised crime groups continue to facilitate most migrant journeys to the UK and have no respect for human life, exploiting vulnerable migrants, treating them as commodities and knowingly putting people in life-threatening situations. Organised crime gangs are increasingly using social media to facilitate migrant crossings and we need to do more to prevent and disrupt the crimes facilitated through these platforms. We need to share best practice, improve our detection methods and take steps to close illegal crossing routes as the behaviour and methods of organised crime groups evolve.
However, amendment 82 risks having unforeseen consequences for the Bill. It could bring into question the meaning of the term “content” elsewhere in the Bill, with unpredictable implications for how the courts and companies would interpret it. Following constructive discussions with my hon. Friend the Member for Dover and my right hon. Friend the Member for Maidenhead, I can now confirm that in order to better tackle illegal immigration encouraged by organised gangs, the Government will add section 2 of the Modern Slavery Act 2015 to the list of priority offences. Section 2 makes it an offence to arrange or facilitate the travel of another person, including through recruitment, with a view to their exploitation.
We will also add section 24 of the Immigration Act to the priority offences list in schedule 7. Although the offences in section 24 cannot be carried out online, paragraph 33 of the schedule states that priority illegal content includes the inchoate offences relating to the offences listed. Therefore, aiding, abetting, counselling and conspiring in those offences by posting videos of people crossing the channel that show the activity in a positive light could be an offence committed online, and would therefore fall within priority illegal content. The result of this amendment would therefore be that platforms would have to proactively remove that content. I am grateful to my hon. Friend the Member for Dover and my right hon. Friends the Members for South Holland and The Deepings and for Maidenhead for raising this important issue, and I would be happy to offer them a meeting with my officials to discuss the drafting of this amendment ahead of it being tabled in the other place.
We recognise the strength of feeling on the issue of harmful conversion practices and remain committed to protecting people from these practices and making sure that they can live their lives free from the threat of harm or abuse. We have had constructive engagement with my hon. Friend the Member for Rutland and Melton (Alicia Kearns) on her amendment 84, which seeks to prevent children from seeing harmful online content on conversion practices. It is right that this issue is tackled through a dedicated and tailored legislative approach, which is why we are announcing today that the Government will publish a draft Bill to set out a proposed approach to banning conversion practices. This will apply to England and Wales. The Bill will protect everybody, including those targeted on the basis of their sexuality or being transgender. The Government will publish the Bill shortly and will ask for pre-legislative scrutiny by a Joint Committee in this parliamentary Session.
This is a complex area and pre-legislative scrutiny exists to help ensure that any Bill introduced to Parliament does not cause unintended consequences. It will also ensure that the Bill benefits from stakeholder expertise and input from parliamentarians. The legislation must not, through a lack of clarity, harm the growing number of children and young adults experiencing gender-related distress through inadvertently criminalising or chilling legitimate conversations that parents or clinicians may have with children. This is an important issue, and it needs the targeted and robust approach that a dedicated Bill would provide.
I am afraid I have only three minutes, so I am not able to give way.
The Government cannot accept the Labour amendments that would re-add the adult safety duties and the concept of content that is harmful to adults. These duties and the definition of harmful content were removed from the Bill in Committee to protect free speech and to ensure that the Bill does not incentivise tech companies to censor legal content. It is not appropriate for the Government to decide whether legal content is harmful to adult users, and then to require companies to risk assess and set terms for such content. Many stakeholders and parliamentarians are justifiably concerned about the consequences of doing so, and I share those concerns. However, the Government recognise the importance of giving users the tools and information they need to keep themselves safe online, which is why we have introduced to the Bill a fairer, simpler approach for adults—the triple shield.
Members have talked a little about user empowerment. I will not have time to cover all of that, but the Government believe we have struck the right balance of empowering adult users on the content they see and engage with online while upholding the right to free expression. For those reasons, I am not able to accept these amendments, and I hope the hon. Members for Aberdeen North (Kirsty Blackman) and for Ochil and South Perthshire (John Nicolson) will not press them to a vote.
The Government amendments are consequential on removing the “legal but harmful” sections, which were debated extensively in Committee.
The Government recognise the concern of my hon. Friend the Member for Stroud about anonymous online abuse, and I applaud her important campaigning in this area. We expect Ofcom to recommend effective tools for compliance, with the requirement that these tools can be applied by users who wish to filter out non-verified users. I agree that the issue covered by amendment 52 is important, and I am happy to continue working with her to deliver her objectives in this area.
My right hon. Friend the Member for Chelmsford spoke powerfully, and we take the issue incredibly seriously. We are committed to introducing a new communications offence of intentional encouragement and assistance of self-harm, which will apply whether the victim is a child or an adult.
I do not have time, but I thank all Members who contributed to today’s debate. I pay tribute to my officials and to all the Ministers who have worked on this Bill over such a long time.
I beg to ask leave to withdraw the clause.
Clause, by leave, withdrawn.
I beg to move, That the Bill be now read the Third time.
It has been a long road to get here, and it has required a huge team effort that has included Members from across the House, the Joint Committee, Public Bill Committees, the Ministers who worked on this over the years in the Department for Digital, Culture, Media and Sport and my predecessors as Secretaries of State. Together, we have had some robust and forthright debates, and it is thanks to Members’ determination, expertise and genuine passion on this issue that we have been able to get to this point today. Our differences of opinion across the House have been dwarfed by the fact that we are united in one single goal: protecting children online.
I have been clear since becoming Secretary of State that protecting children is the very reason that this Bill exists, and the safety of every child up and down the UK has driven this legislation from the start. After years of inaction, we want to hold social media companies to account and make sure that they are keeping their promises to their own users and to parents. No Bill in the world has gone as far as this one to protect children online. Since this legislation was introduced last year, the Government have gone even further and made a number of changes to enhance and broaden the protections in the Bill while also securing legal free speech. If something should be illegal, we should have the courage of our convictions to make it illegal, rather than creating a quasi-legal category. That is why my predecessor’s change that will render epilepsy trolling illegal is so important, and why I was determined to ensure that the promotion of self-harm, cyber-flashing and intimate image abuse are also made illegal once and for all in this Bill.
Will my right hon. Friend make it clear, when the Bill gets to the other place, that content that glamorises eating disorders will be treated as seriously as content glamorising other forms of self-harm?
I met my right hon. Friend today to discuss that very point, which is particularly important and powerful. I look forward to continuing to work with her and the Ministry of Justice as we progress this Bill through the other place.
The changes are balanced with new protections for free speech and journalism—two of the core pillars of our democratic society. There are amendments to the definition of recognised news publishers to ensure that sanctioned outlets such as RT do not benefit.
Since becoming Secretary of State I have made a number of my own changes to the Bill. First and foremost, we have gone even further to boost protections for children. Social media companies will face a new duty on age limits so they can no longer turn a blind eye to the estimated 1.6 million underage children who currently use their sites. The largest platforms will also have to publish summaries of their risk assessments for illegal content and material that is harmful for children—finally putting transparency for parents into law.
I believe it is blindingly obvious and morally right that we should have a higher bar of protection when it comes to children. Things such as cyber-bullying, pornography and posts that depict violence do enormous damage. They scar our children and rob them of their right to a childhood. All these measures are reinforced by giving children and parents a real voice in the legislation, through the inclusion of the Children’s Commissioner as a statutory consultee. The Bill already included provisions to make senior managers liable for failure to comply with information notices, but we have now gone further. Senior managers who deliberately fail children will face criminal liability. Today, we are drawing our line in the sand and declaring that the UK will be the world’s first country to comprehensively protect children online.
Those changes are completely separate from the changes I have made for adults. Many Members and stakeholders had concerns over the “legal but harmful” section of the Bill. They were concerned that it would be a serious threat to legal free speech and would set up a quasi-legal grey area in which tech companies would be encouraged to take down content that is perfectly legal to say on our streets. I shared those concerns, so we have removed “legal but harmful” for adults. We have replaced it with a much simpler, fairer and, crucially, much more effective mechanism that gives adults a triple shield of protection. If it is illegal, it has to go. If it is banned under the company’s terms and conditions, it has to go.
Lastly, social media companies will now offer adults a range of tools to give them more control over what they see and interact with on their own feeds.
My right hon. Friend makes an important point about things that are illegal offline but legal online. The Bill has still not defined a lot of content that could be illegal and yet promoted through advertising. As part of their ongoing work on the Bill and the online advertising review, will the Government establish the general principle that content that is illegal will be regulated whether it is an ad or a post?
I completely agree with my hon. Friend on the importance of this topic. That is exactly why we have the online advertising review, a piece of work we will be progressing to tackle the nub of the problem he identifies. We are protecting free speech while putting adults in the driving seat of their own online experience. The result is today’s Bill.
I thank hon. Members for their hard work on this Bill, including my predecessors, especially my right hon. Friend the Member for Mid Bedfordshire (Ms Dorries). I thank all those I have worked with constructively on amendments, including my hon. Friends the Members for Penistone and Stocksbridge (Miriam Cates), for Stone (Sir William Cash), for Dover (Mrs Elphicke), for Rutland and Melton (Alicia Kearns), and my right hon. Friends the Members for South Holland and The Deepings (Sir John Hayes), for Chelmsford (Vicky Ford), for Basingstoke (Dame Maria Miller) and for Romsey and Southampton North (Caroline Nokes).
I would like to put on record my gratitude for the hard work of my incredibly dedicated officials—in particular, Sarah Connolly, Orla MacRae and Emma Hindley, along with a number of others; I cannot name them all today, but I note their tremendous and relentless work on the Bill. Crucially, I thank the charities and devoted campaigners, such as Ian Russell, who have guided us and pushed the Bill forward in the face of their own tragic loss. Thanks to all those people, we now have a Bill that works.
Legislating online was never going to be easy, but it is necessary. It is necessary if we want to protect our values —the values that we protect in the real world every single day. In fact, the NSPCC called this Bill “a national priority”. The Children’s Commissioner called it
“a once-in-a-lifetime opportunity to protect all children”.
But it is not just children’s organisations that are watching. Every parent across the country will know at first hand just how difficult it is to shield their children from inappropriate material when social media giants consistently put profit above children’s safety. This legislation finally puts it right.
(1 year, 10 months ago)
Lords Chamber
That the Bill be now read a second time.
My Lords, I am very glad to be here to move the Second Reading of the Online Safety Bill. I know that this is a moment which has been long awaited in your Lordships’ House and noble Lords from across the House share the Government’s determination to make the online realm safer.
That is what this Bill seeks to do. As it stands, over three quarters of adults in this country express a concern about going online; similarly, the proportion of parents who feel that the benefits outweigh the risks of their children being online has decreased rather than increased in recent years, falling from two-thirds in 2015 to barely over half in 2019. This is a terrible indictment of a medium through which people of all ages are living increasing proportions of their lives, and it must change.
All of us have heard the horrific stories of children who have been exposed to dangerous and deeply harmful content online, and the tragic consequences of such experiences both for them and their families. I am very grateful to the noble Baroness, Lady Kidron, who arranged for a number of noble Lords, including me, to see some of the material which was pushed relentlessly at Molly Russell whose family have campaigned bravely and tirelessly to ensure that what happened to their daughter cannot happen to other young people. It is with that in mind, at the very outset of our scrutiny of this Bill, that I would like to express my gratitude to all those families who continue to fight for change and a safer, healthier online realm. Their work has been central to the development of this Bill. I am confident that, through it, the Government’s manifesto commitment to make the UK the safest place in the world to be online will be delivered.
This legislation establishes a regulatory regime which has safety at its heart. It is intended to change the mindset of technology companies so that they are forced to consider safety and risk mitigation when they begin to design their products, rather than as an afterthought.
All companies in scope will be required to tackle criminal content and activity online. If it is illegal offline, it is illegal online. All in-scope platforms and search services will need to consider in their risk assessments the likelihood of illegal content or activity taking place on their site and put in place proportionate systems and processes to mitigate those risks. Companies will also have to take proactive measures against priority offences. This means platforms will be required to take proportionate steps to prevent people from encountering such content.
Not only that, but platforms will also need to mitigate the risk of the platform being used to facilitate or commit such an offence. Priority offences include, inter alia: terrorist material, child sexual abuse and exploitation, so-called revenge pornography and material encouraging or assisting suicide. In practice, this means that all in-scope platforms will have to remove this material quickly and will not be allowed to promote it in their algorithms.
Furthermore, for non-priority illegal content, platforms must have effective systems in place for its swift removal once this content has been flagged to them. Gone will be the days of lengthy and arduous complaints processes and platforms feigning ignorance of such content. They can and will be held to account.
As I have previously mentioned, the safety of children is of paramount importance in this Bill. While all users will be protected from illegal material, some types of legal content and activity are not suitable for children and can have a deeply damaging impact on their mental health and their developing sense of the world around them.
All in-scope services which are likely to be accessed by children will therefore be required to assess the risks to children on their service and put in place safety measures to protect child users from harmful and age-inappropriate content. This includes content promoting suicide, self-harm or eating disorders which does not meet a criminal threshold; pornography; and damaging behaviour such as bullying.
The Bill will require providers specifically to consider a number of risk factors as part of their risk assessments. These factors include how functionalities such as algorithms could affect children’s exposure to content harmful to children on their service, as well as children’s use of higher risk features on the service such as livestreaming or private messaging. Providers will need to take robust steps to mitigate and effectively manage any risks identified.
Companies will need to use measures such as age verification to prevent children from accessing content which poses the highest risk of harm to them, such as online pornography. Ofcom will be able to set out its expectations about the use of age assurance solutions, including age verification tools, through guidance. This guidance will also be able to refer to relevant standards. The Bill also now makes it clear that providers may need to use age assurance to identify the age of their users to meet the necessary child safety duties and effectively enforce age restrictions on their service.
The Government will set out in secondary legislation the priority categories of content harmful to children so that all companies are clear on what they need to protect children from. Our intention is to have the regime in place as soon as possible after Royal Assent, while ensuring the necessary preparations are completed effectively and service providers understand clearly what is expected. We are working closely with Ofcom and I will keep noble Lords appraised.
My ministerial colleagues in another place worked hard to strengthen these provisions and made commitments to introduce further provisions in your Lordships’ House. With regard to increased protections for children specifically, the Government will bring forward amendments at Committee stage to name the Children’s Commissioner for England as a statutory consultee for Ofcom when it is preparing a code of practice, ensuring that the experience of children and young people is accounted for during implementation.
We will also bring forward amendments to specify that category 1 companies—the largest and most risky platforms—will be required to publish a summary of their risk assessments for both illegal content and material that is harmful to children. This will increase transparency about illegal and harmful content on in-scope services and ensure that Ofcom can do its job regulating effectively.
We recognise the great suffering experienced by many families linked to children’s exposure to harmful content, and the importance of this Bill in ending it. We must learn from the horrific events of the past to secure a safe future for children online.
We also understand that, unfortunately, people of any age may experience online abuse. For many adults, the internet is a positive source of entertainment and information and a way to connect with others; for some, however, it can be an arena for awful abuse. The Bill will therefore offer adult users a triple shield of protection when online, striking the right balance between protecting the right of adult users to access legal content freely, and empowering adults with the information and tools to manage their own online experience.
First, as I have outlined, all social media firms and search services will need to tackle illegal content and activity on their sites. Secondly, the Bill will require category 1 services to set clear terms of service regarding the user-generated content they prohibit and/or restrict access to, and to enforce those terms of service effectively. All the major social media platforms such as Meta, Twitter and TikTok say that they ban abuse and harassment online. They all say they ban the promotion of violence and violent threats, yet this content is still easily visible on those sites. People sign up to these platforms expecting one environment, and are presented with something completely different. This must stop.
As well as ensuring that platforms have proper systems to remove banned content, the Bill will also put an end to services arbitrarily removing legal content. The largest platforms, the category 1 services, must ensure that they remove or restrict access to content, or ban or suspend users, only where that is expressly allowed in their terms of service, or where they otherwise have a legal obligation to do so.
This Bill will make sure that adults have the information they need to make informed decisions about the sites they visit, and that platforms are held to their promises to users. Ofcom will have the power to hold platforms to their terms of service, creating a safer and more transparent environment for all.
Thirdly, category 1 services will have a duty to provide adults with tools they can use to reduce the likelihood that they encounter certain categories of content, if they so choose, or to alert them to the nature of that content. This includes content which encourages, promotes, or provides instructions for suicide, self-harm or eating disorders. People will also have the ability to filter out content from unverified users if they so wish. This Bill will mean that adult users will be empowered to make more informed choices about what services they use, and to have greater control over whom and what they engage with online.
It is impossible to speak about the aspects of the Bill which protect adults without, of course, mentioning freedom of expression. The Bill needs to strike a careful balance between protecting users online, while maintaining adults’ ability to have robust—even uncomfortable or unpleasant—conversations within the law if they so choose. Freedom of expression within the law is fundamental to our democracy, and it would not be right for the Government to interfere with what legal speech is permitted on private platforms. Instead, we have developed an approach based on choice and transparency for adult users, bounded by major platforms’ clear commercial incentives to provide a positive experience for their users.
Of course, we cannot have robust debate without being accurately informed of the current global and national landscape. That is why the Bill includes particular protections for recognised news publishers, content of democratic importance, and journalistic content. We have been clear that sanctioned news outlets such as RT, formerly Russia Today, must not benefit from these protections. We will therefore bring forward an amendment in your Lordships’ House explicitly to exclude entities subject to sanctions from the definition of a recognised news publisher.
Alongside the safety duties for children and the empowerment tools for adults, platforms must also have effective reporting and redress mechanisms in place. They will need to provide accessible and effective mechanisms for users to report content which is illegal or harmful, or where it breaches terms and conditions. Users will need to be given access to effective mechanisms to complain if content is removed without good reason.
The Bill will place a duty on platforms to ensure that those reporting mechanisms are backed up by timely and appropriate redress mechanisms. Currently, internet users often do not bother to report harmful content they encounter online, because they do not feel that their reports will be followed up. That too must change. If content has been unfairly removed, it should be reinstated. If content should not have been on the site in question, it should be taken down. If a complaint is not upheld, the reasons should be made clear to the person who made the report.
There have been calls—including from the noble Lord, Lord Stevenson of Balmacara, with whom I look forward to working constructively, as we have done heretofore—to use the Bill to create an online safety ombudsman. We will listen to all suggestions put forward to improve the Bill and the regime it ushers in with an open mind, but as he knows from our discussions, of this suggestion we are presently unconvinced. Ombudsman services in other sectors are expensive, often underused and primarily relate to complaints which result in financial compensation. We find it difficult to envisage how an ombudsman service could function in this area, where user complaints are likely to be complex and, in many cases, do not have the impetus of financial compensation behind them. Instead, the Bill ensures that, where providers’ user-reporting and redress mechanisms are not sufficient, Ofcom will have the power to take enforcement action and require the provider to improve its user-redress provisions to meet the standard required of them. I look forward to probing elements of the Bill such as this in Committee.
This regulatory framework could not be effective if Ofcom, as the independent regulator, did not have a robust suite of powers to take enforcement actions against companies which do not comply with their new duties, and if it failed to take the appropriate steps to protect people from harm. I believe the chairman of Ofcom, the noble Lord, Lord Grade of Yarmouth, is in his place. I am glad that he has been and will be following our debates on this important matter.
Through the Bill, Ofcom will have wide-ranging information-gathering powers to request any information from companies which is relevant to its safety functions. Where necessary, it will be able to ask a suitably skilled person to undertake a report on a company’s activity—for example, on its use of algorithms. If Ofcom decides to take enforcement action, it can require companies to take specific steps to come back into compliance.
Ofcom will also have the power to impose substantial fines of up to £18 million, or 10% of annual qualifying worldwide revenue, whichever is higher. For the biggest technology companies, this could easily amount to billions of pounds. These are significant measures, and we have heard directly from companies that are already changing their safety procedures to ensure they comply with these regulations.
If fines are not sufficient, or not deemed appropriate because of the severity of the breach, Ofcom will be able to apply for a court order allowing it to undertake business disruption measures. This could be blocking access to a website or preventing it making money via payment or advertising services. Of course, Ofcom will be able to take enforcement action against any company that provides services to people in the UK, wherever that company is located. This is important, given the global nature of the internet.
As the Bill stands, individual senior managers can be held criminally liable and face a fine for failing to ensure their platform complies with Ofcom’s information notice. Further, individual senior managers can face jail, a fine or both for failing to prevent the platform committing the offences of providing false information, encrypting information or destroying information in response to an information notice.
The Government have also listened to and acknowledged the need for senior managers to be made personally liable for a wider range of failures of compliance. We have therefore committed to tabling an amendment in your Lordships’ House which will be carefully designed to capture instances where senior managers have consented to or connived in ignoring enforceable requirements, risking serious harm to children. We are carefully designing this amendment to ensure that it can hold senior managers to account for their actions regarding the safety of children, without jeopardising the UK’s attractiveness as a place for technology companies to invest in and grow. We intend to base our offence on similar legislation recently passed in the Republic of Ireland, as well as looking carefully at relevant precedent in other sectors in the United Kingdom.
I have discussed the safety of children and adults, and everyone’s right to free speech. It is not possible to talk about this Bill without also discussing its protections for women and girls, who we know are disproportionately affected by online abuse. As I mentioned, all services in scope will need to seek out and remove priority illegal content proactively. There are a number of offences which disproportionately affect women and girls, such as revenge pornography and cyberstalking, which the Bill requires companies to tackle as a priority.
To strengthen protections for women in particular, we will be listing controlling or coercive behaviour as a priority offence. Companies will have to take proactive measures to tackle this type of illegal content. We will also bring forward an amendment to name the Victims’ Commissioner and the domestic abuse commissioner as statutory consultees for the codes of practice. This means there will be a requirement for Ofcom to consult both commissioners ahead of drafting and amending the codes of practice, ensuring that victims, particularly victims and survivors of domestic abuse, are better protected. The Secretary of State and our colleagues have been clear that women’s and girls’ voices must be heard clearly in developing this legislation.
I also want to take this opportunity to acknowledge the concerns voiced over the powers for the Secretary of State regarding direction in relation to codes of practice that currently appear in the Bill. That is a matter on which my honourable friend Paul Scully and I were pressed by your Lordships’ Communications and Digital Committee when we appeared before it last week. As we explained then, we remain committed to ensuring that Ofcom maintains its regulatory independence, which is vital to the success of this framework. As we are introducing ground-breaking regulation, our aim is to balance the need for the regulator’s independence with appropriate oversight by Parliament and the elected Government.
We intend to bring forward two changes to the existing power: first, replacing the “public policy” wording with a defined list of reasons that a direction can be made; and secondly, making it clear that this element of the power can only be used in exceptional circumstances. I would like to reassure noble Lords—as I sought to reassure the Select Committee—that the framework ensures that Parliament will always have the final say on codes of practice, and that strong safeguards are in place to ensure that the use of this power is transparent and proportionate.
Before we begin our scrutiny in earnest, it is also necessary to recognise that this Bill is not just establishing a regulatory framework. It also updates the criminal law concerning communication offences. I want to thank the Law Commission for its important work in helping to strengthen criminal law for victims. The inclusion of the new offences for false and threatening communications offers further necessary protections for those who need them most. In addition, the Bill includes new offences to criminalise cyberflashing and epilepsy trolling. We firmly believe that these new offences will make a substantive difference to the victims of such behaviour. The Government have also committed to adding an additional offence to address the encouragement or assistance of self-harm communications and offences addressing intimate image abuse online, including deepfake pornography. Once these offences are introduced, all companies will need to treat this content as illegal under the framework and take action to prevent users from encountering it. These new offences will apply in respect of all victims of such activity, children as well as adults.
This Bill has been years in the making. I am proud to be standing here today as the debate begins in your Lordships’ House. I realise that noble Lords have been waiting long and patiently for this moment, but I know that they also appreciate that considerable work has already been done to ensure that this Bill is proportionate and fair, and that it provides the change that is needed.
A key part of that work was conducted by the Joint Committee, which conducted pre-legislative scrutiny of the Bill, drawing on expertise from across both Houses of Parliament, from all parties and none. I am very glad that all the Members of your Lordships’ House who served on that committee are speaking in today’s debate: the noble Baroness, Lady Kidron; the noble Lords, Lord Stevenson of Balmacara and Lord Knight of Weymouth, who have very helpfully been called to service on the Opposition Front Bench; the noble Lord, Lord Clement-Jones, who speaks for the Liberal Democrats; as well as my noble friends Lord Black of Brentwood and Lord Gilbert of Panteg.
While I look forward to the contributions of all Members of your Lordships’ House, and will continue the open-minded, collaborative approach established by my right honourable friend the Secretary of State and her predecessors—listening to all ideas which are advanced to make this Bill as effective as it can be—I urge noble Lords who are not yet so well versed in its many clauses and provisions, or who might be disinclined to accept at first utterance the points I make from this Dispatch Box, to consult those noble Lords before bringing forward their amendments at later stages of the Bill. I say that not to discourage noble Lords from doing so, but in the spirit of ensuring that the amendments they do bring forward, and our deliberations on them, are pithy, focused and conducive to making this Bill law as swiftly as possible. In that spirit, I shall draw my already too lengthy remarks to a close. I beg to move.
My Lords, I declare my interest, as set out in the register, as a member of the advisory council of the Free Speech Union.
This is an important Bill. It has taken time to get to us, and rightly so. Many important requirements have to be balanced in it—the removal of illegal material, and the protection of children, as we have heard so movingly already today. But, as legislators, we must also have an eye on all elements of public policy. We cannot eliminate every evil entirely, except at unacceptable cost to other objectives and, notably, to free speech.
The Bill, as it was developing last summer, was damaging in many ways to that objective. At times I was quite critical of it, so I welcome the efforts that have been made by the new broom and new team at DCMS to put it in a better place. It is not perfect, but it is considerably better and less damaging to the free speech objective. In particular, I welcome the removal of the so-called legal but harmful provisions, their replacement with a duty to empower users and the decision to list out the areas that this provision applies to, rather than leaving it to secondary legislation. I also welcome the strengthening of provisions to protect the right to free speech and democratic debate more broadly, although I will come on to a couple of concerns, and the dropping of the new harmful communications offence in the original Bill. It is clear, from what we have heard so far today, that there will be proposals to move backwards—as I would see it—to the original version of the Bill. I hope that the Government will be robust on that, having taken the position that they have.
Although the Bill is less damaging, it must still be fit for purpose. With 25,000 companies in its scope, it also affects virtually every individual in the country, so it is important that it is clear and usable and does not encourage companies to be too risk averse. With that in mind, there are areas for improvement. Given the time constraints, I will focus on free speech.
I believe that in a free society, adults—not children but adults—should be able to cope with free debate, if they are given the tools to do so. Noble Lords have spoken already about the abuse that they get online, and we all do. I am sure I am not unique in that; some of it drifts into the real world as well, from time to time. However, I do not look to the Government to defend me from it. I already have most of the tools to turn that off when I want to, which I think is the right approach. It is the one that the Government are pursuing. Free speech is the best way of dealing with controversial issues, as we have seen in the last few weeks, and it is right for the Government to err on the side of caution and not allow a chilling effect in practice.
With this in mind, there are a couple of improvements that I hope the Government might consider. For example, they could require an opt-out from seeing the relevant “legal but harmful” content, rather than an opt-in to see it, and ensure those tools are easy to use. There is otherwise a risk that risk-averse providers will block controversial content and people will not even know about it. It could be useful to require providers to say how they intend to protect freedom of speech, just as they are required to say explicitly how they will manage the Clause 12 provisions. Without that, there is some risk that freedom of speech may become a secondary objective.
To repeat, there has been considerable improvement overall. I welcome my noble friend the Minister’s commitment to listen carefully to all proposals as we take the Bill through in this House. I am happy to support him in enabling the passage of this legislation in good order soon.
My Lords, I am grateful to the very many noble Lords who have spoken this afternoon and this evening. They have spoken with passion—we heard that in the voices of so many—about their own experiences, the experiences of their families and the experiences of far too many of our fellow subjects, whose harrowing experiences demonstrate the need for this Bill. But noble Lords have also spoken with cool-headed precision and forensic care about the aspects of the Bill that demand our careful scrutiny. Both hearts and heads are needed to make this Bill worth the wait.
I am very grateful for the strong consensus that has come through in noble Lords’ speeches on the need to make this Bill law and to do so quickly, and therefore to do our work of scrutiny diligently and speedily. I am grateful for the very generous and public-spirited offer the noble Lord, Lord Stevenson, has just issued. I, too, would like to make this not a party-political matter; it is not and has not been in the speeches we have heard today. The work of your Lordships’ House is to consider these matters in detail and without party politics intruding, and it would be very good if we could proceed on the basis of collaboration, co-operation and, on occasion, compromise.
In that spirit, I should say at the outset that I share the challenge faced by the noble Lords, Lord Clement-Jones and Lord Stevenson. Given that so many speakers have chosen to contribute, I will not be able to cover or acknowledge everyone who has spoken. I shall undoubtedly have to write on many of the issues to provide the technical detail that the matters they have raised deserve. It is my intention to write to noble Lords and invite them to join a series of meetings to look in depth at some of the themes and areas between now and Committee, so that as a group we can have well-informed discussions in Committee. I shall write with details suggesting some of those themes; if noble Lords feel that I have missed any, or that there are particular areas they would like to continue to talk about, please let me know and I will be happy to facilitate those discussions.
I want to touch on a few of the issues raised today. I shall not repeat some of the points I made in my opening speech, given the hour. Many noble Lords raised the very troubling issue of children accessing pornography online, and I want to talk about that initially. The Government share the concerns raised about the lack of protections for children from this harmful and deeply unsuitable content. That is why the Bill introduces world-leading protections for children from online pornography. The Bill will cover all online sites offering pornography, including commercial pornography sites, social media, video-sharing platforms and fora, as well as search engines, which play a significant role in enabling children to access harmful and age-inappropriate content online. These companies will have to prevent children accessing pornography or face huge fines. To ensure that children are protected from this content, companies will need to put in place measures such as age verification, or demonstrate that the approach they are taking delivers the same level of protection for children.
While the Bill does not mandate that companies use specific technologies to comply with these new duties, in order to ensure that the Bill is properly future-proofed, we expect Ofcom to take a robust approach to sites which pose the highest risk of harm to children, including sites hosting online pornography. That may include directing the use of age verification technologies. Age verification is also referred to in the Bill. This is to make clear that these are measures that the Government expect to be used for complying with the duties under Part 3 and Part 5 to protect children from online pornography. Our intention is to have the regime operational as soon as possible after Royal Assent, while ensuring that the necessary preparations are completed effectively and that service providers understand what is expected of them. We are working very closely with Ofcom to ensure this.
The noble Lord, Lord Morrow, and others asked about putting age verification in the Bill more clearly, as was the case with the Digital Economy Act. The Online Safety Bill includes references to age assurance and age verification in the way I have just set out. That is to make clear that these are measures which the Government expect to be used for complying with the duties where proportionate to do so. While age assurance and age verification are referred to in the Bill, the Government do not mandate the use of specific approaches or technologies. That is similar to the approach taken in the Digital Economy Act, which did not mandate the use of a particular technology either.
I think my noble friend Lord Bethell prefers the definition of pornography in Part 3 of the Digital Economy Act. There is already a robust definition of “pornographic content” in this Bill which is more straightforward for providers and Ofcom to apply. That is important. The definition we have used is similar to the definition of pornographic content used in existing legislation such as the Coroners and Justice Act 2009. It is also in line with the approach being taken by Ofcom to regulate UK-established video-sharing platforms, meaning that the industry will already have familiarity with this definition and that Ofcom will already have experience in regulating content which meets this definition. That means it can take action more swiftly. However, I have heard the very large number of noble Lords who are inclined to support the work that my noble friend is doing in the amendments he has proposed. I am grateful for the time he has already dedicated to conversations with the Secretary of State and me on this and look forward to discussing it in more detail with him between now and Committee.
A number of noble Lords, including the noble Baronesses, Lady Finlay of Llandaff and Lady Kennedy of The Shaws, talked about algorithms. All platforms will need to undertake risk assessments for illegal content. Services likely to be accessed by children will need to undertake a children’s risk assessment to ensure they understand the risks associated with their services. That includes taking into account in particular the risks presented by algorithms used by their service. In addition, the Bill includes powers to ensure that Ofcom is able effectively to assess whether companies are fulfilling their regulatory requirements, including in relation to the operating of their algorithms. Ofcom will have the power to require information from companies about the operation of their algorithms and the power to investigate non-compliance, as well as the power to interview employees. It will have the power to require regulated service providers to undergo a skilled person’s report and to audit company systems and processes, including in relation to their algorithms.
The noble Baroness, Lady Kidron, rightly received many tributes for her years of work in relation to so many aspects of this Bill. She pressed me on bereaved parents’ access to data and, as she knows, it is a complex issue. I am very grateful to her for the time she has given to the meetings that the Secretary of State and I have had with her and with colleagues from the Ministry of Justice on this issue, which we continue to look at very carefully. We acknowledge the distress that some parents have indeed experienced in situations such as this and we will continue to work with her and the Ministry of Justice very carefully to assess this matter, mindful of its complexities which, of course, were something the Joint Committee grappled with as well.
The noble Baroness, Lady Featherstone, my noble friend Lady Wyld and others focused on the new cyberflashing offence and suggested that a consent-based approach would be preferable. The Law Commission looked at that in drawing up its proposals for action in this area. The Law Commission’s report raised concerns about the nature of consent in instant messaging conversations, particularly where there are misjudged attempts at humour or intimacy that could particularly affect young people. There is a risk, which we will want to explore in Committee, of overcriminalising young people. That is why the Government have brought forward proposals based on the Law Commission’s work. If noble Lords are finding it difficult to see the Law Commission’s reports, I am very happy to draw them to their attention so that they can benefit from the consultation and thought it conducted on this difficult issue.
The noble Baroness, Lady Gohir, talked about the impact on body image of edited images in advertising. Through its work on the online advertising programme, DCMS is considering how the Government should approach advertisements that contribute to body image concerns. A consultation on this programme closed in June 2022. We are currently analysing the responses to the consultation and developing policy. Where there is harmful user-generated content related to body image that risks having an adverse physical or psychological impact on children, the Online Safety Bill will require platforms to take action against that. Under the Bill’s existing risk assessment duties, regulated services are required to consider how media literacy can be used to mitigate harm for child users. That could include using content provenance technology, which can empower people to identify when content has been digitally altered in ways such as those the noble Baroness mentioned.
A number of noble Lords focused on the changes made in relation to the so-called “legal but harmful” measures to ensure that adults have the tools they need to curate and control their experience online. In particular, noble Lords suggested that removing the requirement for companies to conduct risk assessments in relation to a list of priority content harmful to adults would reduce the protections available for users. I do not agree with that assessment. The new duties will empower adult users to make informed choices about the services they use and to protect themselves on the largest platforms. The new duties will require the largest platforms to enforce all their terms of service regarding the moderation of user-generated content, not just the categories of content covered in a list in secondary legislation. The largest platforms already prohibit the most abusive and harmful content. Under the new duties, platforms will be required to keep their promises to users and take action to remove such content.
There was rightly particular focus on vulnerable adult users. The noble Baronesses, Lady Hollins and Lady Campbell of Surbiton, and others spoke powerfully about that. The Bill will give vulnerable adult users, including people with disabilities, greater control over their online experience too. When using a category 1 service, they will be able to reduce their exposure to online abuse and hatred by having tools to limit the likelihood of their encountering such content or to alert them to the nature of it. They will also have greater control over content that promotes, encourages or provides instructions for suicide, self-harm and eating disorders. User reporting and redress provisions must be easy to access by all users, including people with a disability and adults with caring responsibilities who are providing assistance. Ofcom is of course subject to the public sector equality duty as well, so when performing its duties, including writing its codes of practice, it will need to take into account the ways in which people with protected characteristics, including people with disabilities, can be affected. I would be very happy to meet the noble Baronesses and others on this important matter.
The noble Lords, Lord Hastings of Scarisbrick and Lord Londesborough, and others talked about media literacy. The Government fully recognise the importance of that in achieving online safety. As well as ensuring that companies take action to keep users safe through this Bill, we are taking steps to educate and empower them to make safe and informed choices online. First, the Bill strengthens Ofcom’s existing media literacy functions. Media literacy is included in Ofcom’s new transparency reporting and information-gathering powers. In response to recommendations from the Joint Committee, the legislation also now specifies media literacy in the risk-assessment duties. In July 2021, DCMS published the online media literacy strategy, which sets out our ambition to improve national media literacy. We have committed to publishing annual action plans in each financial year until 2024-25, setting out our plans to deliver that. Furthermore, in December of that year, Ofcom published its Approach to Online Media Literacy, which includes an ambitious range of work focusing on media literacy.
Your Lordships’ House is, understandably, not generally enthusiastic about secondary legislation and secondary legislative powers, so I was grateful for the recognition by many tonight of the importance of providing for them in certain specific instances through this Bill. As the noble Lord, Lord Brooke of Alverthorpe, put it, there may be loopholes that Parliament wishes to close, and quickly. My noble friend Lord Inglewood spoke of the need for “living legislation”, and it is important to stress, as many have, that this Bill seeks to be technology-neutral—not specifying particular technological approaches that may quickly become obsolete—in order to cater for new threats and challenges as yet not envisaged. Some of those threats and challenges were alluded to in the powerful speech of my noble friend Lord Sarfraz. I know noble Lords will scrutinise those secondary powers carefully. I can tell my noble friend that the Bill does apply to companies that enable users to share content online or interact with each other, as well as to search services. That includes a broad range of services, including the metaverse. Where haptics enable user interaction, companies must take action. The Bill is also clear that content generated by bots is in scope where it interacts with user-generated content such as on Twitter, but not if the bot is controlled by or on behalf of the service, such as providing customer services for a particular site.
Given the range of secondary powers and the changing technological landscape, a number of noble Lords understandably focused on the need for post-legislative scrutiny. The Bill has undoubtedly benefited from pre-legislative scrutiny. As I said to my noble friend Lady Stowell of Beeston in her committee last week, we remain open-minded on the best way of doing that. We must ensure that once this regime is in force, it has the impact we all want it to have. Ongoing parliamentary scrutiny will be vital in ensuring that is the case. We do not intend to legislate for a new committee, not least because it is for Parliament itself to decide what committees it sets up. But I welcome further views on how we ensure that we have effective parliamentary scrutiny, and I look forward to discussing that in Committee. We have also made it very clear that the Secretary of State will undertake a review of the effectiveness of the regime between two and five years after it comes into force, producing a report that will then be laid in Parliament, thus providing a statutory opportunity for Parliament to scrutinise the effectiveness of the legislation.
My noble friend and other members of her committee followed up with a letter to me about the Secretary of State’s powers. I shall reply to that letter in detail and make that reply available to all noble Lords ahead of Committee. This is ground-breaking legislation, and we have to balance the need for regulatory independence with appropriate oversight by Parliament and the Government. In particular, concerns were raised about the Secretary of State’s power of direction in Clause 39. Ofcom’s independence and expertise will be of utmost importance here, but the very broad nature of online harms means that there may be subjects that go beyond its expertise and remit as a regulator. That was echoed by Ofcom itself when giving evidence to the Joint Committee: it noted that there will clearly be some issues in respect of which the Government have access to expertise and information that the regulator does not, such as national security.
The framework in the Bill ensures that Parliament will always have the final say on codes of practice, and the use of the affirmative procedure will further ensure that there is an increased level of scrutiny in the exceptional cases where that element of the power is used. As I said, I know that we will look at that in detail in Committee.
My noble friend Lord Black of Brentwood, quoting Stanley Baldwin, talked about the protections for journalistic content. He and others are right that the free press is a cornerstone of British democracy; that is why the Bill has been designed to protect press and media freedom and why it includes robust provisions to ensure that people can continue to access diverse news sources online. Category 1 companies will have a new duty to safeguard all journalistic content shared on their platform, which includes citizen journalism. Platforms will need to put systems and processes in place to protect journalistic content, and they must enforce their terms of service consistently across all moderation and in relation to journalistic content. They will also need to put in place expedited appeals processes for producers of journalistic content.
The noble Baroness, Lady Anderson of Stoke-on-Trent, spoke powerfully about the appalling abuse and threats of violence she has sustained in the course of her democratic duties, and the noble Baroness, Lady Foster, spoke of the way in which such abuse is putting people, particularly women, off going into public life. The noble Baroness, Lady Anderson, asked about a specific issue: the automatic deletion of material and the implications for prosecution. We have been mindful of the scenario where malicious users post threatening content which they then delete themselves, and of the burden on services that retaining that information in bulk would cause. We have also been mindful of the imperative to ensure that illegal content cannot be shared and amplified online by being left there. The retention of data for law enforcement purposes is strictly regulated, particularly through the Investigatory Powers Act, which the noble Lord, Lord Anderson of Ipswich, is reviewing at the request of the Home Secretary. I suggest that the noble Baroness and I meet to speak about that in detail, mindful of that ongoing review and the need to bring people to justice.
The noble Baroness, Lady Chakrabarti, asked about sex for rent. Existing offences can be used to prosecute that practice, including Sections 52 and 53 of the Sexual Offences Act 2003, both of which are listed as priority offences in Schedule 7 to the Bill. As a result, all in-scope services must take proactive measures to prevent people being exposed to such content.
The noble Lord, Lord Davies of Brixton, and others talked about scams. The largest and most popular platforms and search engines—category 1 and category 2A services in the Bill—will have a duty to prevent paid-for fraudulent adverts appearing on their services, making it harder for fraudsters to advertise scams online. We know that that can be a particularly devastating crime. The online advertising programme builds on this duty in the Bill and will look at the role of the whole advertising system in relation to fraud, as well as the full gamut of other harms which are caused.
My noble friend Lady Fraser talked about the devolution aspects, which we will certainly look at. Internet services are a reserved matter for the UK Government. The list of priority offences in Schedule 7 can be updated only by the Secretary of State, subject to approval by this Parliament.
The right reverend Prelate the Bishop of Manchester asked about regulatory co-operation, and we recognise the importance of that. Ofcom has existing and strong relationships with other regulators, such as the ICO and the CMA, which have been supported and strengthened by the establishment of the Digital Regulation Cooperation Forum in 2020. We have used the Bill to strengthen Ofcom’s ability to work closely with, and to disclose information to, other regulatory bodies. Clause 104 ensures that Ofcom can do that, and the Bill also requires Ofcom to consult the Information Commissioner.
I do not want to go on at undue length—I am mindful of the fact that we will have detailed debates on all these issues and many more in Committee—but I wish to conclude by reiterating my thanks to all noble Lords, including the many who were not able to speak today but to whom I have already spoken outside the Chamber. They all continue to engage constructively with this legislation to ensure that it meets our shared objectives of protecting children and giving people a safe experience online. I look forward to working with noble Lords in that continued spirit.
My noble friend Lady Morgan of Cotes admitted to being one of the cavalcade of Secretaries of State who have worked on this Bill; I pay tribute to her work both in and out of office. I am pleased that my right honourable friend the Secretary of State was here to observe part of our debate today and, like all noble Lords, I am humbled that Ian Russell has been here to follow our debate in its entirety. The experience of his family and too many others must remain uppermost in our minds as we carry out our duty on the Bill before us; I know that it will be. We have an important task before us, and I look forward to getting to it.
(1 year, 7 months ago)
Lords Chamber
My Lords, let me start by saying how pleased I, too, am that we are now in Committee. I thank all noble Lords for giving up their time to attend the technical briefings that officials in my department and I have held since Second Reading and for the collaborative and constructive nature of their contributions in those discussions.
In particular, not least because today is his birthday, I pay tribute to the noble Lord, Lord Stevenson of Balmacara, for his tireless work on the Bill—from his involvement in its pre-legislative scrutiny to his recall to the Front Bench in order to see the job through. We are grateful for his diligence and, if I may say so, the constructive and collaborative way in which he has gone about it. He was right to pay tribute both to my noble friend Lord Gilbert of Panteg, who chaired the Joint Committee, and to the committee’s other members, including all the other signatories to this amendment. The Bill is a better one for their work, and I repeat my thanks to them for it. In that spirit, I am grateful to the noble Lord for bringing forward this philosophical opening amendment. As noble Lords have said, it is a helpful place for us to start and refocus our thoughts as we begin our line-by-line scrutiny of this Bill.
Although I agree with the noble Lord’s broad description of his amendment’s objectives, I am happy to respond to the challenge that lies behind it and put the objectives of this important legislation clearly on the record at the outset of our scrutiny. The Online Safety Bill seeks to bring about a significant change in online safety. The main purposes of the Bill are: to give the highest levels of protection to children; to protect users of all ages from being exposed to illegal content; to ensure that companies’ approach focuses on proactive risk management and safety by design; to protect people who face disproportionate harm online, including, for instance, because of their sex or their ethnicity or because they are disabled; to maintain robust protections for freedom of expression and privacy; and to ensure that services are transparent and accountable.
The Bill will require companies to take stringent measures to tackle illegal content and protect children, with the highest protections in the Bill devoted to protecting children; as the noble Baroness, Lady Benjamin, my noble friend Lord Cormack and others have again reminded us today, that is paramount. Children’s safety is prioritised throughout this Bill. Not only will children be protected from illegal content through its illegal content duties but its child safety duties add an additional layer of protection so that children are protected from harmful or inappropriate content such as grooming, pornography and bullying. I look forward to contributions from the noble Baroness, Lady Kidron, and others who will, I know, make sure that our debates are properly focused on that.
Through their duties of care, all platforms will be required proactively to identify and manage risk factors associated with their services in order to ensure both that users do not encounter illegal content and that children are protected from harmful content. To achieve this, they will need to design their services to reduce the risk of harmful content or activity occurring and take swift action if it does.
Regulated services will need to prioritise responding to online content and activity that present the highest risk of harm to users, including where this is linked to something classified as a protected characteristic under the terms of the Equality Act 2010. This will ensure that platforms protect users who are disproportionately affected by online abuse—for example, women and girls. When undertaking child safety and illegal content risk assessments, providers must consider whether certain people face a greater risk of harm online and ensure that those risks are addressed and mitigated.
The Bill will place duties relating to freedom of expression and privacy on both Ofcom and all in-scope companies. Those companies will have to consider and implement safeguards for freedom of expression when fulfilling their duties. Ofcom will need to carry out its new duties in a way that protects freedom of expression. The largest services will also have specific duties to protect democratic and journalistic content.
Ensuring that services are transparent about the risks on their services and the actions they are taking to address them is integral to this Bill. User-to-user services must set out in their terms of service how they are complying with their illegal content and child safety duties. Search services must do the same in public statements. In addition, government amendments that we tabled yesterday will require the biggest platforms to publish summaries of their illegal content and child safety risk assessments, increasing transparency and accountability, and Ofcom will have a power to require information from companies to assess providers’ compliance with their duties.
Finally, the Bill will also increase transparency and accountability relating to platforms with the greatest influence over public discourse. They will be required to ensure that their terms of service are clear and properly enforced. Users will be able to hold platforms accountable if they fail to enforce those terms.
The noble Baroness, Lady Kidron, asked me to say which of the proposed new paragraphs (a) to (g), to be inserted by Amendment 1, are not the objectives of this Bill. Paragraph (a) sets out that the Bill must ensure that services
“do not endanger public health or national security”.
The Bill will certainly have a positive impact on national security, and a core objective of the Bill is to ensure that platforms are not used to facilitate terrorism. Ofcom will issue a stand-alone code on terrorism, setting out how companies can reduce the risk of their services being used to facilitate terrorist offences, and remove such content swiftly if it appears. Companies will also need to tackle the new foreign interference offence as a priority offence. This will ensure that the Bill captures state-sponsored disinformation, which is of most concern—that is, attempts by foreign state actors to manipulate information to interfere in our society and undermine our democratic, political and legal processes.
The Bill will also have a positive impact on public health, but I must respectfully say that that is not a primary objective of the legislation. In circumstances where there is a significant threat to public health, the Bill already provides powers for the Secretary of State both to require Ofcom to prioritise specified objectives when carrying out its media literacy activity and to require companies to report on the action they are taking to address the threat. Although the Bill may lead to additional improvements—I am sure that we all want to see them—for instance, by increasing transparency about platforms’ terms of service relating to public health issues, making this a primary objective on a par with the others mentioned in the noble Lord’s amendment risks making the Bill much broader and more unmanageable. It is also extremely challenging to prohibit such content, where it is viewed by adults, without inadvertently capturing useful health advice or legitimate debate and undermining the fundamental objective of protecting freedom of expression online—a point to which I am sure we will return.
The noble Lord’s amendment therefore reiterates many objectives that are interwoven throughout the legislation. I am happy to say again on the record that I agree with the general aims it proposes, but I must say that accepting it would be more difficult than the noble Lord and others who have spoken to it have set out. Accepting this amendment, or one like it, would create legal uncertainty. I have discussed with the officials sitting in the Box—the noble Baroness, Lady Chakrabarti, rightly paid tribute to them—the ways in which such a purposive statement, as the noble Lord suggests, could be made; we discussed it between Second Reading and now.
I appreciate the care and thought with which the noble Lord has gone about this—mindful of international good practice in legislation and through discussion with the Public Bill Office and others, to whom he rightly paid tribute—but any deviation from the substantive provisions of the Bill and the injection of new terminology risk creating uncertainty about the proper interpretation and application of those provisions. We have heard that again today; for example, the noble Baroness, Lady Fox, said that she was not clear what the meaning of certain words may be while my noble friend Lady Stowell made a plea for simplicity in legislation. The noble Lord, Lord Griffiths, also gave an eloquent exposition of the lexicographical befuddlement that can ensue when new words are added. All pointed to some confusion; indeed, there have been areas of disagreement even in what I am sure the noble Lord, Lord Stevenson, thinks was a very consensual summary of the purposes of the Bill.
That legal uncertainty could provide the basis for an increased number of judicial reviews or challenges to the decisions taken under the Bill and its framework, creating significant obstacles to the swift and effective implementation of the new regulatory framework, which I know is not something that he or other noble Lords would want. As noble Lords have noted, this is a complicated Bill, but adding further statements and new terminology to it, for however laudable a reason, risks adding to that complication, which can only benefit those with, as the noble Baroness, Lady Kidron, put it, the deepest pockets.
However, lest he think that I and the Government have not listened to his pleas or those of the Joint Committee, I highlight, as my noble friend Lady Stowell did, that the Joint Committee’s original recommendation was that these objectives
“should be for Ofcom”.
The Government took that up in Schedule 4 to the Bill, and in Clause 82(4), which set out objectives for the codes and for Ofcom respectively. At Clause 82(4) the noble Lord will see the reference to
“the risk of harm to citizens presented by content on regulated services”
and
“the need for a higher level of protection for children than for adults”.
I agree with the noble Baroness, Lady Chakrabarti, that it is not impossible to add purposive statements to Bills and nor is it unprecedented. I echo her tribute to the officials and lawyers in government who have worked on this Bill and given considerable thought to it. She has had the benefit of sharing their experience and the difficulties of writing tightly worded legislation. In different moments of her career, she has also had the benefit of picking at the loose threads in legislation and poking at the holes in it. That is the purpose of lawyers who question the thoroughness with which we have all done our work. I will not call them “pesky lawyers”, as she did—but I did hear her say it. I understand the point that she was making in anticipation but reassure her that she has not pre-empted the points that I was going to make.
To the layperson, legislation is difficult to understand, which is why we publish Explanatory Notes, which the noble Baroness and others may have had experience of working on before. I encourage noble Lords, not just today but as we go through our deliberations, to consult those as well. I hope that noble Lords will agree that they are more easily understood, but if they do not do what they say and provide explanation, I will be very willing to listen to noble Lords’ thoughts on it.
So, while I am not going to give the noble Lord, Lord Stevenson, the birthday present of accepting his amendment, I hope that the clear statement that I gave at the outset from this Dispatch Box, which is purposive as well, about the objectives of the Bill, and my outline of how it tries to achieve them, is a sufficient public statement of our intent, and that it achieves what I hope he was intending to get on the record today. I invite him to withdraw his amendment.
Well, my Lords, it has been a very good debate, and we should be grateful for that. In some senses, I should bank that; we have got ourselves off to a good start for the subsequent debates and discussions that we will have on the nearly 310 amendments that we must get through before the end of the process that we have set out on.
However, let us pause for a second. I very much appreciated the response, not least because it was very sharp and very focused on the amendment. It would have been tempting to go wider and wider, and I am sure that the Minister had that in mind at some point, but he has not done that. The first substantial point that he made seemed to be a one-pager about what this Bill is about. Suitably edited and brought down to manageable size, it would fit quite well into the Bill. I am therefore a bit puzzled as to why he cannot make the jump, intellectually or otherwise, from having that written for him and presumably working on it late at night with candles so that it was perfect—because it was pretty good; I will read it very carefully in Hansard, but it seemed to say everything that I wanted to say and covered most of the points that everybody else thought of to say, in a way that would provide clarity for those seeking it.
The issue we are left with was touched on by the noble Baroness, Lady Stowell, in her very perceptive remarks. Have we got this pointing in the right direction? We should think about it as a way for the Government to move on from the slightly ridiculous shorthand of ‘the safest place to be online’ to a statement to themselves about what they are trying to do, rather than an instruction to Ofcom—because that is where it gets difficult and causes problems with the later stages. This is really Parliament and the Government agreeing to say this, in print, rather than just through reading Hansard. That then reaches back to where my noble friend Lady Chakrabarti is, and it helps the noble Baroness, Lady Harding, with her very good point that this will not work if people do not even bother to get through the first page.
(1 year, 7 months ago)
Lords Chamber
My Lords, I share noble Lords’ determination to deliver the strongest protections for children and to develop a robust and future-proofed regulatory regime. However, it will not be possible to solve every problem on the internet through this Bill, nor through any piece of legislation, flagship or otherwise. The Bill has been designed to confer duties on the services that pose the greatest risk of harm—user-to-user services and search services—and where there are proportionate measures that companies can take to protect their users.
As the noble Baroness, Lady Kidron, and others anticipated, I must say that these services act as a gateway for users to discover and access other online content through search results and links shared on social media. Conferring duties on these services will therefore significantly reduce the risk of users going on to access illegal or harmful content on non-regulated services, while keeping the scope of the Bill manageable and enforceable.
As noble Lords anticipated, there is also a practical consideration for Ofcom in all this. I know that many noble Lords are extremely keen to see this Bill implemented as swiftly as possible; so am I. However, as the noble Lord, Lord Allan, rightly pointed out, making major changes to the Bill’s scope at this stage would have significant implications for Ofcom’s implementation timelines. I say this at the outset because I want to make sure that noble Lords are aware of those implications as we look at these issues.
I turn first to Amendments 2, 3, 5, 92 and 193, tabled by the noble Baroness, Lady Kidron. These aim to expand the number of services covered by the Bill to incorporate a broader range of services accessed by children and a broader range of harms. I will cover the broader range of harms more fully in a separate debate when we come to Amendment 93, but I am very grateful to the noble Baroness for her constructive and detailed discussions on these issues over the past few weeks and months.
These amendments would bring new services into scope of the duties beyond user-to-user and search services. This could include services which enable or promote commercial harms, including consumer businesses such as online retailers. As I have just mentioned in relation to the previous amendments, bringing many more services into scope would delay the implementation of Ofcom’s priorities and risk detracting from its work overseeing existing regulated services where the greatest risk of harm exists—we are talking here about the services run by about 2.5 million businesses in the UK alone. I hope noble Lords will appreciate from the recent communications from Ofcom how challenging the implementation timelines already are, without adding further complication.
Amendment 92 seeks to change the child-user condition in the children’s access assessment to the test in the age-appropriate design code. The test in the Bill is already aligned with the test in that code, which determines whether a service is likely to be accessed by children, in order to ensure consistency for providers. The current child-user condition determines that a service is likely to be accessed by children where it has a significant number or proportion of child users, or where it is of a kind likely to attract a significant number or proportion of child users. This will already bring into scope services of the kind set out in this amendment, such as those which are designed or intended for use by children, or where children form a—
I am sorry to interrupt. Will the Minister take the opportunity to say what “significant” means, because that is not aligned with the ICO code, which has different criteria?
If I can finish my point, this will bring into scope services of the kind set out in the amendments, such as those designed or intended for use by children, or where children form a substantial and identifiable user group. The current condition also considers the nature and content of the service and whether it has a particular appeal for children. Ofcom will be required to consult the Information Commissioner’s Office on its guidance to providers on fulfilling this test, which will further support alignment between the Bill and the age-appropriate design code.
On the meaning of “significant”, a significant number of children means a significant number in itself or a significant proportion of the total number of UK-based users on the service. In the Bill, “significant” has its ordinary meaning, and there are many precedents for it in legislation. Ofcom will be required to produce and publish guidance for providers on how to make the children’s access assessment. Crucially, the test in the Bill provides more legal certainty and clarity for providers than the test outlined in the code. “Substantial” and “identifiable”, as suggested in this amendment, do not have such a clear legal meaning, so this amendment would give rise to the risk that the condition is more open to challenge from providers and more difficult to enforce. On the other hand, as I said, “significant” has an established precedent in legislation, making it easier for Ofcom, providers and the courts to interpret.
The noble Lord, Lord Knight, talked about the importance of future-proofing the Bill and emerging technologies. As he knows, the Bill has been designed to be technology neutral and future-proofed, to ensure that it keeps pace with emerging technologies. It will apply to companies which enable users to share content online or to interact with each other, as well as to search services. Search services using AI-powered features will be in scope of the search duties. The Bill is also clear that content generated by AI bots is in scope where it interacts with user-generated content, such as bots on Twitter. The metaverse is also in scope of the Bill. Any service which enables users to interact as the metaverse does will have to conduct a child access test and comply with the child safety duties if it is likely to be accessed by children.
I know it has been said that the large language models, such as that used by ChatGPT, will be in scope when they are embedded in search, but are they in scope generally?
They are when they apply to companies enabling users to share content online and interact with each other or in terms of search. They apply in the context of the other duties set out in the Bill.
Amendments 19, 22, 298 and 299, tabled by my noble friend Lady Harding of Winscombe, seek to impose child safety duties on application stores. I am grateful to my noble friend and others for the collaborative approach that they have shown and for the time that they have dedicated to discussing this issue since Second Reading. I appreciate that she has tabled these amendments in the spirit of facilitating a conversation, which I am willing to continue to have as the Bill progresses.
As my noble friend knows from our discussions, there are challenges with bringing application stores—or “app stores” as they are popularly called—into the scope of the Bill. Introducing new duties on such stores at this stage risks slowing the implementation of the existing child safety duties, in the way that I have just outlined. App stores operate differently from user-to-user and search services; they pose different levels of risk and play a different role in users’ experiences online. Ofcom would therefore need to recruit different people, or bring in new expertise, to supervise effectively a substantially different regime. That would take time and resources away from its existing priorities.
We do not think that that would be a worthwhile new route for Ofcom, given that placing child safety duties on app stores is unlikely to deliver any additional protections for children using services that are already in the scope of the Bill. Those services must already comply with their duties to keep children safe or will face enforcement action if they do not. If companies do not comply, Ofcom can rely on its existing enforcement powers to require app stores to remove applications that are harmful to children. I am happy to continue to discuss this matter with my noble friend and the noble Lord, Lord Knight, in the context of the differing implementation timelines, as he has asked.
The Minister just said something that was material to this debate. He said that Ofcom has existing powers to prevent app stores from providing material that would have caused problems for the services to which they allow access. Can he confirm that?
Perhaps the noble Lord could clarify his question; I was too busy finishing my answer to the noble Lord, Lord Knight.
It is a continuation of the point raised by the noble Baroness, Lady Harding, and it seems that it will go part of the way towards resolving the differences that remain between the Minister and the noble Baroness, which I hope can be bridged. Let me put it this way: is it the case that Ofcom either now has powers or will have powers, as a result of the Bill, to require app stores to stop supplying children with material that is deemed in breach of the law? That may be the basis for understanding how you can get through this. Is that right?
Services already have to comply with their duties to keep children safe. If they do not comply, Ofcom has enforcement powers, set out in the Bill, under which it can require app stores to remove applications that are harmful to children. We think this already addresses the point, but I am happy to continue discussing it offline with the noble Lord, my noble friend and others who want to explore how it would work. As I say, we think this is already covered. A more general duty here would risk distracting from Ofcom’s existing priorities.
My Lords, on that point, my reading of Clauses 131 to 135, where the Bill sets out the business disruption measures, is that they could be used precisely in that way. It would be helpful for the Minister responding later to clarify that Ofcom would use those business disruption measures, as the Government explicitly anticipate, were an app store, in a rogue way, to continue to list a service that Ofcom has said should not be made available to people in the United Kingdom.
I will be very happy to set that out in more detail.
Amendments 33A and 217A in the name of the noble Lord, Lord Storey, would place a new duty on user-to-user services that predominantly enable online gaming. Specifically, they would require them to have a classification certificate stating the age group for which they are suitable. We do not think that is necessary, given that there is already widespread, voluntary uptake of approval classification systems in online gaming.
My Lords, it has certainly been an interesting debate, and I am grateful to noble Lords on all sides of the Committee for their contributions and considerations. I particularly thank the noble Lords who tabled the amendments which have shaped the debate today.
In general, on these Benches, we believe that the Bill offers a proportionate approach to tackling online harms. We feel that granting some of the exemptions proposed in this group would be unintentionally counterproductive and would raise some unforeseen difficulties. The key here—and it has been raised by a number of noble Lords, including the noble Baronesses, Lady Harding and Lady Kidron, and, just now, the noble Lord, Lord Clement-Jones, who talked about the wider considerations of the Joint Committee and factors that should be taken into account—is that we endorse a risk-based approach. In this debate, it is very important that we take ourselves back to that, because that is the key.
My view is that using other factors, such as funding sources or volunteer engagement in moderation, cuts right across this risk-based approach. To refer to Amendment 4, it is absolutely the case that platforms with fewer than 1 million UK monthly users have scope to create considerable harm. Indeed, noble Lords will have seen that later amendments call for certain small platforms to be categorised on the basis of the risk—and that is the important word—that they engender, rather than the size of the platform, which, unfortunately, is something of a crude measure. The point that I want to make to the noble Baroness, Lady Fox, is that it is not about the size of the businesses and how they are categorised but what they actually do. The noble Baroness, Lady Kidron, rightly said that small is not safe, for all the reasons that were explained, including by the noble Baroness, Lady Harding.
Amendment 9 would exempt small and medium-sized enterprises and certain other organisations from most of the Bill’s provisions. I am in no doubt about the well-meaning nature of this amendment, tabled by the noble Lord, Lord Moylan, and supported by the noble Lord, Lord Vaizey. Indeed, there may well be an issue about how start-ups and entrepreneur unicorns cope with the regulatory framework. We should attend to that, and I am sure that the Minister will have something to say about it. But I also expect that the Minister will outline why this would actually be unhelpful in combating many of the issues that this Bill is fundamentally designed to deal with if we were to go down the road of these exclusions.
In particular, granting exemptions simply on the basis of a service’s size could lead to a situation where user numbers are capped or perhaps even where platforms are deliberately broken up to avoid regulation. This would have an effect that none of us in this Chamber would want to see because it would embed harmful content and behaviour rather than helping to reduce them.
Referring back to the comments of the noble Lord, Lord Moylan, I agree with the noble Lord, Lord Vaizey, in his reflection. I, too, have not experienced the two sides of the Chamber that the noble Lord, Lord Moylan, described. I feel that the Chamber has always been united on the matter of child safety and in understanding the ramifications for business. It is the case that good legislation must always seek a balance, but, to go back to the point about excluding small and medium-sized enterprises, to call them a major part of the British economy is a bit of an understatement when they account for 99.9% of the business population. In respect of the exclusion of community-based services, including Wikipedia—and we will return to this in the next group—there is nothing for platforms to fear if they have appropriate systems in place. Indeed, there are many gains to be had for community-based services such as Wikipedia from being inside the system. I look forward to the further debate that we will have on that.
I turn to Amendment 9A in the name of my noble friend Lord Knight of Weymouth, who is unable to participate in this section of the debate. It probes how the Bill’s measures would apply to specialised search services. Metasearch engines such as Skyscanner have expressed concern that the legislation might impose unnecessary burdens on services that pose little risk of hosting the illegal content targeted by the Bill. Perhaps the Minister, in his response, could confirm whether or not such search engines are in scope. That would perhaps be helpful to our deliberations today.
While we on these Benches are not generally supportive of exemptions, the reality is that there are a number of online search services that return content that would not ordinarily be considered harmful. Sites such as Skyscanner and Expedia, as we all know, allow people to search for and book flights and other travel services such as car hire. Obviously, as long as appropriate due diligence is carried out on partners and travel agents, the scope for users to encounter illegal or harmful material appears to be minimal and returns us to the point of having a risk-based approach. We are not necessarily advocating for a carve-out from the Bill, but it would perhaps be helpful to our deliberations if the Minister could outline how such platforms will be expected to interact with the Ofcom-run online safety regime.
My Lords, I am sympathetic to arguments that we must avoid imposing disproportionate burdens on regulated services, but I cannot accept the amendments tabled by the noble Baroness, Lady Fox, and others. Doing so would greatly reduce the strong protections that the Bill offers to internet users, particularly to children. I agree with the noble Baroness, Lady Merron, that that has long been the shared focus across your Lordships’ House as we seek to strike the right balance through the Bill. I hope to reassure noble Lords about the justification for the existing balance and scope, and the safeguards built in to prevent undue burdens to business.
I will start with the amendments tabled by the noble Baroness, Lady Fox of Buckley—Amendments 4, 6 to 8, 12, 288 and 305—which would significantly narrow the definition of services in scope of regulation. The current scope of the Bill reflects evidence of where harm is manifested online. There is clear evidence that smaller services can pose a significant risk of harm from illegal content, as well as to children, as the noble Baroness, Lady Kidron, rightly echoed. Moreover, harmful content and activity often range across a number of services. While illegal content or activity may originate on larger platforms, offenders often seek to move to smaller platforms with less effective systems for tackling criminal activity in order to circumvent those protections. Exempting smaller services from regulation would likely accelerate that process, resulting in illegal content being displaced on to smaller services, putting users at risk.
These amendments would create significant new loopholes in regulation. Rather than relying on platforms and search services to identify and manage risk proactively, they would require Ofcom to monitor smaller harmful services, which would further annoy my noble friend Lord Moylan. Let me reassure the noble Baroness, however, that the Bill has been designed to avoid disproportionate or unnecessary burdens on smaller services. All duties on services are proportionate to the risk of harm and the capacity of companies. This means that small, low-risk services will have minimal duties imposed on them. Ofcom’s guidance and codes of practice will set out how they can comply with their duties, in a way that I hope is even clearer than the Explanatory Notes to the Bill, but certainly allowing for companies to have a conversation and ask for areas of clarification, if that is still needed. They will ensure that low-risk services do not have to undertake unnecessary measures if they do not pose a risk of harm to their users.
My Lords, while my noble friend is talking about the possibility of excessive and disproportionate burden on businesses, can I just ask him about the possibility of excessive and disproportionate burden on the regulator? He seems to be saying that Ofcom is going to have to maintain, and keep up to date regularly, 25,000 risk assessments—this is on the Government’s own assessment, produced 15 months ago, of the state of the market then—even if those assessments carried out by Ofcom result in very little consequence for the regulated entity.
We know from regulation in this country that regulators already cannot cope with the burdens placed on them. They become inefficient, sclerotic and unresponsive; they have difficulty in recruiting staff of the same level and skills as the entities that they regulate. We have a Financial Services and Markets Bill going through at the moment, and the FCA is a very good example of that. Do we really think that this is a sensible burden to place on a regulator that is actually able to discharge it?
The Bill creates a substantial new role for Ofcom, but it has already substantially recruited and prepared for the effective carrying out of that new duty. I do not know whether my noble friend was in some of the briefings with officials from Ofcom, but it is very happy to set out directly the ways in which it is already discharging, or preparing to discharge, those duties. The Government have provided it with further resource to enable it to do so. It may be helpful for my noble friend to have some of those discussions directly with the regulator, but we are confident that it is ready to discharge its duties, as set out in the Bill.
I was about to say that we have already had a bit of discussion on Wikipedia. I am conscious that we are going to touch on it again in the debate on the next group of amendments so, at the risk of being marked down for repetition, which is a black mark on that platform, I shall not pre-empt what I will say shortly. But I emphasise that the Bill does not impose prescriptive, one-size-fits-all duties on services. The codes of practice from Ofcom will set out a range of measures that are appropriate for different types of services in scope. Companies can follow their own routes to compliance, so long as they are confident that they are effectively managing risks associated with legal content and, where relevant, harm to children. That will ensure that services that already use community moderation effectively can continue to do so—such as Wikipedia, which successfully uses that to moderate content. As I say, we will touch on that more in the debate on the next group.
Amendment 9, in the name of my noble friend Lord Moylan, is designed to exempt small and medium-sized enterprises working to benefit the public from the scope of the Bill. Again, I am sympathetic to the objective of ensuring that the Bill does not impose undue burdens on small businesses, and particularly that it should not inhibit services from providing valuable content of public benefit, but I do not think it would be feasible to exempt service providers deemed to be
“working to benefit the public”.
I appreciate that this is a probing amendment, but the wording that my noble friend has alighted on highlights the difficulties of finding something suitably precise and not contestable. It would be challenging to identify which services should qualify for such an exemption.
Taking small services out of scope would significantly undermine the framework established by the Bill, as we know that many smaller services host illegal content and pose a threat to children. Again, let me reassure noble Lords that the Bill has been designed to avoid disproportionate or unnecessary regulatory burdens on small and low-risk services. It will not impose a disproportionate burden on services or impede users’ access to valuable content on smaller services.
Amendment 9A in the name of the noble Lord, Lord Knight of Weymouth, is designed to exempt “sector specific search services” from the scope of the Bill, as the noble Baroness, Lady Merron, explained. Again, I am sympathetic to the intention here of ensuring that the Bill does not impose a disproportionate burden on services, but this is another amendment that is not needed as it would exempt search services that may pose a significant risk of harm to children, or because of illegal content on them. The amendment aims to exempt specialised search services—that is, those that allow users to
“search for … products or services … in a particular sector”.
It would exempt specialised search services that could cause harm to children or host illegal content—for example, pornographic search services or commercial search services that could facilitate online fraud. I know the noble Lord would not want to see that.
The regulatory duties apply only where there is a significant risk of harm and the scope has been designed to exclude low-risk search services. The duties therefore do not apply to search engines that search a single database or website, for example those of many retailers or other commercial websites. Even where a search service is in scope, the duties on services are proportionate to the risk of harm that they pose to users, as well as to a company’s size and capacity. Low-risk services, for example, will have minimal duties. Ofcom will ensure that these services can quickly and easily comply by publishing risk profiles for low-risk services, enabling them easily to understand their risk levels and, if necessary, take steps to mitigate them.
The noble Lord, Lord McCrea, asked some questions about the 200 most popular pornographic websites. If I may, I will respond to the questions he posed, along with others that I am sure will come in the debate on the fifth group, when we debate the amendments in the names of the noble Lord, Lord Morrow, and the noble Baroness, Lady Ritchie of Downpatrick, because that will take us on to the same territory.
I hope that provides some assurance to my noble friend Lord Moylan, the noble Baroness, Lady Fox, and others, and that they will be willing not to press their amendments in this group.
My Lords, I thank people for such a wide-ranging and interesting set of contributions. I take comfort from the fact that so many people understood what the amendments were trying to do, even if they did not fully succeed in that. I thought it was quite interesting that in the first debate the noble Lord, Lord Allan of Hallam, said that he might be a bit isolated on the apps, but I actually agreed with him—which might not do his reputation any good. However, when he said that, I thought, “Welcome to my world”, so I am quite pleased that this has not all been shot down in flames before we started. My amendment really was a serious attempt to tackle something that is a real problem.
The Minister says that the Bill is designed to avoid disproportionate burdens on services. All I can say is, “Sack the designer”. It is absolutely going to have a disproportionate burden on a wide range of small services, which will not be able to cope, and that is why so many of them are worried about it. Some 80% of the companies that will be caught up in this red tape are small and micro-businesses. I will come to the small business point in a moment.
The noble Baroness, Lady Harding, warned us that small tech businesses become big tech businesses. As far as I am concerned, that is a success story—it is what I want; is it not what we all want? Personally, I think economic development and growth is a positive thing—I do not want them to fail. However, I do not think it will ever happen; I do not think that small tech businesses will ever grow into big tech businesses if they face a disproportionate burden in the regulatory sense, as I have tried to describe. That is what I am worried about, and it is not a positive thing to be celebrated.
I stress that it is not just small tech and big tech. There are also community sites, based on collective moderation. Wikipedia has had a lot of discussion here. For a Bill that stresses that it wants to empower users, we should think about what it means when these user-moderated community sites are telling us that they will not be able to carry on and get through this. That is what they are saying. It was interesting that the noble Lord, Lord Clement-Jones, said that he relies on Wikipedia—many of us do, although please do not believe what it says about me. There are all of these things, but then there was a feeling that, well, Reddit is a bit dodgy. The Bill is not meant to decide which ones to trust in quite that way, or to arbitrate people’s tastes.
I was struck that the noble Baroness, Lady Kidron, said that small is not safe, and used the incel example. I am not emphasising that small is safe; I am saying that the small entities will not survive this process. That is my fear. I do not mean that the big ones are nasty and dangerous and the small ones are cosy, lovely and Wikipedia-like. I am suggesting that smaller entities will not be able to survive the regulatory onslaught. That is the main reason I raised this.
The noble Baroness, Lady Merron, said that these entities can cause great harm. I am worried about a culture of fear, in which we demonise tens of thousands of innocent tech businesses and communities and end up destroying them when we do not intend to. I tried to put in the amendment an ability for Ofcom, if there are problematic sites that are risky, to deal with them. As the Minister kept saying, low-risk search engines have been exempted. I am suggesting that low-risk small and micro-businesses be exempted, which is the majority of them. That is what I am suggesting, rather than that we assume they are all guilty and they then have to seek exemption.
Interestingly, the noble Lord, Lord McCrea, asked how many pornography sites are in scope and which pornographic websites have a million or fewer users. I am glad I do not know the answer to that, otherwise people might wonder why I did. The point is that there are always going to be sites that are threatening or a risk to children, as we are discussing. But we must always bear in mind—this was the important point that the noble Lord, Lord Moylan, made—that in our absolute determination to protect children via this Bill we do not unintentionally damage society as a whole. Adult access to free speech, for example, is one of my concerns, as are businesses and so on. We should not have that as an outcome.
Like others, I had prepared quite extensive notes to respond to what I thought the noble Lord was going to say about his amendments in this group, and I have not been able to find anything left that I can use, so I am going to have to extemporise slightly. I think it is very helpful to have a little non-focused discussion about what we are about to talk about in terms of age, because there is a snare and a delusion in quite a lot of it. I was put in mind of that in the discussions on the Digital Economy Act, which of course precedes the Minister but is certainly still alive in our thinking: in fact, we were talking about it earlier today.
The problem I see is that we have to find a way of squaring two quite different approaches. One is to prevent those who should not see material from seeing it, because it is illegal for them to see it. The other is to find a way of ensuring that we do not end up with an age-gated internet, which, I am grateful to find, we all seem to be agreed about: that is very good to know.
Age is very tricky, as we have heard, and it is not the only consideration we have to bear in mind in wondering whether people should be able to gain access to areas of the internet which we know will be bad and difficult for them. That leads us, of course, to the question about legal but harmful, now resolved—or is it? We are going to have this debate about age assurance and what it is. What is age verification? How do they differ? How does it matter? Is 18 a fixed and final point at which we are going to say that childhood ends and adulthood begins, and therefore one is open for everything? It is exactly the point made earlier about how to care for those who should not be exposed to material which, although legal for them by a number called age, is not appropriate for them in any of the circumstances which, clinically, we might want to bring to bear.
I do not think we are going to resolve these issues today—I hope not. We are going to talk about them for ever, but at this stage I think we still need a bit of thinking outside a box which says that age is the answer to a lot of the problems we have. I do not think it is, but whether the Bill is going to carry that forward I have my doubts. How we get that to the next stage, I do not know, but I am looking forward to hearing the Minister’s comments on it.
My Lords, I agree that this has been a rather unfortunate grouping and has led to a slightly strange debate. I apologise if it is the result of advice given to my noble friend. I know there has been some degrouping as well, which has led to slightly odd combinations today. However, as promised, I shall say a bit more about Wikipedia in relation to my noble friend’s Amendments 10 and 11.
The effect of these amendments would be that moderation actions carried out by users—in other words, community moderation of user-to-user and search services—would not be in scope of the Bill. The Government support the use of effective user or community moderation by services where this is appropriate for the service in question. As I said on the previous group, as demonstrated by services such as Wikipedia, this can be a valuable and effective means of moderating content and sharing information. That is why the Bill does not impose a one-size-fits-all requirement on services, but instead allows services to adopt their own approaches to compliance, so long as these are effective. The noble Lord, Lord Allan of Hallam, dwelt on this. I should be clear that duties will not be imposed on individual community moderators; the duties are on platforms to tackle illegal content and protect children. Platforms can achieve this through, among other things, centralised or community moderation. Ultimately, however, it is they who are responsible for ensuring compliance and it is platforms, not community moderators, who will face enforcement action if they fail to do so.
My Lords, this group of government amendments relates to risk assessments; it may be helpful if I speak to them now as the final group before the dinner break.
Risk management is at the heart of the Bill’s regulatory framework. Ofcom and services’ risk assessments will form the foundation for protecting users from illegal content and content which is harmful to children. They will ensure that providers thoroughly identify the risks on their own websites, enabling them to manage and mitigate the potential harms arising from them. Ofcom will set out the risks across the sector and issue guidance to companies on how to conduct their assessments effectively. All providers will be required to carry out risk assessments, keep them up to date and update them before making a significant change to the design or operation of their service which could put their users at risk. Providers will then need to put in place measures to manage and mitigate the risks they identify in their risk assessments, including any emerging risks.
Given how crucial the risk assessments are to this framework, it is essential that we enable them to be properly scrutinised by the public. The government amendments in this group will place new duties on providers of the largest services—that is, category 1 and 2A services—to publish summaries of their illegal and child safety risk assessments. Through these amendments, providers of these services will also have a new duty to send full records of their risk assessments to Ofcom. This will increase transparency about the risk of harm on the largest platforms, clearly showing how risk is affected by factors such as the design, user base or functionality of their services. These amendments will further ensure that the risk assessments can be properly assessed by internet users, including by children and their parents and guardians, by ensuring that summaries of the assessments are publicly available. This will empower users to make informed decisions when choosing whether and how to use these services.
It is also important that Ofcom is fully appraised of the risks identified by service providers. That is why these amendments introduce duties for both category 1 and 2A services to send their records of these risk assessments, in full, to Ofcom. This will make it easier for Ofcom to supervise compliance with the risk assessment duties, as well as other duties linked to the findings of the risk assessments, rather than having to request the assessments from companies under its information-gathering powers.
These amendments also clarify that companies must keep a record of all aspects of their risk assessments, which strengthens the existing record-keeping duties on services. I hope that noble Lords will welcome these amendments. I beg to move.
My Lords, it is risky to stand between people and their dinner, but I rise very briefly to welcome these amendments. We should celebrate the good stuff that happens in Committee as well as the challenging stuff. The risk assessments are, I think, the single most positive part of this legislation. Online platforms already do a lot of work trying to understand what risks are taking place on their platforms, which never sees the light of day except when it is leaked by a whistleblower and we then have a very imperfect debate around it.
The fact that platforms will have to do a formal risk assessment and share it with a third-party regulator is huge progress; it will create a very positive dynamic. The fact that the public will be able to see those risk assessments and make their own judgments about which services to use—according to how well they have done them—is, again, a massive public benefit. We should welcome the fact that risk assessments are there and the improvements that this group of amendments makes to them. I hope that was short enough.
My Lords, I am grateful to the Minister for introducing this group, and we certainly welcome this tranche of government amendments. We know that there are more to come both in Committee and as we proceed to Report, and we look forward to seeing them.
The amendments in this group, as other noble Lords have said, amount to a very sensible series of changes to services’ risk-assessment duties. This perhaps raises the question of why they were not included in earlier drafts of the Bill, but we are glad to see them now.
There is, of course, the issue of precisely where some of the information will appear, as well as the wider status of terms of service. I am sure those issues will be discussed in later debates. It is certainly welcome that the department is introducing stronger requirements around the information that must be made available to users; it will all help to make this a stronger and more practical Bill.
We all know that users need to be able to make informed decisions, and it will not be possible if they are required to view multiple statements and various documents. It seems that the requirements for information to be provided to Ofcom go to the very heart of the Bill, and I suggest that the proposed system will work best if there is trust and transparency between the regulator and those who are regulated. I am sure that there will be further debate on the scope of risk assessments, particularly on issues that were dropped from previous iterations of the Bill, and certainly this is a reasonable starting point today.
I will try to be as swift as possible as I raise a few key issues. One is about avoiding warnings that are at such a high level of generality that they get put on to everything. Perhaps the Minister could indicate how Ofcom will ensure that the summaries are useful and accessible to the reader. The test, of course, should be that a summary is suitable and sufficient for a prospective user to form an assessment of the likely risk they would encounter when using the service, taking into account any special vulnerabilities that they might have. That needs to be the test; perhaps the Minister could confirm that.
Is the terms of service section the correct place to put a summary of the illegal content risk assessment? Research suggests, unsurprisingly, that only 3% of people read terms before signing up—although I recall that, in an earlier debate, the Minister confessed that he had read all the terms and conditions of his mobile phone contract, so he may be one of the 3%. It is without doubt that any individual should be supported in their ability to make choices, and the duty should perhaps instead be to display a summary of the risks with due prominence, to ensure that anyone who is considering signing up to a service is really able to read it.
I also ask the Minister to confirm that, despite the changes to Clause 19 in Amendment 16B, the duty to keep records of risk assessments will continue to apply to all companies, but with an enhanced responsibility for category 1 companies.
I am grateful to noble Lords for their questions on this, and particularly grateful to the noble Lord, Lord Allan, and the noble Baroness, Lady Kidron, for their chorus of welcome. Where we are able to make changes, we will of course bring them forward, and I am glad to be able to bring forward this tranche now.
As the noble Lord, Lord Allan, said, ensuring the transparency of services’ risk assessments will further ensure that the framework of the Bill delivers its core objectives relating to effective risk management and increased accountability regarding regulated services. As we have discussed, it is imperative that these providers take a thorough approach to identifying risks, including emerging risks. The Government believe that it is of the utmost importance that the public are able effectively to scrutinise the risk assessments of the largest in-scope services, so that users can be empowered to make informed decisions about whether and how to use their services.
On the questions from the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, about why it is just category 1 and category 2A services, we estimate that there will be around 25,000 UK service providers in scope of the Bill’s illegal and child safety duties. Requiring all these companies to publish full risk assessments and proactively to send them to Ofcom could undermine the Bill’s risk-based and proportionate approach, as we have discussed in previous groups on the burdens to business. A large number of these companies are likely to be low risk and it is unlikely that many people will seek out their risk assessments, so requiring all companies to publish them would be an excessive regulatory burden.
There would also be an expectation that Ofcom would proactively monitor a whole range of services, even ones that posed a minimal risk to users. That in turn could distract Ofcom from taking a risk-based approach in its regulation by overwhelming it with paperwork from thousands of low-risk services. If Ofcom wants to see records of the risk assessments of providers that are not category 1 or category 2A services, it has extensive information-gathering powers that it can use to require a provider to send it such records.
The noble Baroness, Lady Merron, was right to say that I read the terms of my broadband supply—I plead guilty to the nerdiness of doing that—but I have not read all the terms and conditions of every application and social medium I have downloaded, and I agree that many people do skim through them. They say the most commonly told lie on the planet at the moment is “I agree to the terms and conditions”, and the noble Baroness is right to point to the need for these to be intelligible, easily accessible and transparent—which of course we want to see.
In answer to her other question, the record-keeping duty will apply to all companies, but the requirement to publish is only for category 1 and category 2A companies.
The noble Baroness, Lady Kidron, asked me about Amendment 27A. If she will permit me, I will write to her with the best and fullest answer to that question.
I am grateful to noble Lords for their questions on this group of amendments.
(1 year, 7 months ago)
Lords Chamber
My Lords, first, I will address Amendments 12BA, 183A and 183B, tabled by the noble Baroness, Lady Ritchie of Downpatrick, with whom I was grateful to discuss them earlier today, and the noble Lord, Lord Morrow; I am grateful to his noble friend, the noble Lord, Lord Browne of Belmont, for speaking to them on his behalf.
These amendments seek to apply the duties in Part 5 of the Bill, which are focused on published pornographic content, to user-generated pornography. Amendments 183A and 183B are focused particularly on making sure that children are protected from user-to-user pornography in the same way as from published pornography, including through the use of age verification. I reassure the noble Baroness and the noble Lord that the Government share their concerns; there is clear evidence about the impact of pornography on young people and the need to protect children from it.
This is where I come to the questions posed earlier by the noble Lord, Lord McCrea of Magherafelt and Cookstown. The research we commissioned from the British Board of Film Classification assessed the functionality of and traffic to the UK’s top 200 most visited pornographic websites. The findings indicated that 128 of the top 200 most visited pornographic websites—that is just under two-thirds, or 64%—would have been captured by the proposed scope of the Bill at the time of the Government’s initial response to the online harms White Paper, and that represents 85% of the traffic to those 200 websites.
Since then, the Bill’s scope has been broadened to include search services and pornography publishers, meaning that children will be protected from pornography wherever it appears online. The Government expect companies to use age-verification technologies to prevent children accessing services which pose the highest risk to children, such as online pornography. Age-assurance technologies and other measures will be used to provide children with an age-appropriate experience on their service.
As noble Lords know, the Bill does not mandate that companies use specific approaches or technologies when keeping children safe online as it is important that the Bill is future-proofed: what is effective today might not be so effective in the future. Moreover, age verification may not always be the most appropriate or effective approach for user-to-user companies to comply with their duties under the Bill. For instance, if a user-to-user service, such as a social medium, does not allow pornography under its terms of service, measures such as strengthening content moderation and user reporting would be more appropriate and effective for protecting children than age verification. That would allow content to be better detected and removed, instead of restricting children from a service that is designed to be appropriate for their use—as my noble friend Lady Harding of Winscombe puts it, avoiding the situation where children are removed from these services altogether.
While I am sympathetic to the aims of these amendments, I assure noble Lords that the Bill already has robust, comprehensive protections in place to keep children safe from all pornographic content, wherever or however it appears online. This amendment is therefore unnecessary because it duplicates the existing provisions for user-to-user pornography in the child safety duties in Part 3.
It is important to be clear that, wherever they are regulated in the Bill, companies will need to ensure that children cannot access pornographic content online. This is made clear, for user-to-user content, in Clause 11(3); for search services, in Clause 25(3); and for published pornographic content in Clause 72(2). Moving the regulation of pornography from Part 3 to Part 5 would not be a workable or desirable option because the framework is effective only if it is designed to reflect the characteristics of the services in scope.
Part 3 has been designed to address the particular issues arising from the rapid growth in platforms that allow the sharing of user-generated content but are not the ones choosing to upload that content. The scale and speed of dissemination of user-generated content online demands a risk-based and proportionate approach, as Part 3 sets out.
It is also important that these companies understand the risks to children in the round, rather than focusing on one particular type of content. Risks to children will often be a consequence of the design of these services—for instance, through algorithms, which need to be tackled holistically.
I know that the noble Baroness is concerned about whether pornography will indeed be designated as primary priority content for the purposes of the child safety duties in Clauses 11(3) and 25(3). The Government fully intend this to be the case, which means that user-to-user services will need to have appropriate systems to prevent children accessing pornography, as defined in Clause 70(2).
The approach taken in Part 3 is very different from services captured under Part 5, which are publishing content directly, know exactly where it is located on their site and already face legal liability for the content. In this situation the service has full control over its content, so a risk-based approach is not appropriate. It is reasonable to expect that service to prevent children accessing pornography. We do not therefore consider it necessary or effective to apply the Part 5 duties to user-to-user pornographic content.
I also assure the noble Baroness and the noble Lord that, in a case where a provider of user-to-user services is directly publishing pornographic content on its own service, it will already be subject to the Part 5 duties in relation to that particular content. Those duties in relation to that published pornographic content will be separate from and in addition to their Part 3 duties in relation to user-generated pornographic content.
This means that, no matter where published pornographic content appears, the obligation to ensure that children are not normally able to encounter it will apply to all in-scope internet service providers that publish pornographic content. This is made clear in Clause 71(2) and is regardless of whether they also offer user-to-user or search services.
I am sorry, but can the Minister just clarify that? Is he saying that it is not possible to be covered by both Part 3 and Part 5, so that where a Part 5 service has user-generated content it is also covered by Part 3? Can he clarify that you cannot just escape Part 5 by adding user-generated content?
Yes, that is correct. I was trying to address the points raised by the noble Baroness, but the noble Lord is right. The point on whether people might try to be treated differently by allowing comments or reviews on their content is that they would be treated the same way. That is the motivation behind the noble Baroness’s amendment trying to narrow the definition. There is no risk that a publisher of pornographic content could evade their Part 5 duties by enabling comments or reviews on their content. That would be the case whether or not those reviews contained words, non-verbal indications that a user liked something, emojis or any other form of user-generated content.
That is because the Bill has been designed to confer duties on different types of content. Any service with provider pornographic content will need to comply with the Part 5 duties to ensure that children cannot normally encounter such content. If they add user-generated functionality—
I am sorry to come back to the same point, but let us take the Twitter example. As a publisher of pornography, does Twitter then inherit Part 5 responsibilities in as much as it is publishing pornography?
It is covered in the Bill as Twitter. I am not quite sure what my noble friend is asking me. The harms that he is worried about are covered in different ways. Twitter or another social medium that hosts such content would be hosting it, not publishing it, so would be covered by Part 3 in that instance.
Maybe my noble friend the Minister could write to me to clarify that point, because it is quite a significant one.
Perhaps I will speak to the noble Lord afterwards and make sure I have his question right before I do so.
I hope that answers the questions from the noble Baroness, Lady Ritchie, and that on that basis, she will be happy to withdraw her amendment.
My Lords, this has been a very wide-ranging debate, concentrating not only on the definition of pornography but on the views of noble Lords in relation to how it should be regulated, and whether it should be regulated, as the noble Baroness, Lady Kidron, the noble Lords, Lord Bethell and Lord Browne, and I myself believe, or whether it should be a graduated response, which seems to be the view of the noble Lords, Lord Allan and Lord Clement-Jones.
I believe that all pornography should be treated the same. There is no graduated response. It is something that is pernicious and leads to unintended consequences for many young people, so therefore it needs to be regulated in all its forms. I think that is the point that the noble Lord, Lord Bethell, was making. I believe that these amendments should have been debated along with those of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, because then we could have an ever wider-ranging debate, and I look forward to that in the further groups in the days to come. The focus should be on the content, not on the platform, and the content is about pornography.
I agree with the noble Baroness, Lady Kidron, that porn is not the only harm, and I will be supporting her amendments. I believe that they should be in the Bill because if we are serious about dealing with these issues, they have to be in there.
I do not think my amendments are suggesting that children will be removed from social media. I agree that it is a choice to remove pornography or to age-gate. Twitter is moving to subscriber content anyway, so it can do it; the technology is already available to do that. I believe you just age-gate the porn content, not the whole site. I agree with the noble Lord, Lord Clement-Jones, as I said. These amendments should have been debated in conjunction with those of the noble Lord, Lord Bethell, and the noble Baroness, Lady Kidron, as I believe that the amendments in this group are complementary to those, and I think I already said that in my original submission.
I found the Minister’s response interesting. Obviously, I would like time to read Hansard. I think certain undertakings were given, but I want to see clearly spelled out where they are and to discuss with colleagues across the House where we take these issues and what we come back with on Report.
I believe that these issues will be debated further in Committee when the amendments from the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, are debated. I hope that in the intervening period the Minister will have time to reflect on the issues raised today about Parts 3 and 5 and the issue of pornography, and that he will be able to help us in further sessions in assuaging the concerns that we have raised about pornography. There is no doubt that these issues will come back. The only way that they can be dealt with, that pornography can be dealt with and that all our children throughout the UK can be protected is through proper regulation.
I think we all need further reflection. I will see, along with colleagues, whether it is possible to come back on Report. In the meantime, I beg leave to withdraw the amendment.
(1 year, 7 months ago)
Lords Chamber
My Lords, I welcome this debate, which revisits some of the areas discussed in earlier debates about the scope of the Bill, as many noble Lords said. It allows your Lordships’ House to consider what has to be the primary driver for assessment. In my view and as others said, it ought to be about risk, which has to be the absolute driver in all this. As the noble Baroness, Lady Harding, said, businesses do not remain static: they start at a certain size and then change. Of course, we hope that many of the businesses we are talking about will grow, so this is about preparation for growth and the reality of doing business.
As we discussed, there certainly are cases where search providers may, by their very nature, be almost immune from presenting users with content that could be considered either harmful or illegal under this legislative framework. The new clause proposed by the noble Lord, Lord Moylan—I am grateful to him for allowing us to explore these matters—and its various consequential amendments, would limit the duty to prevent access to illegal content to core category 2A search providers, rather than all search providers, as is currently the case under Clause 23(3).
The argument that I believe the noble Lord, Lord Moylan, put forward is that the illegal content duty is unduly wide, placing a disproportionate and otherwise unacceptable burden on smaller and/or supposedly safer search providers. He clearly said he was not saying that small was safe—that is now completely understood—but he also said that absolute safety is not achievable. As the noble Baroness, Lady Kidron, said, that is indeed so. If this legislation is too complex and creates the wrong provisions, we will clearly be a long way away from our ambition, which here has to be to have in place the best legislative framework, one that everyone can work with and that provides the maximum opportunity for safety and what we all seek to achieve.
Of course, the flip side of the argument about an unacceptable burden on smaller, or on supposedly safer, search providers may be that they would in fact have very little work to do to comply with the illegal content duty, at least in the short term. But the duty would act as an important safeguard, should the provider’s usual systems prove ineffective with the passage of time. Again, that point was emphasised in this and the previous debate by the noble Baroness, Lady Harding.
We look forward to the Minister’s response to find out which view he and his department subscribe to or, indeed, whether they have another view they can bring to your Lordships’ House. But, on the face of it, the current arrangements do not appear unacceptably onerous.
Amendment 157 in the name of the noble Lord, Lord Pickles, and introduced by the noble Baroness, Lady Deech, takes a different approach to search by inserting requirements about search services’ publicly available statements into Clause 65. In the debate, the noble Baroness and the noble Lord, Lord Weir, raised very important, realistic examples of where search engines can take us, including to material that encourages racism and hatred directed at Jews and other groups. The amendment addresses issues such as the changing of algorithms or the hiding of content and the need to ensure that the terms of providers’ publicly available statements are applied consistently.
I look forward to hearing from the Minister in response to Amendment 157, as it certainly moves us beyond questions of scope and towards discussion of the conduct of platforms when harm is identified.
My Lords, I must first apologise for my slightly dishevelled appearance as I managed to spill coffee down my shirt on my way to the Chamber. I apologise for that—as the fumes from the dried coffee suffuse the air around me. It will certainly keep me caffeinated for the day ahead.
Search services play a critical role in users’ online experience, allowing them easily to find and access a broad range of information online. Their gateway function, as we have discussed previously, means that they also play an important role in keeping users safe online because they have significant influence over the content people encounter. The Bill therefore imposes stringent requirements on search services to tackle the risks from illegal content and to protect children.
Amendments 13, 15, 66 to 69 and 73 tabled by my noble friend Lord Moylan seek to narrow the scope of the Bill so that its search safety duties apply only to the largest search services—categorised in the Bill as category 2A services—rather than to all search services. Narrowing the scope in this way would have an adverse impact on the safety of people using search services, including children. Search services, including combined services, below the category 2A threshold would no longer have a duty to minimise the risk of users encountering illegal content or children encountering harmful content in or via search results. This would increase the likelihood of users, including children, accessing illegal content and children accessing harmful content through these services.
The Bill already takes a targeted approach and the duties on search services will be proportionate to the risk of harm and the capacity of companies. This means that services which are smaller and lower-risk will have a lighter regulatory burden than those which are larger and higher-risk. All search services will be required to conduct regular illegal content risk assessments and, where relevant, children’s risk assessments, and then implement proportionate mitigations to protect users, including children. Ofcom will set out in its codes of practice specific steps search services can take to ensure compliance and must ensure that these are proportionate to the size and capacity of the service.
The noble Baroness, Lady Kidron, and my noble friend Lady Harding of Winscombe asked how search services should conduct their risk assessments. Regulated search services will have a duty to conduct regular illegal content risk assessments, and where a service is likely to be accessed by children it will have a duty to conduct regular children’s risk assessments, as I say. They will be required to assess the level and nature of the risk of individuals encountering illegal content on their service, to implement proportionate mitigations to protect people from illegal content, and to monitor them for effectiveness. Services likely to be accessed by children will also be required to assess the nature and level of risk of their service specifically for children to identify and implement proportionate mitigations to keep children safe, and to monitor them for effectiveness as well.
Companies will also need to assess how the design and operation of the service may increase or reduce the risks identified and Ofcom will have a duty to issue guidance to assist providers in carrying out their risk assessments. That will ensure that providers have, for instance, sufficient clarity about what an appropriate risk assessment looks like for their type of service.
The noble Lord, Lord Allan, and others asked about definitions and I congratulate noble Lords on avoiding the obvious
“To be, or not to be”
pun in the debate we have just had. The noble Lord, Lord Allan, is right in the definition he set out. On the rationale for it, it is simply that we have designated as category 1 the largest and riskiest services and as category 2 the smaller and less risky ones, splitting them between 2A, search services, and 2B, user-to-user services. We think that is a clear framework. The definitions are set out a bit more in the Explanatory Notes but that is the rationale.
I am grateful to the Minister for that clarification. I take it then that the Government’s working assumption is that all search services, including the biggest ones, are by definition less risky than the larger user-to-user services. It is just a clarification that that is their thinking that has informed this.
As I said, the largest and riskiest sites may involve some which have search functions, so the test of large and most risky applies. Smaller and less risky search services are captured in category 2A.
Amendment 157 in the name of my noble friend Lord Pickles, and spoken to by the noble Baroness, Lady Deech, seeks to apply new duties on the largest search services. I agree with the objectives in my noble friend’s amendment of increasing transparency about the search services’ operations and enabling users to hold them to account. It is not, however, an amendment I can accept because it would duplicate existing duties while imposing new duties which we do not think are appropriate for search services.
As I say, the Bill will already require search services to set out how they are fulfilling their illegal content and child safety duties in publicly available statements. The largest search services—category 2A—will also be obliged to publish a summary of their risk assessments and to share this with Ofcom. That will ensure that users know what to expect on those search services. In addition, they will be subject to the Bill’s requirements relating to user reporting and redress. These will ensure that search services put in place effective and accessible mechanisms for users to report illegal content and content which is harmful to children.
My noble friend’s amendment would ensure that the requirement to comply with publicly available statements applied to all actions taken by a search service to prevent harm, not just those relating to illegal content and child safety. This would be a significant expansion of the duties, resulting in Ofcom overseeing how search services treat legal content which is accessed by adults. That runs counter to the Government’s stated desire to avoid labelling legal content which is accessed by adults as harmful. It is for adult users themselves to determine what legal content they consider harmful. It is not for us to put in place measures which could limit their access to legal content, however distasteful. That is not to say, of course, that where material becomes illegal in nature we do not share the determination of the noble Baroness, my noble friend and others to make sure that it is properly tackled. The Secretary of State and Ministers have had extensive meetings with groups making representations on this point and I am very happy to continue speaking to my noble friend, the noble Baroness and others if they would welcome it.
I hope that that provides enough reassurance for the amendment to be withdrawn at this stage.
My Lords, this has indeed been a very good debate on a large group of amendments. We have benefited from two former Ministers, the noble Lord, Lord McNally, and my noble friend Lord Kamall. I hope it is some solace to my noble friend that, such a hard act is he to follow, his role has been taken on by two of us on the Front Bench—myself at DCMS and my noble friend Lord Camrose at the new Department for Science, Innovation and Technology.
The amendments in this group are concerned with the protection of user privacy under the Bill and the maintenance of end-to-end encryption. As noble Lords have noted, there has been some recent coverage of this policy in the media. That reporting has not always been accurate, and I take this opportunity to set the record straight in a number of areas and seek to provide the clarity which the noble Lord, Lord Stevenson of Balmacara, asked for just now.
Encryption plays a crucial role in the digital realm, and the UK supports its responsible use. The Bill does not ban any service design, nor will it require services materially to weaken any design. The Bill contains strong safeguards for privacy. Broadly, its safety duties require platforms to use proportionate systems and processes to mitigate the risks to users resulting from illegal content and content that is harmful to children. In doing so, platforms must consider and implement safeguards for privacy, including ensuring that they are complying with their legal responsibilities under data protection law.
With regard to private messaging, Ofcom will set out how companies can comply with their duties in a way that recognises the importance of protecting users’ privacy. Importantly, the Bill is clear that Ofcom cannot require companies to use proactive technology, such as automated scanning, on private communications in order to comply with their safety duties.
In addition to these cross-cutting protections, there are further safeguards concerning Ofcom’s ability to require the use of proactive technology, such as content identification technology on public channels. That is in Clause 124(6) of the Bill. Ofcom must consider a number of matters, including the impact on privacy and whether less intrusive measures would have the equivalent effect, before it can require the use of proactive technology.
The implementation of end-to-end encryption in a way that intentionally blinds companies to criminal activity on their services, however, has a disastrous effect on child safety. The National Center for Missing & Exploited Children in the United States of America estimates that more than half its reports could be lost if end-to-end encryption were implemented without preserving the ability to tackle child sexual abuse—a conundrum with which noble Lords grappled today. That is why our new regulatory framework must encourage technology companies to ensure that their safety measures keep pace with this evolving and pernicious threat, including minimising the risk that criminals are able to use end-to-end encrypted services to facilitate child sexual abuse and exploitation.
Given the serious risk of harm to children, the regulator must have appropriate powers to compel companies to take the most effective action to tackle such illegal and reprehensible content and activity on their services, including in private communications, subject to stringent legal safeguards. Under Clause 110, Ofcom will have a stand-alone power to require a provider to use, or make best endeavours to develop, accredited technology to tackle child sexual exploitation and abuse, whether communicated publicly or privately, by issuing a notice. Ofcom will use this power as a last resort only when all other measures have proven insufficient adequately to address the risk. The only other type of harm for which Ofcom can use this power is terrorist content, and only on public communications.
The use of the power in Clause 110 is subject to additional robust safeguards to ensure appropriate protection of users’ rights online. Ofcom will be able to require the use of technology accredited as being highly accurate only in specifically detecting illegal child sexual exploitation and abuse content, ensuring a minimal risk that legal content is wrongly identified. In addition, under Clause 112, Ofcom must consider a number of matters, including privacy and whether less intrusive means would have the same effect, before deciding whether it is necessary and proportionate to issue a notice.
The Bill also includes vital procedural safeguards in relation to Ofcom’s use of the power. If Ofcom concludes that issuing a notice is necessary and proportionate, it will need to publish a warning notice to provide the company with an opportunity to make representations as to why the notice should not be issued or why the detail contained in it should be amended. In addition, the final notice must set out details of the rights of appeal under Clause 149. Users will also be able to complain to and seek action from a provider if the use of a specific technology results in their content being incorrectly removed, or if they consider that the technology is being used in a way that is not envisaged in the terms of service. Some of the examples given by the noble Baroness, Lady Fox of Buckley, pertain in this instance.
The Bill also recognises that in some cases there will be no available technology compatible with the particular service design. As I set out, this power cannot be used by Ofcom to require a company to take any action that is not proportionate, including removing or materially weakening encryption. That is why the Bill now includes an additional provision for this scenario, to allow Ofcom to require technology companies to use their best endeavours to develop or find new solutions that work on their services while meeting the same high standards of accuracy and privacy protection. Given the ingenuity and resourcefulness of the sector, it is reasonable to ask it to do everything possible to protect children from abuse and exploitation. I echo the comments made by the noble Lord, Lord Allan, about the work being done across the sector to do that.
More broadly, the regulator must uphold the right to privacy under its Human Rights Act obligations when implementing the new regime. It must ensure that its actions interfere with privacy only where it is lawful, necessary and proportionate to do so. I hope that addresses the question posed by the noble Lord, Lord Stevenson. In addition, Ofcom will be required to consult the Information Commissioner’s Office when developing codes of practice and relevant pieces of guidance.
I turn now to Amendments 14—
Before the Minister does so, can he give a sense of what he means by “best endeavours” for those technology companies? If it is not going to be general monitoring of what is happening as the message moves from point to point—we have had some discussions about the impracticality and issues attached to monitoring at one end or the other—what, theoretically, could “best endeavours” possibly look like?
I am hesitant to give too tight a definition, because we want to remain technology neutral and make sure that we are keeping an open mind to developing changes. I will think about that and write to the noble Lord. The best endeavours will inevitably change over time as new technological solutions present themselves. I point to the resourcefulness of the sector in identifying those, but I will see whether there is anything more I can add.
While the Minister is reflecting, I note that the words “best endeavours” are always a bit of a worry. The noble Lord, Lord Allan, made the good point that once material is on your phone, you are in trouble and you must report it, but the frustration of many people outside this Chamber is what comes next: how the journey of that piece of material can be traced without breaking encryption. I speak to the tech companies very often—indeed, I used to speak to the noble Lord, Lord Allan, when he was in post at what was then Facebook—but that is the question that we would like answered in this Committee, because the response that “It is nothing to do with us” is where our sympathy stops.
The noble Baroness’s intervention has given me an opportunity to note that I am about to say a little more on best endeavours, which will not fully answer the question from the noble Lord, Lord Knight, but I hope fleshes it out a little more.
I do that in turning to Amendments 14, 108 and 205, which seek to clarify that companies will not be required to undertake fundamental changes to the nature of their service, such as the removal or weakening of end-to-end encryption. As I previously set out, the Bill does not require companies to weaken or remove any design and there is no requirement for them to do so as part of their risk assessments or in response to a notice. Instead, companies will need to undertake risk assessments, including consideration of risks arising from the design of their services, before taking proportionate steps to mitigate and manage these risks. Where relevant, assessing the risks arising from end-to-end encryption will be an integral part of this process.
This risk management approach is well established in almost every other industry and it is right that we expect technology companies to take user safety into account when designing their products and services. We understand that technologies used to identify child sexual abuse and exploitation content, including on private communications, are in some cases nascent and complex. They continue to evolve, as I have said. That is why Ofcom has the power through the Bill to issue a notice requiring a company to make best endeavours to develop or source technology.
This notice will include clear, proportionate and enforceable steps that the company must take, based on the relevant information of the specific case. Before issuing a warning notice, Ofcom is expected to enter into informal consultation with the company and/or to exercise information-gathering powers to determine whether a notice is necessary and proportionate. This consultation period will assist in establishing what a notice to develop a technology may require and appropriate steps for the company to take to achieve best endeavours. That dialogue with Ofcom is part of the process.
There are a lot of phrases here—best endeavour, proportionate, appropriate steps—that are rather subjective. The concern of a number of noble Lords is that we want to address this issue but it is a matter of how it is applied. That is one of the reasons why noble Lords were asking for some input from the legal profession, a judge or otherwise, to make those judgments.
All the phrases used in the Bill are subject to the usual scrutiny through the judicial process—that is why we debate them now and think about their implications—but of course they can, and I am sure will, be tested in the usual legal ways. Once a company has developed a new technology that meets minimum standards of accuracy, Ofcom may require its use but not before considering matters including the impact on user privacy, as I have set out. The Bill does not specify which tools are likely to be required, as we cannot pre-empt Ofcom’s evidence-based and case-by-case assessment.
Amendment 285 intends to clarify that social media platforms will not be required to undertake general monitoring of the activity of their users. I agree that the protection of privacy is of utmost importance. I want to reassure noble Lords, in particular my noble friend Lady Stowell of Beeston, who asked about it, that the Bill does not require general monitoring of all content. The clear and strong safeguards for privacy will ensure that users’ rights are protected.
Setting out clear and specific safeguards will be more effective in protecting users’ privacy than adopting the approach set out in Amendment 285. Ofcom must consider a number of matters, including privacy, before it can require the use of proactive technology. The government amendments in this group, Amendments 290A to 290G, further clarify that technology which identifies words, phrases or images that indicate harm is subject to all of these restrictions. General monitoring is not a clearly defined concept—a point made just now by my noble friend Lord Kamall. It is used in EU law but is not clearly defined there, and it is not a concept in UK law. This lack of clarity could create uncertainty that some technology companies might attempt to exploit in order to avoid taking necessary and proportionate steps to protect their users. That is why we resist Amendment 285.
I understand the point the Minister is making, but it is absolutely crystal clear that, whatever phrase is used, the sensibility is quite clear that the Government are saying on record, at the Dispatch Box, that the Bill can in no way be read as requiring anybody to provide a view into private messaging or encrypted messaging unless there is good legal cause to suspect criminality. That is a point that the noble Baroness, Lady Stowell, made very clearly. One may not like the phrasing used in other legislatures, but could we find a form of words that will make it clear that those who are operating in this legal territory are absolutely certain about where they stand on that?
My Lords, I want to give clear reassurance that the Bill does not require general monitoring of all content. We have clear and strong safeguards for privacy in the Bill to ensure that users’ rights are protected. I set out the concerns about use of the phrase “general monitoring”. I hope that provides clarity, but I may have missed the noble Lord’s point. The brief answer to the question I think he was asking is yes.
Let the record stand clear: yes. It was the slight equivocation around how the Minister approached and left that point that I was worried about, and that people might seek to use that later. Words from the Dispatch Box are never absolute and they are never meant to be, but the fact that they have been said is important. I am sure that everybody understands that point, and the Minister did say “yes” to my question.
I did, and I am happy to say it again: yes.
Perhaps I might go back to an earlier point. When the Minister said that the Government want to make sure companies do not exploit uncertainty to avoid their obligations to keep their users safe, I think he was implying that certain companies would try to escape those obligations by threatening to leave or whatever. I want it to be clear that, in the instance of encrypted services, the obligations to the users of the service are to protect their privacy, and users see that as keeping them safe. It would be wrong to make those polar opposites. I think that companies that run unencrypted services believe that keeping users safe is what their duties amount to—so that, in a way, is a clash.
Secondly, I am delighted by the clarity in the Minister’s “yes” answer, but I think that maybe there needs to be clearer communication with people outside this Chamber. People are worried about whether duties placed on Ofcom to enact certain things would lead to some breach of encryption. No one thinks that the Government intend to do this or want to spy on anyone, but that the unintended consequences of the duty on Ofcom might have that effect. If that is not going to be the case, and that can be guaranteed by the Government, and they made that clear, it would reassure not just the companies but the users of messaging services, which would be helpful.
The points the noble Baroness has just made bring me neatly to what I was about to say in relation to the question raised earlier by the noble Lord, Lord Knight of Weymouth. But first, I would say that Ofcom as a public body is subject to public law principles already, so those apply in this case.
The noble Lord, Lord Knight, asked about virtual private networks and the risk of displacing people on to VPNs or other similar alternatives. That is a point worth noting, not just in this group but as we consider all these amendments, particularly when we talk later on about age verification, pornography and so on. Services will need to think about how safety measures could be circumvented and take steps to prevent that, because they need to mitigate risk effectively. There may also be a role in enforcement action, too; Ofcom will be able to apply to the courts to require these services where appropriate to apply business disruption measures. We should certainly be mindful of the incentives for people to do that, and the example the noble Lord, Lord Knight, gave earlier is a useful lesson in the old adage “Caveat emptor” when looking at some of these providers.
I want to say a little bit about Amendments 205A and 290H in my name. Given the scale of child sexual abuse and exploitation that takes place online, and the reprehensible nature of these crimes, it is important that Ofcom has effective powers to require companies to tackle it. This brings me to these government amendments, which make small changes to the powers in Clause 110 to ensure that they are effective. I will focus particularly, in the first instance, on Amendment 290H, which ensures that Ofcom considers whether a service has features that allow content to be shared widely via another service when deciding whether content has been communicated publicly or privately, including for the purposes of issuing a notice. This addresses an issue highlighted by the Independent Reviewer of Terrorism Legislation, Jonathan Hall, and Professor Stuart Macdonald in a recent paper. The separate, technical amendment, Amendment 205A, clarifies that Clause 110(7) refers only to a notice on a user-to-user service.
Amendment 190 in the name of the noble Lord, Lord Clement-Jones, seeks to introduce a new privacy duty on Ofcom when considering whether to use any of its powers. The extensive privacy safeguards that I have already set out, along with Ofcom’s human rights obligations, would make this amendment unnecessary. Ofcom must also explicitly consult persons whom it considers to have expertise in the enforcement of the criminal law and the protection of national security, which is relevant to online safety matters in the course of preparing its draft codes. This may include the integrity and security of internet services where relevant.
Amendments 202 and 206, in the name of the noble Lord, Lord Stevenson of Balmacara, and Amendments 207, 208, 244, 246, 247, 248, 249 and 250 in the name of the noble Lord, Lord Clement-Jones, all seek to deliver privacy safeguards to notices issued under Clause 110 through additional review and appeals processes. There are already strong safeguards concerning this power. As part of the warning notice process, companies will be able to make representations to Ofcom which it is bound to consider before issuing a notice. Ofcom must also review any notice before the end of the period for which it has effect.
Amendment 202 proposes mirroring the safeguards of the Investigatory Powers Act when issuing notices to encrypted messaging services under this power. First, this would be inappropriate, because the powers in the Investigatory Powers Act serve different purposes from those in this Bill. The different legal safeguards in the Investigatory Powers Act reflect the potential intrusion by the state into an individual’s private communications; that is not the case with this Bill, which does not grant investigatory powers to state bodies, such as the ability to intercept private communications. Secondly, making a reference to encryption would be—
Is that right? I do not need a yes or no answer. It was rhetorical; I am just trying to frame the right question. The Minister is making a very strong point about the difference between RIPA requirements and those that might be brought in under this Bill. But it does not really get to the bottom of the questions we were asking. In this situation, whatever the exact analogy between the two systems is, it is clear that Ofcom is marking its own homework—which is fair enough, as there are representations, but it is not getting external advice or seeking judicial approval.
The Minister’s point was that that was okay because it was private companies that were involved. But we are saying here that these would be criminal offences taking place, and therefore there is bound to be interest from the police and other agencies, including anti-terrorism agencies. It is clearly similar to the RIPA arrangements, so could he just revisit that?
Yes, I think it is right. The Investigatory Powers Act is a tool for law enforcement and intelligence agencies, whereas the Bill is designed to regulate technology companies—an important high-level distinction. As such, the Bill does not grant investigatory powers to state bodies. It does not allow the Government or the regulator to access private messages. Instead, it requires companies to implement proportionate systems and processes to tackle illegal content on their platforms. I will come on to say a little about legal redress and the role of the courts in looking at Ofcom’s decisions so, if I may, I will respond to that in a moment.
The Investigatory Powers Act includes a different form of technical notice, which is to put in place surveillance equipment. The noble Lord, Lord Stevenson, has a good point: we need to ensure that we do not have two regimes, both requiring companies to put in place technical equipment but with quite different standards applying.
I will certainly take that point away and I understand, of course, that different Acts require different duties of the same platforms. I will take that away and discuss it with colleagues in other departments who lead on investigatory powers.
Before my noble friend moves on, when he is reviewing that back in the office, could he also satisfy himself that the concerns coming from the journalism and news organisations in the context of RIPA are also understood and have been addressed? That is another angle which, from what my noble friend has said so far, I am not sure has really been acknowledged. That is not a criticism but it is worth him satisfying himself on it.
I am about to talk about the safeguards for journalists in the context of the Bill and the questions posed by the noble Baroness, Lady Bennett. However, I take my noble friend’s point about the implications of other Acts that are already on the statute book in that context as well.
Just to finish the train of thought of what I was saying on Amendment 202, making a reference to encryption, as it suggests, would be out of step with the wider approach of the Bill, which is to remain technology-neutral.
I come to the safeguards for journalistic protections, as touched on by the noble Baroness, Lady Bennett. The Government are fully committed to protecting the integrity of journalistic sources, and there is no intention or expectation that the tools required to be used under this power would result in a compromising of those sources. Any tools required on private communications must be accredited by Ofcom as highly accurate only in detecting child sexual abuse and exploitation content. These minimum standards of accuracy will be approved and published by the Secretary of State, following advice from Ofcom. We therefore expect it to be very unlikely that journalistic content will be falsely detected by the tools being required.
Under Clause 59, companies are obliged to report child sexual abuse material which is detected on their service to the National Crime Agency; this echoes a point made by the noble Lord, Lord Allan, in an earlier contribution. That would include child sexual abuse and exploitation material identified through tools required by a notice and, even in this event, the appropriate protections in relation to journalistic sources would be applied by the National Crime Agency if it were necessary to identify individuals involved in sharing illegal material.
I want to flag that in the context of terrorist content, this is quite high risk for journalists. It is quite common for them, for example, to be circulating a horrific ISIS video not because they support ISIS but because it is part of a news article they are putting together. We should flag that terrorist content in particular is commonly distributed by journalists and it could be picked up by any system that is not sufficiently sophisticated.
I see that my noble friend Lord Murray of Blidworth has joined the Front Bench in anticipation of the lunch-break business for the Home Office. That gives me the opportunity to say that I will discuss some of these points with him, my noble friend Lord Sharpe of Epsom and others at the Home Office.
Amendment 246 aims to ensure that there is no requirement for a provider to comply with a notice until the High Court has determined the appeal. The Government have ensured that, in addition to judicial review through the High Court, there is an accessible and relatively affordable alternative means of appealing Ofcom’s decisions via the Upper Tribunal. We cannot accept amendments such as this, which could unacceptably delay Ofcom’s ability to issue a notice, because that would leave children vulnerable.
To ensure that Ofcom’s use of its powers under Clause 110, and the technology that underpins it, are transparent, Ofcom will produce an annual report about the exercise of its functions using these powers. This must be submitted to the Secretary of State and laid before Parliament. The report must also provide the details of technology that has been assessed as meeting minimum standards of accuracy, and Ofcom may also consider other factors, including the impact of technologies on privacy. That will be separate to Ofcom’s annual report to allow for full scrutiny of this power.
The legislation also places a statutory requirement on Ofcom to publish guidance before its functions with regard to Clause 110 come into force. This will be after Royal Assent, given that the legislation is subject to change until that point. Before producing the guidance, Ofcom must consult the Information Commissioner. As I said, there are already strong safeguards regarding Ofcom’s use of these powers, so we think that this additional oversight is unnecessary.
Amendments 203 and 204, tabled by the noble Lord, Lord Clement-Jones, seek to probe the privacy implications of Ofcom’s powers to require technology under Clause 110. I reiterate that the Bill will not ban or weaken any design, including end-to-end encryption. But, given the scale of child sexual abuse and exploitation taking place on private communications, it is important that Ofcom has effective powers to require companies to tackle this abhorrent activity. Data from the Office for National Statistics show that in nearly three-quarters of cases where children are contacted online by someone they do not know, this takes place by private message. This highlights the scale of the threat and the importance of technology providers taking steps to safeguard children in private spaces online.
As already set out, there are already strong safeguards regarding the use of this power, and these will prevent Ofcom from requiring the use of any technology that would undermine a platform’s security and put users’ privacy at risk. These safeguards will also ensure that platforms will not be required to conduct mass scanning of private communications by default.
Until the regime comes into force, it is of course not possible to say with certainty which tools would be accredited. However, some illustrative examples of the kinds of current tools we might expect to be used—providing that they are highly accurate and compatible with a service’s design—are machine learning or artificial intelligence, which assess content to determine whether it is illegal, and hashing technology, which works by assigning a unique number to an image that has been identified as illegal.
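To illustrate the hashing approach the Minister describes—purely as a sketch, not a description of any accredited tool—the core idea can be shown in a few lines. The hash list, the function names and the use of a plain SHA-256 digest below are assumptions made for the illustration; deployed systems typically use perceptual hashing (tools in the vein of PhotoDNA, for example) so that resized or re-encoded copies of a known image still match, which a plain cryptographic hash cannot do.

```python
# Minimal sketch of hash-list matching: an image already identified as
# illegal is assigned a unique number (here a SHA-256 digest), and new
# uploads are checked against the list of known digests.
# The hash list and names are hypothetical.
import hashlib

# Hypothetical hash list; in practice this would be supplied and
# maintained by an expert body, never hard-coded like this.
KNOWN_DIGESTS = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def image_digest(image_bytes: bytes) -> str:
    # Assign a unique number to the image content.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_image(image_bytes: bytes) -> bool:
    # True only for byte-identical copies of a listed image; a perceptual
    # hash would also catch visually similar copies.
    return image_digest(image_bytes) in KNOWN_DIGESTS

# Example usage: the digest listed above is that of the bytes b"foo",
# so matches_known_image(b"foo") returns True.
```

The design point the exchange turns on is visible even in this toy form: matching against a list of previously identified material is far narrower than general scanning, which is why accuracy of accreditation matters so much.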
Given the particularly abhorrent nature of the crimes we are discussing, it is important that services giving rise to a risk of child sexual abuse and exploitation in the UK are covered, wherever they are based. The Bill, including Ofcom’s ability to issue notices in relation to this or to terrorism, will therefore have extraterritorial effect. The Bill will apply to any relevant service that is linked to the UK. A service is linked to the UK if it has a significant number of UK users, if UK users form a target market or if the service is capable of being used in the UK and there is a material risk of significant harm to individuals in the UK arising from the service. I hope that that reassures the noble Lord, on behalf of his noble friend, about why that amendment is not needed.
Amendments 209 to 214 seek to place additional requirements on Ofcom to consider the effect on user privacy when using its powers under Clause 110. I agree that tackling online harm needs to take place while protecting privacy and security online, which is why Ofcom already has to consider user privacy before issuing notices under Clause 110, among the other stringent safeguards I have set out. Amendment 202A would impose a duty on Ofcom to issue a notice under Clause 110, where it is satisfied that it is necessary and proportionate to do so—this will have involved ensuring that the safeguards have been met.
Ofcom will have access to a wide range of information and must have the discretion to decide the most appropriate course of action in any particular scenario, including where this action lies outside the powers and procedures conferred by Clause 110; for instance, an initial period of voluntary engagement. This is an in extremis power. It is essential that we balance users’ rights with the need to enable a strong response, so Ofcom must be able to assess whether any alternative, less intrusive measures would effectively reduce the level of child sexual exploitation and abuse or terrorist content occurring on a service before issuing a notice.
I hope that that provides reassurance to noble Lords on the amendments in this group, and I invite the noble Lord to withdraw Amendment 14.
My Lords, this has been a very useful debate and serves as a good appetite builder for lunch, which I understand we will be able to take shortly.
I am grateful to the Minister for his response and to all noble Lords who have taken part in the debate. As always, the noble Baroness, Lady Kidron, gave us a balanced view of digital rights—the right to privacy and to security—and the fact that we should be trying to advance these two things simultaneously. She was right again to remind us that this is a real problem and there is a lot we can do. I know she has worked on this through things such as metadata—understanding who is communicating with whom—which might strike that nice balance where we are not infringing on people’s privacy too grossly but are still able to identify those who wish harm on our society and in particular on our children.
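The metadata approach credited here—understanding who is communicating with whom, without any access to message content—can likewise be sketched. Everything below (the record shape, the notion of a list of already-identified accounts, the review threshold) is a hypothetical illustration of the general idea, not anyone's actual system:

```python
# Illustrative sketch only: flagging accounts for human review based
# solely on communication metadata (who messaged whom, how often),
# never on message content.
from collections import Counter
from typing import Iterable, NamedTuple

class MessageRecord(NamedTuple):
    sender: str
    recipient: str
    timestamp: float  # seconds since the epoch; kept for realism

def flag_for_review(records: Iterable[MessageRecord],
                    suspect_accounts: set[str],
                    threshold: int = 3) -> set[str]:
    """Flag senders who repeatedly contact already-identified suspect
    accounts. The threshold of 3 is arbitrary and purely illustrative."""
    contact_counts: Counter[str] = Counter()
    for rec in records:
        if rec.recipient in suspect_accounts:
            contact_counts[rec.sender] += 1
    return {sender for sender, n in contact_counts.items() if n >= threshold}
```

The trade-off the noble Baroness identifies is exactly the one this sketch embodies: such analysis never reads a message, yet it still processes personal data, so it is less intrusive than content scanning rather than non-intrusive.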
The noble Baroness, Lady Bennett, was right to pick up this tension between everything, everywhere, all at once and targeted surveillance. Again, that is really interesting to tease out. I am personally quite comfortable with quite intrusive targeted surveillance. I do not know whether noble Lords have been reading the Pegasus spyware stories: I am not comfortable with some Governments placing such spyware on the phones of human rights defenders but I would be much more relaxed about the British authorities placing something similar on the phones of people who are going to plant bombs in Manchester. We need to be really honest about where we are drawing our red lines if we want to go in the direction of targeted surveillance.
The noble Lord, Lord Moylan, was right again to remind us about the importance of private conversations. I cited the example of police officers whose conversations have been exposed. Although it is hard, we should remember that if ordinary citizens want to exchange horrible racist jokes with each other and so on in private groups that is not a matter for the state, but it is when it is somebody in a position of public authority; we have a right to intervene there. Again, we have to remember that as long as it is not illegal people can say horrible things in private, and we should not encourage any situation where we suggest that the state would interfere unless there are legitimate grounds—for example, it is a police officer or somebody is doing something that crosses the line of legality.
The noble Baroness, Lady Fox, reminded us that it is either encrypted or it is not. That is really helpful, as things cannot be half encrypted. If a service provider makes a commitment, it is critical that it is truthful. That is what our privacy law tells us. If I say, “This service is encrypted between you and the person you send the message to”, and I know that there is somebody in between who could access it, I am lying. I cannot say it is a private service unless it is truly private. We have to bear that in mind. Historically, people might have been more comfortable with fudging it, but not in 2023, when we have this raft of privacy legislation.
The noble Baroness is also right to remind us that privacy can be safety. There is almost nothing more devastating than the leaking of intimate images. When services such as iCloud move to encrypted storage that dramatically reduces the risk that somebody will get access to your intimate images if you store them there, which you are legally entitled to do. Privacy can be a critical part of an individual maintaining their own security and we should not lose that.
The noble Baroness, Lady Stowell, was right again to talk about general monitoring. I am pleased that she found the WhatsApp briefing useful. I was unable to attend but I know from previous contact that there are people doing good work and it is sad that that often does not come out. We end up with this very polarised debate, which my noble friend Lord McNally was right to remind us is unhelpful. The people south of the river are often working very closely in the public interest with people in tech companies. Public rhetoric tends to focus on why more is not being done; there are very few thanks for what is being done. I would like to see the debate move a little more in that direction.
The noble Lord, Lord Knight, opened up a whole new world of pain with VPNs, which I am sure we will come back to. I say simply that if we get the regulatory frameworks right, most people in Britain will continue to use mainstream services as long as they are allowed to be offered. If those services are regulated by the European Union under its Digital Services Act, and in the UK and the US in a similar way, they will in effect be held to global standards, so it will not matter where you VPN from. The scenario the noble Lord painted, which I worry about, is where those mainstream services are not available and we drive people into small, new services that are not regulated by anyone. We would then end up inadvertently driving people back to the wild west that we complain about, when most of them would prefer to use mainstream services that are properly regulated by Ofcom, the European Commission and the US authorities.
(1 year, 7 months ago)
Lords Chamber
My Lords, I am grateful for this short and focused debate, which has been helpful, and for the points made by the noble Lords, Lord Stevenson and Lord Allan, and the noble Baroness, Lady Kidron. I think we all share the same objective: ensuring that terms of service promote accountability and transparency, and empower users.
One of the Bill’s key objectives is to ensure that the terms of service of user-to-user platforms are suitable and effective. Under the Bill, companies will be required both to set out clearly how they will tackle illegal content and protect children and to ensure that their terms of service are properly enforced. The additional transparency and accountability duties on category 1 services will further ensure that users know what to expect on the largest platforms. This will put an end to these services arbitrarily removing content or, conversely, failing to remove content that they profess to prohibit.
The Bill will also ensure that search services are clear to their users about how they are complying with their adult and child safety duties under this new law. Given the very different way in which search services operate, however, this will be achieved through a publicly available statement rather than through terms of service. The two are intended to be distinct.
Noble Lords are right to point to the question of intelligibility. It struck me that, if it takes 10 days to read terms of service, perhaps we should have a race during the 10 days allotted to this Committee stage to see which is quicker—but I take the point. The noble Lord, Lord Allan, is also right that the further requirements imposed through this Bill will only add to that.
The noble Baroness, Lady Kidron, asked a fair question about what “accessibility” means. The Bill requires all platforms’ terms of service for illegal content and child safety duties to be clear and accessible. Ofcom will provide guidance on what that means, including ensuring that they are suitably prominent. The same applies to terms of service for category 1 services relating to content moderation.
I will focus first on Amendments 16, 21, 66DA, 75 and 197, which seek to ensure that both Ofcom and platforms consider the risks associated with platforms’ terms of service with regard to the illegal content and child safety duties in the Bill. We do not think that these amendments are needed. User-to-user services will already be required to assess the risks regarding their terms of service for illegal content. Clause 8 requires companies to assess the “design and operation” of a service in relation to illegal content. As terms of service are integral to how a service operates, they would be covered by this provision. Similarly, Clause 10 sets out that companies likely to be accessed by children will be required to assess the “design and operation” of a service as part of their child risk assessments, which would include the extent to which their terms of service may reduce or increase the risk of harm to children.
In addition to those risk assessment duties, the safety duties will require companies to take proportionate measures effectively to manage and mitigate the risk of harm to people whom they have identified through risk assessments. This will include making changes to their terms of service, if appropriate. The Bill does not impose duties on search services relating to terms of service, as search services’ terms of service play a less important role in determining how users can engage on a platform. I will explain this point further when responding to specific amendments relating to search services but I can assure the noble Lord, Lord Stevenson, that search services will have comprehensive duties to understand and mitigate how the design and operation of their service affects risk.
Amendment 197 would require Ofcom to assess how platforms’ terms of service affect the risk of harm that the sector presents to people. While I agree that this is an important risk factor which Ofcom must consider, it is already provided for in Clause 89, which requires Ofcom to undertake an assessment of risk across regulated services. That requires Ofcom to consider which characteristics of regulated services give rise to harm. Given how integral terms of service are to how many technology companies function, Ofcom will necessarily consider the risk associated with terms of service when undertaking that risk assessment.
However, elevating terms of service above the other systems and processes mentioned in Clause 89 would imply that Ofcom needs to take more account of the risk of harm arising from terms of service than of the risk arising from other safety-by-design systems and processes, or from content moderation processes, for instance. That may not be suitable, particularly as service delivery methods will inevitably change over time. Instead, Clause 89 has been written to give Ofcom scope to organise its risk assessment, risk register and risk profiles as it thinks suitable. That is appropriate, given that it is best placed to develop detailed knowledge of the matters in question as they evolve over time.
Amendments 70, 71, 72, 79, 80, 81, 174 and 302 seek to replace the Bill’s references to publicly available statements, in relation to search services, with terms of service. This would mean that search services would have to publish how they are complying with their illegal content and child protection duties in terms of service rather than in publicly available statements. I appreciate the spirit in which the noble Lord has tabled and introduced these amendments. However, they do not consider the very different ways in which search services operate.
User-to-user services’ terms of service fulfil a very specific purpose. They govern a user’s behaviour on the service and set rules on what a user is allowed to post and how they can interact with others. If a user breaks these terms, a service can block his or her access or remove his or her content. Under the status quo, users have very few mechanisms by which to hold user-to-user platforms accountable to these terms, meaning that users can arbitrarily see their content removed with few or no avenues for redress. Equally, a user may choose to use a service because its terms and conditions lead them to believe that certain types of content are prohibited while in practice the company does not enforce the relevant terms.
The Bill’s duties relating to user-to-user services’ terms of service seek to redress this imbalance. They will ensure that people know what to expect on a platform and enable them to hold platforms accountable. In contrast, users of search services do not create content or interact with other users. Users can search for anything without restriction from the search service provider, although a search term may not always return results. It is therefore not necessary to provide detailed information on what a user can and cannot do on a search service. The existing duties on such services will ensure that search engines are clear to users about how they are complying with their safety duties. The Bill will require search services to set out how they are fulfilling them, in publicly available statements. Their actions must meet the standards set by Ofcom. Using these statements will ensure that search services are as transparent as user-to-user services about how they are complying with their safety duties.
The noble Lord’s Amendment 174 also seeks to expand the transparency reporting requirements to cover the scope and application of the terms of service set out by search service providers. This too is unnecessary because, via Schedule 8, the Bill already ensures transparency about the scope and application of the provisions that search services must make publicly available. I hope that gives the noble Lord some reassurance that the concerns he has raised are already covered. With that, I invite him to withdraw Amendment 16.
My Lords, I am very grateful to the Minister for that very detailed response, which I will have to read very carefully because it was quite complicated. That is the answer to my question: terms of service will not be very easy to identify because, to answer my questions, he has had to pray in aid issues that Ofcom will necessarily have to assess—terms of service—to get at whether the companies are performing the duties that the Bill requires of them.
I will not go further on that. We know that there will be enough there to answer the main questions I had about this. I take the point about search being distinctively different in this area, although a tidy mind like mine likes to see all these things in one place and understand all the words. Every time I see “publicly available statement”, I do not know why but I think about people being hanged in public rather than a term of service or a contract.
My Lords, we seem to have done it again—a very long list of amendments in a rather ill-conceived group has generated a very interesting discussion. We are getting quite good at this, exchanging views across the table, across the Committee, even within the Benches—Members who perhaps have not often talked together are sharing ideas and thoughts, and that is a wonderful feeling.
I want to start with an apology. I think I may be the person who got the noble Baroness, Lady Kidron, shopped by the former leader—once a leader, always a leader. What I thought I was being asked was whether the Committee would be interested in hearing the views of the noble Viscount who could not be present, and I was very keen, because when he does speak it is from a point of view that we do not often hear. I did not know that it was a transgression of the rules—but of course it is not, really, because we got round it. Nevertheless, I apologise for anything that might have upset the noble Baroness’s blood pressure—it did not stop her making a very good contribution later.
We have covered so much ground that I do not want to try and summarise it in one piece, because you cannot do that. The problem with the group as it stands is that the right reverend Prelate the Bishop of Derby and myself must have some secret connection, because we managed to put down almost the same amendments. They were on issues that then got overtaken by the Minister, who finally got round to—I mean, who put down a nice series of amendments which exactly covered the points we made, so we can lose all those. But this did not stop the right reverend Prelate the Bishop of Guildford making some very good additional points which I think we all benefited from.
I welcome back the noble Baroness, Lady Buscombe, after her illness; she gave us a glimpse of what is to come from her and her colleagues, but I will leave the particular issue that she raised for the Minister to respond to. It raises an issue that I am not competent on, but it is a very important one—we need to get the right balance between what is causing the alarm and difficulty outside in relation to what is happening on the internet, and I think we all agree with her that we should not put any barrier in the way of dealing with that.
Indeed, that was the theme of a number of the points that have been raised on the question of what is or can constitute illegal content, and how we judge it. It is useful to hear again from the master about how you do it in practice. I cannot imagine being in a room of French lawyers and experts and retaining my sanity, let alone making decisions that affect the ability of people to carry on, but the noble Lord did it; he is still here and lives to tell the tale—bearded or otherwise.
The later amendments, particularly from the noble Lord, Lord Clement-Jones, are taking us round in a circle towards the process by which Ofcom will exercise the powers that it is going to get in this area. These are probably worth another debate on their own, and maybe it will come up in a different form, because—I think the noble Baroness, Lady Stowell, made this point as well—there is a problem in having an independent regulator that is also the go-to source of advice on decisions that are for others to make, and on which it must ultimately rule if they go wrong. That is a complicated way of saying that we may be overloading Ofcom if we also expect it to provide a reservoir of advice on how you deal with the issues that the Bill puts firmly on the companies—I agree that this is a problem that we do not really have an answer to.
My amendments were largely overtaken by the Government’s amendments, but the main one I want to talk about was Amendment 272. I am sorry that the noble Baroness, Lady Morgan, is not here, because her expertise is in an area that I want to talk about: fraud—cyber fraud in particular—and how that is going to be brought into the Bill. The issue, which I think was raised by Which? but which a number of other people have also written to us about, is that the Bill, in Clauses 170 and 171, tries to establish how a platform should identify illegal content in relation to fraud—but it is quite prescriptive. In particular, it goes into some detail which I will leave for the Minister to respond to, but uniquely it sets out a specific way of gathering information to determine whether content is illegal in this area, although it may have applicability in other areas.
One of the points that have to be taken into account is whether the platform is using human moderators, automated systems or a combination of the two. I am not quite sure why that is there in the Bill; that is really the basis for the tabling of our amendments. Clearly, one would hope that the end result is whether or not illegality has taken place, not how that information has been gathered. If one must make concessions to the process of law because a judgment is made that, because it is automated, it is in some way not as valid as if it had been done by a human moderator, there seems to be a whole world there that we should not be going into. I certainly hope that that is not going to be the case if we are talking about illegality concerning children or other vulnerable people, but that is how the Bill reads at present; I wonder whether the Minister can comment on that.
There is a risk of consumers being harmed here. The figures on fraud in the United Kingdom are extraordinary; the fact that it is not the top priority for everybody, let alone the Government, is extraordinary. It is something like the equivalent of consumers being scammed at the rate of around £7.5 billion per year. A number of awful types of scamming have emerged only because of the internet and social media. They create huge problems of anxiety and emotional distress, with lots of medical care and other things tied in if you want to work out the total bill. So we have a real problem here that we need to settle. It is great that it is in the Bill, but it would be a pity if the movement towards trying to resolve it is in any way infringed on by there being imperfect instructions in the Bill. I wonder whether the Minister would be prepared to respond to that; I would be happy to discuss it with him later, if that is possible.
As a whole, this is an interesting question as we move away from what a crime is towards how people judge how to deal with what they think is a crime but may not be. The noble Lord, Lord Allan, commented on how to do it in practice but one hopes that any initial problems will be overcome as we move forward and people become more experienced with this.
When the Joint Committee considered this issue, we spent a long time talking about why we were concerned about having certainty on the legal prescription in the Bill; that is why we were very much against the idea of “legal but harmful”, which seemed too subjective and too subject to difficulties. Out of that came another thought, which answers the point made by the noble Baroness, Lady Stowell: so much of this is about fine judgments on things that are set in stone and that you can work to, but that you then have to interpret.
There is a role for Parliament here, I think; we will come on to this in later amendments but, if there is a debate to be had on this, let us not forget the points that have been made here today. If we are going to think again about Ofcom’s activity in practice, that is the sort of thing that either a Joint Committee or Select Committees of the two Houses could easily take on board as an issue that needs to be reflected on, with advice given to Parliament about how it might be taken forward. This might be the answer in the medium term.
In the short term, let us work to the Bill and make sure that it works. Let us learn from the experience but let us then take time out to reflect on it; that would be my recommendation but, obviously, that will be subject to the situation after we finish the Bill. I look forward to hearing the Minister’s response.
My Lords, as well as throwing up some interesting questions of law, this debate has provoked some interesting tongue-twisters. The noble Lord, Lord Allan of Hallam, offered a prize to the first person to pronounce the Netzwerkdurchsetzungsgesetz; I shall claim my prize in our debate on a later group when inviting him to withdraw his amendment.
I thank the noble Lord.
I was pleased to hear about Wicipedia Cymraeg—there being no “k” in Welsh. As the noble Lord, Lord Stevenson, said, there has been a very good conversational discussion in this debate, as befits Committee and a self-regulating House. My noble friend Lady Stowell is right to point out matters of procedure, although we were grateful to know why the noble Viscount, Lord Colville, supports the amendments in question.
I take the noble Lord’s point and my noble friend’s further contribution. I will see whether I can give a clearer and more succinct description in writing to flesh that out, but that it is the reason that we have alighted on the words that we have.
The noble Lord, Lord Allan, also asked about jurisdiction. If an offence has been committed in the UK and viewed by a UK user, it can be treated as illegal content. That is set out in Clause 53(11), which says:
“For the purposes of determining whether content amounts to an offence, no account is to be taken of whether or not anything done in relation to the content takes place in any part of the United Kingdom”.
I hope that that bit, at least, is clearly set out to the noble Lord’s satisfaction. It looks like it may not be.
Again, I think that that is clear. I understood from the Bill that, if an American says something that would be illegal were they to be in the United Kingdom, we would still want to exclude that content. But that still leaves it open, and I just ask the question again, for confirmation. If all of the activities are outside the United Kingdom—Americans talking to each other, as it were—and a British person objects, at what point would the platform be required to restrict the content of the Americans talking to each other? Is it pre-emptively or only as and when somebody in the United Kingdom objects to it? We should flesh out that kind of practical detail before this becomes law.
If it has been committed in the UK and is viewed by a UK user, it can be treated as illegal. I will follow up on the noble Lord’s further points ahead of the next stage.
Amendment 272 explicitly provides that relevant information that is reasonably available to a provider includes information submitted by users in complaints. Providers will already need to do this when making judgments about content, as it will be both relevant and reasonably available.
My noble friend Lord Moylan returned to the question that arose on day 2 in Committee, querying the distinction between “protect” and “prevent”, and suggesting that a duty to protect would or could lead to the excessive removal of content. To be clear, the duty requires platforms to put in place proportionate systems and processes designed to prevent users encountering content. I draw my noble friend’s attention to the focus on systems and processes in that. This requires platforms to design their services to achieve the outcome of preventing users encountering such content. That could include upstream design measures, as well as content identification measures, once content appears on a service. By contrast, a duty to protect is a less stringent duty and would undermine the proactive nature of the illegal content duties for priority offences.
Before he moves on, is my noble friend going to give any advice to, for example, Welsh Wikipedia, as to how it will be able to continue, or are the concerns about smaller sites simply being brushed aside, as my noble friend explicates what the Bill already says?
I will deal with all the points in the speech. If I have not done so by the end, and if my noble friend wants to intervene again, I would be more than happy to hear further questions, either to answer now or write to him about.
Amendments 128 to 133 and 143 to 153, in the names of the right reverend Prelate the Bishop of Derby and the noble Lord, Lord Stevenson of Balmacara, seek to ensure that priority offences relating to modern slavery and human trafficking, where they victimise children, are included in Schedule 6. These amendments also seek to require technology companies to report content which relates to modern slavery and the trafficking of children—including the criminal exploitation of children—irrespective of whether it is sexual exploitation or not. As noble Lords know, the strongest provisions in the Bill relate to children’s safety, and particularly to child sexual exploitation and abuse content. These offences are captured in Schedule 6. The Bill includes a power for Ofcom to issue notices to companies requiring them to use accredited technology or to develop new technology to identify, remove and prevent users encountering such illegal content, whether communicated publicly or privately.
These amendments would give Ofcom the ability to issue such notices for modern slavery content which affects children, even when there is no child sexual exploitation or abuse involved. That would not be appropriate for a number of reasons. The power to tackle illegal content on private communications has been restricted to the identification of content relating to child sexual exploitation and abuse because of the particular risk to children posed by content which is communicated privately. Private spaces online are commonly used by networks of criminals to share illegal images—as we have heard—videos, and tips on committing these abhorrent offences. This is highly unlikely to be reported by other offenders, so it will go undetected if companies do not put in place measures to identify it. Earlier in Committee, the noble Lord, Lord Allan, suggested that those who receive it should report it, but of course, in a criminal context, a criminal recipient would not do that.
Extending this power to cover the identification of modern slavery in content which is communicated privately would be challenging to justify and could represent a disproportionate intrusion into someone’s privacy. Furthermore, modern slavery is usually identified through patterns of behaviour or by individual reporting, rather than through content alone. This reduces the impact that any proactive technology required under this power would have in tackling such content. Schedule 6 already sets out a comprehensive list of offences relating to child sexual exploitation and abuse which companies must tackle. If these offences are linked to modern slavery—for example, if a child victim of these offences has been trafficked—companies must take action. This includes reporting content which amounts to an offence under Schedule 6 to the National Crime Agency or another reporting body outside of the UK.
My noble friend Lord Moylan’s Amendment 135 seeks to remove the offence in Section 5 of the Public Order Act 1986 from the list of priority offences. His amendment would mean that platforms were not required to take proactive measures to reduce the risk of content which is threatening or abusive, and intended to cause a user harassment, alarm or distress, from appearing on their service. Instead, they would be obliged to respond only once they are made aware of the content, which would significantly reduce the impact of the Bill’s framework for tackling such threatening and abusive content. Given the severity of the harm which can be caused by that sort of content, it is right that companies tackle it. Ofcom will have to include the Public Order Act in its guidance about illegal content, as provided for in Clause 171.
Government Amendments 136A to 136C seek to strengthen the illegal content duties by adding further priority offences to Schedule 7. Amendments 136A and 136B will add human trafficking and illegal entry offences to the list of priority offences in the Bill. Crucially, this will mean that platforms will need to take proactive action against content which encourages or assists others to make dangerous, illegal crossings of the English Channel, as well as those who use social media to arrange or facilitate the travel of another person with a view to their exploitation.
The noble Lord, Lord Allan, asked whether these amendments would affect the victims of trafficking themselves. This is not about going after the victims. Amendment 136B addresses only content which seeks to help or encourage the commission of an existing immigration offence; it will have no impact on humanitarian communications. Indeed, to flesh out a bit more detail, Section 2 of the Modern Slavery Act makes it an offence to arrange or facilitate the travel of another person with a view to their exploitation, and facilitating a victim’s travel includes recruiting them. This offence largely appears online in the form of advertisements to recruit people into being exploited. Some of the steps that platforms could put in place include setting up trusted flagger programmes, signposting users to support and advice, and blocking known bad actors. Again, I point to some of the work which is already being done by social media companies to help tackle both illegal channel crossings and human trafficking.
My Lords, over the last few hours I have praised us for having developed a style of discussion and debate that is certainly relatively new and not often seen in the House, where we have tried to reach out to each other and find common ground. That was not a problem in this last group of just over an hour; I think we are united around the themes that were so brilliantly introduced in a very concise and well-balanced speech by the noble Baroness, Lady Kidron, who has been a leading and inspirational force behind this activity for so long.
Although different voices have come in at different times and asked questions that still need to be answered, I sense that we have reached a point in our thinking, if not in our actual debates, where we need a plan. I reached this point too; that was exactly my motivation in tabling Amendment 1, which was discussed on the first day. Fine as the Bill is—it is a very impressive piece of work in every way—it lacks what we need as a Parliament to convince others that we have understood the issues and have the answers to their questions about what this Government, and this country as a whole, are going to do about this tsunami of difference which has arrived, in the wake of the social media companies and search engines, in the way we do our business and live our lives these days. There is consensus, but it is slightly different to the consensus we had in earlier debates. There, we were reassuring ourselves about the issues rather than pressing the Government to change anything; we were happy that we were speaking the same language and that they were gradually coming to the same place as us as a group.
Just before we came back in after the lunch break, I happened to talk to the noble Lord, Lord Grade, who is the chair of Ofcom and listens to most of our debates and discussions when his other duties allow. I asked him what he thought about it, and he said that it was fascinating for him to see the level of expertise and knowledge growing up in the House, and that it would be a useful resource for Ofcom in the future. He was very impressed by the way in which everyone was engaging and not getting stuck in the niceties of the legislation, something he admitted he was experiencing himself. I say that softly; I do not want to embarrass him in any way, because he is an honourable man. However, the point he makes is really important.
I say to the Minister that I do not think we are very far apart on this. He knows that, because we have discussed it at some length over the last six to eight weeks. What I think he should take away from this debate is that a decision now has to be taken about whether the Government will go with the consensus view expressed here and deliberately put into the Bill a statement, repetitive perhaps, but clear and unambiguous, about the intention behind the Government’s reasons for bringing forward the Bill and behind our support for it as the Opposition and other Members of this House: that we want a safe internet for our children. The way we are going to do that is by having in place, up front and clearly in one place, the things that matter once the regulatory structure is in operation and has to deal with the world as it is: companies with business plans and business models that are at variance with what we think should be happening, and that we know are destroying the lives of people we love and the future of our country, our children, in a way that is quite unacceptable when you analyse it down to its last detail.
It is not a question of saying back to us across the Dispatch Box—I know he wants to but I hope he will not—“Everything that you have said is in the Bill; we don’t need to go down this route, we don’t need another piece of writing that says it all”. I want him to forget that and say that actually it will be worth it, because we will have written something very special for the world to look at and admire. It is probably not in its perfect form yet, but that is what the Government can do: take a rough and ready potential diamond, polish it, chamfer it, and bring it back and set it in a diadem we would all be proud to wear—Coronations excepted—so that we can say, “Look, we have done the dirty work here. We’ve been right down to the bottom and thought about it. We’ve looked at stuff that we never thought in our lives we would ever want to see and survived”.
I shake at some of the material we were shown that Molly Russell was looking at. But I never want to be in a situation where I will have to say to my children and grandchildren, “We had the chance to get this right and we relied on a wonderful piece of work called the Online Safety Act 2023; you will find it in there, but it is going to take you several weeks and a lot of mental harm and difficulty to understand what it means”.
So, let us make it right. Let us not just say, “It’ll be all right on the night”. Let us have it there. It is almost right but, as my noble friend Lord Knight said, it needs to be patched back into what is already in the Bill. Somebody needs to look at it and say, “What, out of that, will work as a statement to the world that we care about our kids in a way that will really make a difference?” I warn the Minister that, although I said at Second Reading that I wanted to see this Bill on the statute book as quickly as possible, I will not accept a situation where we do not have more on this issue.
I am grateful to all noble Lords who have spoken on this group and for the clarity with which the noble Lord, Lord Stevenson, has concluded his remarks.
Amendments 20, 74, 93 and 123, tabled by the noble Baroness, Lady Kidron, would mean a significant revision of the Bill’s approach to content that is harmful to children. They would set a new schedule of harmful content and risks to children—the four Cs—on the face of the Bill and revise the criteria for user-to-user and search services carrying out child safety risk assessments.
I start by thanking the noble Baroness publicly—I have done so privately in our discussions—for her extensive engagement with the Government on these issues over recent weeks, along with my noble friends Lord Bethell and Lady Harding of Winscombe. I apologise that it has involved the noble Baroness, Lady Harding, missing her stop on the train. A previous discussion we had also very nearly delayed her mounting a horse, so I can tell your Lordships how she has devoted hours to this—as they all have over recent weeks. I would like to acknowledge their campaigning and the work of all organisations that the noble Baroness, Lady Kidron, listed at the start of her speech, as well as the families of people such as Olly Stephens and the many others that the right reverend Prelate the Bishop of Oxford mentioned.
I also reassure your Lordships that, in developing this legislation, the Government carried out extensive research and engagement with a wide range of interested parties. That included reviewing international best practice, including the four Cs framework on the online risks of harm to children; we want this to be world-leading legislation. The Government share the objectives that all noble Lords have echoed in making sure that children are protected from harm online. I was grateful to the noble Baroness, Lady Benjamin, for echoing the remarks I made earlier in Committee on this. I am glad we are on the same page, even if we are still looking at points of detail, as we should be.
As the noble Baroness, Lady Kidron, knows, it is the Government’s considered opinion that the Bill’s provisions already deliver these objectives. I know that she remains to be convinced, but I am grateful to her for our continuing discussions on that point, and for continuing to kick the tyres on this to make sure that this is indeed legislation of which we can be proud.
It is also clear that there is broad agreement across the House that the Bill should tackle content that is harmful to children, such as content that promotes eating disorders; illegal behaviour, such as grooming; and risk factors for harm, such as the method by which content is disseminated and the frequency of alerts. I am pleased to be able to put on record that, in the Government’s opinion, the Bill as drafted already does this and reflects the principles of the four Cs framework, covering each of those: content, conduct, contact and commercial or contract risks to children.
First, it is important to understand how the Bill defines content, because that question of definition has been a source of confusion in some of the discussions hitherto. When we talk in general terms about content, we usually mean the substance of a message. The Bill, however, defines “content” for the purposes of this legislation extremely broadly in Clause 207 as
“anything communicated by means of an internet service”.
Under this definition, in essence, all user communication and activity, including recommendations by an algorithm, interactions in the metaverse, live streams, and so on, is facilitated by “content”. So, for example, unwanted and inappropriate contact from an adult to a child would be treated by the Bill as a content harm. The distinctions that the four Cs make between content, conduct and contact risks are therefore not necessary. For the purposes of the Bill, they are all content risks.
Secondly, I know that there have been concerns about whether the specific risks highlighted in the new schedule will be addressed by the Bill.
Where are the commercial harms? I cannot totally get my head around my noble friend’s definition of content. I can sort of understand how it extends to conduct and contact, but it does not sound as though it could extend to the algorithm itself that is driving the addictive behaviour that most of us are most worried about.
In that vein, will the noble Lord clarify whether that definition of content does not include paid-for content?
I was about to list the four Cs briefly in order, which will bring me on to commercial or contract risk. Perhaps I may do that and return to those points.
I know that there have been concerns about whether the specific risks highlighted in the new schedule will be addressed by the Bill. In terms of the four Cs category of content risks, there are specific duties for providers to protect children from illegal content, such as content that intentionally assists suicide, as well as content that is harmful to children, such as pornography. Regarding conduct risks, the child safety duties cover harmful conduct or activity such as online bullying or abuse and, under the illegal content safety duties, offences relating to harassment, stalking and inciting violence.
With regard to commercial or contract risks, providers specifically have to assess the risks to children from the design and operation of their service, including their business model and governance under the illegal content and child safety duties. In relation to contact risks, as part of the child safety risk assessment, providers will need specifically to assess contact risks of functionalities that enable adults to search for and contact other users, including children, in a way that was set out by my noble friend Lord Bethell. This will protect children from harms such as harassment and abuse, and, under the illegal content safety duties, all forms of child sexual exploitation and abuse, including grooming.
I agree that content, although unfathomable to the outside world, is defined as the Minister says. However, does that mean that when we see that
“primary priority content harmful to children”
will be put in regulations by the Secretary of State under Clause 54(2)—ditto Clause 54(3) and (4)—we will see those contact risks, conduct risks and commercial risks listed as primary priority, priority and non-designated harms?
I have tried to outline the Bill’s definition of content, which I think will give some reassurance that the other concerns that noble Lords have raised are covered. I will turn in a moment to address priority and primary priority content, if the noble Baroness will allow me to do that; she can then intervene again if I have not done so to her satisfaction. I want to set that out and try to keep track of all the questions which have been posed as I do so.
For now, I know there have been concerns from some noble Lords that, if functionalities are not labelled as harms in the legislation, they will not be addressed by providers, and I reassure your Lordships’ House that this is not the case. There is an important distinction between content and other risk factors such as an algorithm, which, without content, cannot cause harm to a child. That is why functionalities are not covered by the categories of primary priority and priority content which is harmful to children. The Bill sets out a comprehensive risk assessment process which will cover content or activity that poses a risk of harm to children, as well as other factors, such as functionality, which may increase the risk of harm. As such, the existing children’s risk assessment criteria already cover many of the changes proposed in this amendment. For example, the duties already require service providers to assess the risk of harm to children from their business model and governance. They also require providers to consider how a comprehensive range of functionalities affects risk, how the service is used and how the use of algorithms could increase the risks to children.
Turning to the examples of harmful content set out in the proposed new schedule, I am happy to reassure the noble Baroness and other noble Lords that the Government’s proposed list of primary priority and priority content covers a significant amount of this content. In her opening speech she asked about cumulative harm—that is, content sent many times, or content which is harmful due to the manner of its dissemination. We will look at that in detail on the next group as well, but I will respond now to the points she made earlier. The definition of harm in the Bill, under Clause 205, makes it clear that physical or psychological harm may arise from the fact or manner of dissemination of the content, not just the nature of the content: content which is not harmful per se but which, if sent to a child many times, for example by an algorithm, would meet the Bill’s threshold for content that is harmful to children. Companies will have to consider this as a fundamental part of their risk assessment, including, for example, how the dissemination of content via algorithmic recommendations may increase the risk of harm, and they will need to put in place proportionate and age-appropriate measures to manage and mitigate the risks they identify. I followed the exchanges between the noble Baronesses, Lady Kidron and Lady Fox, and I make it clear that the approach set out by the Bill will mean that companies cannot avoid tackling the kind of awful content which Molly Russell saw and the harmful algorithms which pushed that content relentlessly at her.
This point on cumulative harm was picked up by my noble friend Lord Bethell. The Bill will address cumulative risk where it is the result of a combination of high-risk functionality, such as live streaming, or rewards in service by way of payment or non-financial reward. This will initially be identified through Ofcom’s sector risk assessments, and Ofcom’s risk profiles and risk assessment guidance will reflect where a combination of risks in functionalities such as these can drive up the risk of harm to children. Service providers will have to take Ofcom’s risk profiles into account in their own risk assessments for content which is illegal or harmful to children. The actions that companies will be required to take under their risk assessment duties in the Bill, and the safety measures they will be required to put in place to manage the risks their services pose, will reflect this bigger-picture risk profile.
The amendments of the noble Baroness, Lady Kidron, would remove references to primary priority and priority harmful content to children from the child risk assessment duties, which we fear would undermine the effectiveness of the child safety duties as currently drafted. That includes the duty for user-to-user providers to prevent children encountering primary priority harms, such as pornography and content that promotes self-harm or suicide, as well as the duty to put in place age-appropriate measures to protect children from other harmful content and activity. As a result, we fear these amendments could remove the requirement for an age-appropriate approach to protecting children online and make the requirement to prevent children accessing primary priority content less clear.
The noble Baroness, Lady Kidron, asked in her opening remarks about emerging harms, which she was right to do. As noble Lords know, the Bill has been designed to respond as rapidly as possible to new and emerging harms. First, the primary priority and priority lists of content can be updated by the Secretary of State. Secondly, it is important to remember the function of non-designated content that is harmful to children in the Bill—that is, content that meets the threshold of content harmful to children but is not on the lists designated by the Government. Companies are required to understand and identify this kind of content and, crucially, report it to Ofcom. Thirdly, this will inform the actions of Ofcom itself in its review and report duties under Clause 56, under which it is required to review the incidence of harmful content and the severity of harm experienced by children as a result of it. This is not limited to content that the Government have listed as being harmful; it is intended to capture new and emerging harms. Ofcom will be required to report back to the Government with recommendations on changes to the primary priority and priority content lists.
I turn to the points that the noble Lord, Lord Knight of Weymouth, helpfully raised earlier about things that are in the amendments but not explicitly mentioned in the Bill. As he knows, the Bill has been designed to be tech-neutral, so that it is future-proof. That is why there is no explicit reference to the metaverse or virtual or augmented reality. However, the Bill will apply to service providers that enable users to share content online or interact with each other, as well as search services. That includes a broad range of services such as websites, applications, social media sites, video games and virtual reality spaces such as the metaverse; those are all captured. Any service that allows users to interact, as the metaverse does, will need to conduct a children’s access assessment and comply with the child safety duties if it is likely to be accessed by children.
Amendment 123 from the noble Baroness, Lady Kidron, seeks to amend Clause 48 to require Ofcom to create guidance for Part 3 service providers on this new schedule. For the reasons I have just set out, we do not think it would be workable to require Ofcom to produce guidance on this proposed schedule. For example, the duty requires Ofcom to provide guidance on the content, whereas the proposed schedule includes examples of risky functionality, such as the frequency and volume of recommendations.
I stress again that we are sympathetic to the aim of all these amendments. As I have set out, though, our analysis leads us to believe that the four Cs framework is simply not compatible with the existing architecture of the Bill. Fundamental concepts such as risk, harm and content would need to be reconsidered in the light of it, and that would inevitably have a knock-on effect for a large number of clauses and for the Bill’s timetable. The Bill has benefited from considerable scrutiny—pre-legislative and in many discussions over many years. The noble Baroness, Lady Kidron, has been a key part of that and of improving the Bill. The task is simply unfeasible at this stage in the Bill’s progress through Parliament and risks delaying it, as well as significantly slowing down Ofcom’s implementation of the child safety duties. We do not think that this slowing down is a risk worth taking, because we believe the Bill already achieves what is sought by these amendments.
Even so, I say to the Committee that we have listened to the noble Baroness, Lady Kidron, and others and have worked to identify changes which would further address these concerns. My noble friend Lady Harding posed a clear question: if not this, what would the Government do instead? I am pleased to say that, as a result of the discussions we have had, the Government have decided to make a significant change to the Bill. We will now place the categories of primary priority and priority content which is harmful to children on the face of the Bill, rather than leaving them to be designated in secondary legislation, so Parliament will have its say on them.
We hope that this change will reassure your Lordships that protecting children from the most harmful content is indeed the priority for the Bill. That change will be made on Report. We will continue to work closely with the noble Baroness, Lady Kidron, my noble friends and others, but I am not able to accept the amendments in the group before us today. With that, I hope that she will be willing to withdraw.
I thank all the speakers. There were some magnificent speeches and I do not really want to pick out any particular ones, but I cannot help but say that the right reverend Prelate described the world without the four Cs. For me, that is what everybody in the Box and on the Front Bench should go and listen to.
I am pleased that the Minister has said that the Government are moving in this direction. I am very grateful for that, but there are a couple of things that I have to come back on. First, I have swiftly read Clause 205’s definition of harm and I do not think it says that you do not have to reach a barrier of harm; dissemination is quite enough. There is always the problem of what the end result of the harm is. The thing that the Government are not listening to is the relationship between the risk assessment and the harm. It is about making sure that we are clear that it is the functionality that can cause harm. I think we will come back to this at another point, but that is what I beg them to listen to. Secondly, I am not entirely sure that it is correct to say that the four Cs mean that you cannot have primary priority, priority and so on. That could be within the schedule of content, so those two things are not actually mutually exclusive. I would be very happy to have a think about that.
What was not addressed in the Minister’s answer was the point made by the noble Lord, Lord Allan of Hallam, in supporting the proposal that we should have in the schedule: “This is what you’ve got to do; this is what you’ve got to look at; this is what we’re expecting of you; and this is what Parliament has delivered”. That is immensely important, and I was so grateful to the noble Lord, Lord Stevenson, for putting his marker down on this set of amendments. I am absolutely committed to working alongside him and to finding ways around this, but we need to find a way of stating it.
Ironically, that is my answer to both the noble Baronesses, Lady Ritchie and Lady Fox: we should have our arguments here and now, in this Chamber. I do not wish to leave it to the Secretary of State, whom I have great regard for, as it happens, but who knows: I have seen a lot of Secretaries of State. I do not even want to leave it to the Minister, because I have seen a lot of Ministers too—ditto Ofcom, and definitely not the tech sector. So here is the place, and we are the people, to work out the edges of this thing.
Not for the first time, my friend, the noble Baroness, Lady Harding, read out what would have been my answer to the noble Baroness, Lady Ritchie. I have gone round and round, and it is like a Marx Brothers movie: in the end, harm is defined by subsection (4)(c), but that says that harm will be defined by the Secretary of State. It goes around like that through the Bill.
I am grateful, as ever, to the noble Baroness, and I hope that has assisted the noble Lord, Lord Vaizey.
Finally—just about—I will speak to Amendment 32A, tabled in my name, about VPNs. I was grateful to the noble Baroness for her comments. In many ways, I wanted to give the Minister the opportunity to put something on the record. I understand, and he can confirm whether my understanding is correct, that the duties on the platforms to be safe apply regardless of whether a VPN has been used to access the systems and the content. One would suppose that the platforms, the publishers of content that are user-to-user businesses, will have to detect whether a VPN is being used in order to ensure that children are being protected and that a user is genuinely a child. Is that a correct interpretation of how the Bill works? If so, is it technically realistic for those platforms to detect whether someone is landing on their site via a VPN or otherwise? To my mind, the anecdote that the noble Baroness, Lady Harding, related, about Apple’s App Store algorithm pushing VPNs at those searching for porn, reinforces the need for app stores to be brought into scope, so that we can get some of that age filtering at the point of distribution, rather than just relying on the platforms.
Substantially, this group is about platforms anticipating harms, not reviewing them and then fixing them despite their business model. If we can get the platforms themselves designing for children’s safety and then working out how to make the business models work, rather than the other way around, we will have a much better place for children.
My Lords, I join in the chorus of good wishes to the bungee-jumping birthday Baroness, Lady Kidron. I know she will not have thought twice about joining us today in Committee for scrutiny of the Bill, which is testament to her dedication to the cause of the Bill and, more broadly, to protecting children online. The noble Lord, Lord Clement-Jones, is right to note that we have already had a few birthdays along the way; I hope that we get only one birthday each before the Bill is finished.
Very good—only one each, and hopefully fewer. I thank noble Lords for the points they raised in the debate on these amendments. I understand the concerns raised about how the design and operation of services can contribute to risk and harm online.
The noble Lord, Lord Russell, was right, when opening this debate, that companies are very successful indeed at devising and designing products and services that people want to use repeatedly, and I hope to reassure all noble Lords that the illegal and child safety duties in the Bill extend to how regulated services design and operate their services. Providers with services that are likely to be accessed by children will need to provide age-appropriate protections for children using their service. That includes protecting children from harmful content and activity on their service. It also includes reviewing children’s use of higher-risk features, such as live streaming or private messaging. Service providers are also specifically required to consider the design of functionalities, algorithms and other features when delivering the child safety duties imposed by the Bill.
I turn first to Amendments 23 and 76 in the name of the noble Lord, Lord Russell. These would require providers to eliminate the risk of harm to children identified in the service’s most recent children’s risk assessment, in addition to mitigating and managing those risks. The Bill will deliver robust and effective protections for children, but requiring providers to eliminate the risk of harm to children would place an unworkable duty on providers. As the noble Baroness, Lady Fox, my noble friend Lord Moylan and others have noted, it is not possible to eliminate all risk of harm to children online, just as it is not possible entirely to eliminate risk from, say, car travel, bungee jumping or playing sports. Such a duty could lead to service providers taking disproportionate measures to comply; for instance, as noble Lords raised, restricting children’s access to content that is entirely appropriate for them to see.
Does the Minister accept that that is not exactly what we were saying? We were not saying that they would have to eliminate all risk: they would have to design to eliminate risks, but we accept that other risks will apply.
It is part of the philosophical ruminations that we have had, but the point here is that elimination is not possible through design, or through any drafting of the legislation. I will come on to talk a bit more about how we seek to minimise, mitigate and manage risk, which is the focus.
Amendments 24, 31, 32, 77, 84, 85 and 295, from the noble Lord, Lord Russell, seek to ensure that providers do not focus just on content when fulfilling their duties to mitigate the impact of harm to children. The Bill already delivers on those objectives. As the noble Baroness, Lady Kidron, noted, it defines “content” very broadly in Clause 207 as
“anything communicated by means of an internet service”.
Under this definition, in essence, all communication and activity is facilitated by content.
I hope that the Minister has in his brief a response to the noble Baroness’s point about Clause 11(14), which, I must admit, comes across extraordinarily in this context. She quoted it, saying:
“The duties set out … are to be taken to extend only to content that is harmful to children where the risk of harm is presented by the nature of the content (rather than the fact of its dissemination)”.
Is not that exception absolutely at the core of what we are talking about today? It is surely therefore very difficult for the Minister to say that this applies in a very broad way, rather than purely to content.
I will come on to talk a bit about dissemination as well. If the noble Lord will allow me, he can intervene later on if I have not done that to his satisfaction.
I was about to talk about the child safety duties in Clause 11(5), which also specifies that they apply to the way that a service is designed, how it operates and how it is used, as well as to the content facilitated by it. The definition of content makes it clear that providers are responsible for mitigating harm in relation to all communications and activity on their service. Removing the reference to content would make service providers responsible for all risk of harm to children arising from the general operation of their service. That could, for instance, bring into scope external advertising campaigns, carried out by the service to promote its website, which could cause harm. This and other elements of a service’s operations are already regulated by other legislation.
I apologise for interrupting. Is that the case, and could that not be dealt with by defining harm in the way that it is intended, rather than as harm from any source whatever? It feels like a big leap that, if you take out “content”, instead of it meaning the scope of the service in its functionality and content and all the things that we have talked about for the last hour and a half, the suggestion is that it is unworkable because harm suddenly means everything. I am not sure that that is the case. Even if it is, one could find a definition of harm that would make it not the case.
Taking it out in the way that the amendment suggests throws up that risk. I am sure that it is not the intention of the noble Lord or the noble Baroness in putting it, but that is a risk of the drafting, which requires some further thought.
Clause 11(2), which is the focus of Amendments 32, 85 and 295, already means that platforms have to take robust action against content which is harmful because of the manner of its dissemination, although it would not be feasible for providers to fulfil their duties in relation to every single instance of content which is harmful only by the manner of its dissemination. The clause covers content which may not meet the definition of content which is harmful to children in isolation, but which may be harmful when targeted at children in a particular way. One example could be content discussing a mental health condition such as depression, where recommendations are made repeatedly or in an amplified manner through the use of algorithms. The nature of that content per se may not be inherently harmful to every child who encounters it but, when aggregated, it may become harmful to a child who is sent it many times over. That, of course, must be addressed, and is covered by the Bill.
Can the Minister assure us that he will take another look at this between Committee and Report? He has almost made the case for this wording to be taken out—he said that it is already covered by a whole number of different clauses in the Bill—but it is still here. There is still an exception which, if the Minister is correct, is highly misleading: it means that you have to go searching all over the Bill to find a way of attacking the algorithm, essentially, and the way that it amplifies, disseminates and so on. That is what we are trying to get to: how to address the very important issue not just of content but of the way that the algorithm operates in social media.
I do not think so, but I will certainly look at it again, and I am very happy to speak to the noble Lord as I do. My point is that it would not be workable or proportionate for a provider to prevent or protect all children from encountering every single instance of the sort of content that I have just outlined, which would be the effect of these amendments. I will happily discuss that with the noble Lord and others between now and Report.
Amendment 27, by the noble Lord, Lord Stevenson, seeks to add a duty to prevent children encountering targeted paid-for advertising. As he knows, the Bill has been designed to tackle harm facilitated through user-generated content. Some advertising, including paid-for posts by influencers, will therefore fall under the scope of the Bill. Companies will need to ensure that systems for targeting such advertising content to children, such as the use of algorithms, protect them from harmful material. Fully addressing the challenges of paid-for advertising is a wider task than is possible through the Bill alone. The Bill is designed to reduce harm on services which host user-generated content, whereas online advertising poses a different set of problems, with different actors. The Government are taking forward work in this area through the online advertising programme, which will consider the full range of actors and sector-appropriate solutions to those problems.
I understand the Minister’s response, and I accept that there is a parallel stream of work that may well address this. However, we have been waiting for the report from the group that has been looking at that for some time. Rumours—which I never listen to—say that it has been ready for some time. Can the Minister give us a timescale?
I cannot give a firm timescale today but I will seek what further information I can provide in writing. I have not seen it yet, but I know that the work continues.
Amendments 28 and 82, in the name of the noble Lord, Lord Russell, seek to remove the size and capacity of a service provider as a relevant factor when determining what is proportionate for services in meeting their child safety duties. This provision is important to ensure that the requirements in the child safety duties are appropriately tailored to the size of the provider. The Bill regulates a large number of service providers, which range from some of the biggest companies in the world to small voluntary organisations. This provision recognises that what it is proportionate to require of providers at either end of that scale will be different.
Removing this provision would risk setting a lowest common denominator. For instance, a large multinational company could argue that it is required only to take the same steps to comply as a smaller provider.
Amendment 32A from the noble Lord, Lord Knight of Weymouth, would require services to have regard to the potential use of virtual private networks and similar tools to circumvent age-restriction measures. He raised the use of VPNs earlier in this Committee when we considered privacy and encryption. As outlined then, service providers are already required to think about how safety measures could be circumvented and take steps to prevent that. This is set out clearly in the children’s risk assessment and safety duties. Under the duty at Clause 10(6)(f), all services must consider the different ways in which the service is used and the impact of such use on the level of risk. The use of VPNs is one factor that could affect risk levels. Service providers must ensure that they are effectively mitigating and managing risks that they identify, as set out in Clause 11(2). The noble Lord is correct in his interpretation of the Bill vis-à-vis VPNs.
Technical possibility is a matter for the sector—
I am grateful to the noble Lord for engaging in dialogue while I am in a sedentary position, but I had better stand up. It is relevant to this Committee whether it is technically possible for providers to fulfil the duties we are setting out for them in statute in respect of people’s ability to use workarounds and evade the regulatory system. At some point, could he give us the department’s view on whether there are currently systems that could be used —we would not expect them to be prescribed—by platforms to fulfil the duties if people are using their services via a VPN?
This is the trouble with looking at legislation that is technologically neutral and future-proofed and has to envisage risks and solutions changing in years to come. We want to impose duties that can technically be met, of course, but this is primarily a point for companies in the sector. We are happy to engage and provide further information, but it is inherently part of the challenge of identifying evolving risks.
The provision in Clause 11(16) addresses the noble Lord’s concerns about the use of VPNs in circumventing age-assurance or age-verification measures. For it to apply, providers would need to ensure that the measures they put in place are effective and that children cannot normally access their services. They would need to consider things such as how the use of VPNs affects the efficacy of age-assurance and age-verification measures. If children were routinely using VPNs to access their service, they would not be able to conclude that Clause 11(16) applies. I hope that sets out how this is covered in the Bill.
Amendments 65, 65ZA, 65AA, 89, 90, 90B, 96A, 106A, 106B, 107A, 114A, 122, 122ZA, 122ZB and 122ZC from the noble Lord, Lord Russell of Liverpool, seek to make the measures Ofcom sets out in codes of practice mandatory for all services. I should make it clear at the outset that companies must comply with the duties in the Bill. They are not optional and it is not a non-statutory regime; the duties are robust and binding. It is important that the binding legal duties on companies are decided by Parliament and set out in legislation, rather than delegated to a regulator.
Codes of practice provide clarity on how to comply with statutory duties, but should not supersede or replace them. This is true of codes in other areas, including the age-appropriate design code, which is not directly enforceable. Following up on the point from my noble friend Lady Harding of Winscombe, neither the age-appropriate design code nor the SEND code is directly enforceable. The Information Commissioner’s Office or bodies listed in the Children and Families Act must take the respective codes into account when considering whether a service has complied with its obligations as set out in law.
As with these codes, what will be directly enforceable in this Bill are the statutory duties by which all sites in scope of the legislation will need to abide. We have made it clear in the Bill that compliance with the codes will be taken as compliance with the duties. This will help small companies in particular. We must also recognise the diversity and innovative nature of this sector. Requiring compliance with prescriptive steps rather than outcomes may mean that companies do not use the most effective or efficient methods to protect children.
I reassure noble Lords that, if companies decide to take a different route to compliance, they will be required to document what their own measures are and how they amount to compliance. This will ensure that Ofcom has oversight of how companies comply with their duties. If the alternative steps that providers have taken are insufficient, they could face enforcement action. We expect Ofcom to take a particularly robust approach to companies which fail to protect their child users.
My noble friend Lord Vaizey touched on the age-appropriate design code in his remarks—
My noble friend the Minister did not address the concern I set out that the Bill’s approach will overburden Ofcom. If Ofcom has to review the suitability of each set of alternative measures, we will create an even bigger monster than we first thought.
I do not think that it will. We have provided further resource for Ofcom to take on the work that this Bill will give it; it has been very happy to engage with noble Lords to talk through how it intends to go about that work and, I am sure, would be happy to follow up on that point with my noble friend to offer her some reassurance.
Responding to the point from my noble friend Lord Vaizey, the Bill is part of the UK’s overall digital regulatory landscape, which will deliver protections for children alongside the data protection requirements for children set out in the Information Commissioner’s age-appropriate design code. Ofcom has strong existing relationships with other bodies in the regulatory sphere, including through the Digital Regulation Co-operation Forum. The Information Commissioner has been added to this Bill as a statutory consultee for Ofcom’s draft codes of practice and relevant pieces of guidance formally to provide for the ICO’s input into its areas of expertise, especially relating to privacy.
Amendment 138 from the noble Lord, Lord Russell of Liverpool, would amend the criteria for non-designated content which is harmful to children to bring into scope content whose risk of harm derives from its potential financial impact. The Bill already requires platforms to take measures to protect all users, including children, from financial crime online. All companies in scope of the Bill will need to design and operate their services to reduce the risk of users encountering content amounting to a fraud offence, as set out in the list of priority offences in Schedule 7. This amendment would expand the scope of the Bill to include broader commercial harms. These are dealt with by a separate legal framework, including the Consumer Protection from Unfair Trading Regulations. This amendment therefore risks creating regulatory overlap, which would cause confusion for business while not providing additional protections to consumers and internet users.
Amendment 261 in the name of the right reverend Prelate the Bishop of Oxford seeks to modify the existing requirements for the Secretary of State’s review into the effectiveness of the regulatory framework. The purpose of the amendment is to ensure that all aspects of a regulated service are taken into account when considering the risk of harm to users and not just content.
As we have discussed already, the Bill defines “content” very broadly and companies must look at every aspect of how their service facilitates harm associated with the spread of content. Furthermore, the review clause makes explicit reference to the systems and processes which regulated services use, so the review can already cover harm associated with, for example, the design of services.
My Lords, we too support the spirit of these amendments very much and pay tribute to the noble Lord, Lord Russell, for tabling them.
In many ways, I do not need to say very much. I think the noble Baroness, Lady Kidron, made a really powerful case, alongside the way the group was introduced in respect of the importance of these things. We do want the positivity that the noble Baroness, Lady Harding, talked about in respect of the potential and opportunity of technology for young people. We want them to have the right to freedom of expression, privacy and reliable information, and to be protected from exploitation by the media. Those happen to be direct quotes from the UN Convention on the Rights of the Child, as some of the rights they would enjoy. Amendments 30 and 105, which the noble Lord, Lord Clement-Jones, tabled—I attached my name to Amendment 30—are very much in that spirit of trying to promote well-being and trying to say that there is something positive that we want to see here.
In particular, I would like to see that in respect of Ofcom. Amendment 187 is, in some ways, the more significant amendment and the one I most want the Minister to reflect on. That is the one that applies to Ofcom: that it should have reference to the UN Convention on the Rights of the Child. I think even the noble Lord, Lord Weir, could possibly agree. I understand his thoughtful comments around whether or not it is right to encumber business with adherence to the UN convention, but Ofcom is a public body in how it carries out its duties as a regulator. There are choices for regulation. Regulation can just be about minimum standards, but it can also be about promoting something better. What we are seeking here in trying to have reference to the UN convention is for Ofcom to regulate for something more positive and better, as well as police minimum standards. On that basis, we support the amendments.
My Lords, I will start in the optimistic spirit of the debate we have just had. There are many benefits to young people from the internet: social, educational and many other ways that noble Lords have mentioned today. That is why the Government’s top priority for this legislation has always been to protect children and to ensure that they can enjoy those benefits by going online safely.
Once again, I find myself sympathetic to these amendments, but in a position of seeking to reassure your Lordships that the Bill already delivers on their objectives. Amendments 25, 78, 187 and 196 seek to add references to the United Nations Convention on the Rights of the Child and general comment 25 on children’s rights in relation to the digital environment to the duties on providers and Ofcom in the Bill.
As I have said many times before, children’s rights are at the heart of this legislation, even if the phrase itself is not mentioned in terms. The Bill already reflects the principles of the UN convention and the general comment. Clause 207, for instance, is clear that a “child” means a person under the age of 18, which is in line with the convention. All providers in scope of the Bill need to take robust steps to protect users, including children, from illegal content or activity on their services and to protect children from content which is harmful to them. They will need to ensure that children have a safe, age-appropriate experience on services designed for them.
Both Ofcom and service providers will also have duties in relation to users’ rights to freedom of expression and privacy. The safety objectives will require Ofcom to ensure that services protect children to a higher standard than adults, while also making sure that these services account for the different needs of children at different ages, among other things. Ofcom must also consult bodies with expertise in equality and human rights, including those representing the interests of children, such as the Children’s Commissioner. While the Government fully support the UN convention and its continued implementation in the UK, it would not be appropriate to place obligations on regulated services to uphold an international treaty between state parties. We agree with the reservations that were expressed by the noble Lord, Lord Weir of Ballyholme, in his speech, and by his noble friend Lady Foster.
The convention’s implementation is a matter for the Government, not for private businesses or voluntary organisations. Similarly, the general comment acts as guidance for state parties and it would not be appropriate to refer to that in relation to private entities. The general comment is not binding and it is for individual states to determine how to implement the convention. I hope that the noble Lord, Lord Russell, will feel reassured that children’s rights are baked into the Bill in more ways than a first glance may suggest, and that he will be content to withdraw his amendment.
The noble Lord, Lord Clement-Jones, in his Amendments 30 and 105, seeks to require platforms and Ofcom to consider a service’s benefits to children’s rights and well-being when considering what is proportionate to fulfil the child safety duties of the Bill. They also add children’s rights and well-being to the online safety objectives for user-to-user services. The Bill as drafted is focused on reducing the risk of harm to children precisely so that they can better enjoy the many benefits of being online. It already requires companies to take a risk-based and proportionate approach to delivering the child safety duties. Providers will need to address only content that poses a risk of harm to children, not that which is beneficial or neutral. The Bill does not require providers to exclude children or restrict access to content or services that may be beneficial for them.
Children’s rights and well-being are already a central feature of the existing safety objectives for user-to-user services in Schedule 4 to the Bill. These require Ofcom to ensure that services protect children to a higher standard than adults, while making sure that these services account for the different needs of children at different ages, among other things. On this basis, while I am sympathetic to the aims of the amendments the noble Lord has brought forward, I respectfully say that I do not think they are needed.
More pertinently, Amendment 30 could have unintended consequences. By introducing a broad balancing exercise between the harms and benefits that children may experience online, it would make it more difficult for Ofcom to follow up instances of non-compliance. For example, service providers could take less effective safety measures to protect children, arguing that, as their service is broadly beneficial to children’s well-being or rights, the extent to which they need to protect children from harm is reduced. This could mean that children are exposed to more harmful content, which would reduce the benefits of going online. I hope that this reassures the noble Lord, Lord Russell, of the work the Bill does in the areas he has highlighted, and that it explains why I cannot accept his amendments. I invite him to withdraw Amendment 25.
My Lords, I thank all noble Lords for taking part in this discussion. I thank the noble Lord, Lord Weir, although I would say to him that his third point—that, in his experience, the UNCRC is open to different interpretations by different departments—is my experience of normal government. Name me something that has not been interpreted differently by different departments, as it suits them.
Yes; the noble Baroness is right. She has pointed out in other discussions I have been party to that, for example, gaming technology that looks at the movement of the player can quite accurately work out from their musculoskeletal behaviour, I assume, the age of the gamer. So there are alternative methods. Our challenge is to ensure that if they are to be used, we will get the equivalent of age verification or better. I now hand over to the Minister.
My Lords, I think those last two comments were what are known in court as leading questions.
As the noble Baroness, Lady Ritchie of Downpatrick, said herself, some of the ground covered in this short debate was covered in previous groups, and I am conscious that we have a later grouping where we will cover it again, including some of the points that were made just now. I therefore hope that noble Lords will understand if I restrict myself at this point to Amendments 29, 83 and 103, tabled by the noble Baroness, Lady Ritchie.
These amendments seek to mandate age verification for pornographic content on a user-to-user or search service, regardless of the size and capacity of a service provider. The amendments also seek to remove the requirement on Ofcom to have regard to proportionality and technical feasibility when setting out measures for providers on pornographic content in codes of practice. While keeping children safe online is the top priority for the Online Safety Bill, the principle of proportionate, risk-based regulation is also fundamental to the Bill’s framework. It is the Government’s considered opinion that the Bill as drafted already strikes the correct balance between these two.
The provisions in the Bill on proportionality are important to ensure that the requirements in the child safety duties are tailored to the size and capacity of providers. It is also essential that measures in codes of practice are technically feasible. This will ensure that the regulatory framework as a whole is workable for service providers and enforceable by Ofcom. I reassure your Lordships that smaller providers, or providers with less capacity, are still required to meet the child safety duties where their services pose a risk to children. They will need to put in place sufficiently stringent systems and processes that reflect the level of risk on their services, and will need to make sure that these systems and processes achieve the required outcomes of the child safety duty. Wherever in the Bill they are regulated, companies will need to take steps to ensure that they cannot offer pornographic content online to those who should not see it. Ofcom will set out in its code of practice the steps that companies in the scope of Part 3 can take to comply with their duties under the Bill, and will take a robust approach to sites that pose the greatest risk of harm to children, including sites hosting online pornography.
The passage of the Bill should be taken as a clear message to providers that they need to begin preparing for regulation now—indeed, many are. Responsible providers should already be factoring in regulatory compliance as part of their business costs. Ofcom will continue to work with providers to ensure that the transition to the new regulatory framework will be as smooth as possible.
The Government expect companies to use age-verification technologies to prevent children accessing services that pose the highest risk of harm to children, such as online pornography. The Bill will not mandate that companies use specific technologies to comply with new duties because, as noble Lords have heard me say before, what is most effective in preventing children accessing pornography today might not be equally effective in future. In addition, age verification might not always be the most appropriate or effective approach for user-to-user companies to comply with their duties. For instance, if a user-to-user service, such as a social media platform, does not allow pornography under its terms of service, measures such as strengthening content moderation and user reporting would be more appropriate and effective for protecting children than age verification. This would allow content to be better detected and taken down, instead of restricting children from seeing content which is not allowed on the service in the first place. Companies may also use another approach if it is proportionate to the findings of the child safety risk assessment and a provider's size and capacity. This is an important element to ensure that the regulatory framework remains risk-based and proportionate.
In addition, the amendments in the name of the noble Baroness, Lady Ritchie, risk inadvertently shutting children out of large swathes of the internet that are entirely appropriate for them to access. This is because it is impossible totally to eliminate the risk that a single piece of pornographic material might momentarily appear on a site, even if that site prohibits it and has effective systems in place to prevent it appearing. Her amendments would have the effect of essentially requiring every service to block children through the use of age verification.
Those are the reasons why the amendments before us are not ones that we can accept. Mindful of the fact that we will return to these issues in a future group, I invite the noble Baroness to withdraw her amendment.
My Lords, I thank all noble Lords who have participated in this wide-ranging debate, in which various issues have been raised.
The noble Baroness, Lady Benjamin, made the good point that there needs to be a level playing field between Parts 3 and 5, which I originally raised and which other noble Lords raised on Tuesday of last week. We keep coming back to this point, so I hope that the Minister will take note of it on further reflection before we reach Report. Pornography needs to be regulated on a consistent basis across the Bill.
My Lords, I wonder whether I can make a brief intervention—I am sorry to do so after the noble Lord, Lord Clement-Jones, but I want to intervene before my noble friend the Minister stands up, unless the Labour Benches are about to speak.
I have been pondering this debate and have had a couple of thoughts. Listening to the noble Lord, Lord Clement-Jones, I am reminded of something which was always very much a guiding light for me when I chaired the Charity Commission and was therefore working in a regulatory space: regulation is never an end in itself; you regulate for a reason.
I was struck by the first debate we had on day one of Committee about the purpose of the Bill. If noble Lords recall, I said in that debate that, for me, the Bill at its heart was about enhancing the accountability of the platforms and the social media businesses. I felt that the contribution from my noble friend Lady Harding was incredibly important. What we are trying to do here is to use enforcement to drive culture change, and to force the organisations not to abandon profit altogether but to shift their focus from profit-making to child safety in the way in which they go about their work. That is really important when we start to consider the whole issue of enforcement.
It struck me at the start of this discussion that we have to be clear what our general approach and mindset is about this part of our economy that we are seeking to regulate. We have to be clear about the crimes we think are being committed or the offences that need to be dealt with. We need to make sure that Ofcom has the powers to tackle those offences and that it can do so in a way that meets Parliament’s and the public’s expectations of us having legislated to make things better.
I am really asking my noble friend the Minister, when he comes to respond on this, to give us a sense of clarity on the whole question of enforcement. At the moment, it is insufficiently clear. Even if we do not get that level of clarity today, when we come back later on and look at enforcement, it is really important that we know what we are trying to tackle here.
My Lords, I will endeavour to give that clarity, but it may be clearer still if I flesh some points out in writing in addition to what I say now.
(1 year, 6 months ago)
Lords Chamber
I am very grateful to the noble Lords who have spoken on the amendments in this group, both this afternoon and last Tuesday evening. As this is a continuation of that debate, I think my noble friend Lord Moylan is technically correct still to wish the noble Baroness, Lady Kidron, a happy birthday, at least in procedural terms.
We have had a very valuable debate over both days on the Bill’s approach to holding platforms accountable to their users. Amendments 33B, 41A, 43ZA, 138A and 194A in the names of the noble Lords, Lord Lipsey and Lord McNally, and Amendment 154 in the name of the noble Lord, Lord Stevenson of Balmacara, seek to bring back the concept of legal but harmful content and related adult risk assessments. They reintroduce obligations for companies to consider the risk of harm associated with legal content accessed by adults. As noble Lords have noted, the provisions in the Bill to this effect were removed in another place, after careful consideration, to protect freedom of expression online. In particular, the Government listened to concerns that the previous legal but harmful provisions could create incentives for companies to remove legal content from their services.
In place of adult risk assessments, we introduced new duties on category 1 services to enable users themselves to understand how these platforms treat different types of content, as set out in Clauses 64 and 65. In particular, this will allow Ofcom to hold them to account when they do not follow through on their promises regarding content they say that they prohibit or to which they say that they restrict access. Major platforms already prohibit much of the content listed in Clause 12, but these terms of service are often opaque and not consistently enforced. The Bill will address and change that.
I would also like to respond to concerns raised through Amendments 41A and 43ZA, which seek to ensure that the user empowerment categories cover the most harmful categories of content to adults. I reassure noble Lords that the user empowerment list reflects input from a wide range of interested parties about the areas of greatest concern to users. Platforms already have strong commercial incentives to tackle harmful content. The major technology companies already prohibit most types of harmful and abusive content. It is clear that most users do not want to see that sort of content and most advertisers do not want their products advertised alongside it. Clause 12 sets out that providers must offer user empowerment tools with a specified list of content to the extent that it is proportionate to do so. This will be based on the size or capacity of the service as well as the likelihood that adult users will encounter the listed content. Providers will therefore need internally to assess the likelihood that users will encounter the content. If Ofcom disagrees with the assessment that a provider has made, it will have the ability to request information from providers for the purpose of assessing compliance.
Amendments 44 and 158, tabled by the right reverend Prelate the Bishop of Oxford, seek to place new duties on providers of category 1 services to produce an assessment of their compliance with the transparency, accountability, freedom of expression and user empowerment duties as set out in Clauses 12, 64 and 65 and to share their assessments with Ofcom. I am sympathetic to the aim of ensuring that Ofcom can effectively assess companies’ compliance with these duties. But these amendments would enable providers to mark their own homework when it comes to their compliance with the duties in question. The Bill has been designed to ensure that Ofcom has responsibility for assessing compliance and that it can obtain sufficient information from all regulated services to make judgments about compliance with their duties. The noble Baroness, Lady Kidron, asked about this—and I think the noble Lord, Lord Clement-Jones, is about to.
I hope the Minister will forgive me for interrupting, but would it not be much easier for Ofcom to assess compliance if a risk assessment had been carried out?
I will come on to say a bit more about how Ofcom goes about that work.
The Bill will ensure that providers have the information they need to understand whether they are in compliance with their duties under the Bill. Ofcom will set out how providers can comply in codes of practice and guidance that it publishes. That information will help providers to comply, although they can take alternative action if they wish to do so.
The right reverend Prelate’s amendments also seek to provide greater transparency to Ofcom. The Bill’s existing duties already account for this. Indeed, the transparency reporting duties set out in Schedule 8 already enable Ofcom to require category 1, 2A and 2B services to publish annual transparency reports with relevant information, including about the effectiveness of the user empowerment tools, as well as detailed information about any content that platforms prohibit or restrict, and the application of their terms of service.
Amendments 159, 160 and 218, tabled by the noble Lord, Lord Stevenson, seek to require user-to-user services to create and abide by minimum terms of service recommended by Ofcom. The Bill already sets detailed and binding requirements on companies to achieve certain outcomes. Ofcom will set out more detail in codes of practice about the steps providers can take to comply with their safety duties. Platforms’ terms of service will need to provide information to users about how they are protecting users from illegal content, and children from harmful content.
These duties, and Ofcom’s codes of practice, ensure that providers take action to protect users from illegal content and content that is harmful to children. As such, an additional duty to have adequate and appropriate terms of service, as envisaged in the amendments, is not necessary and may undermine the illegal and child safety duties.
I have previously set out why we do not agree with requiring platforms to set terms of service for legal content. In addition, it would be inappropriate to delegate this much power to Ofcom, which would in effect be able to decide what legal content adult users can and cannot see.
Amendment 155, tabled by my noble friend Lord Moylan, seeks to clarify whether and how the Bill makes the terms of service of foreign-run platforms enforceable by Ofcom. Platforms’ duties under Clause 65 apply only to the design, operation and use of the service in the United Kingdom and to UK users, as set out in Clause 65(11). Parts or versions of the service which are used in foreign jurisdictions—
On that, in an earlier reply the Minister explained that platforms already remove harmful content because it is harmful and because advertisers and users do not like it, but could he tell me what definition of “harmful” he thinks he is using? Different companies will presumably have a different interpretation of “harmful”. How will that work? It would mean that UK law will require the removal of legal speech based on a definition of harmful speech designed by who—will it be Silicon Valley executives? This is the problem: UK law is being used to implement the removal of content based on decisions that are not part of UK law but with implications for UK citizens who are doing nothing unlawful.
The noble Baroness’s point gets to the heart of the debate that we have had. I talked earlier about the commercial incentive that there is for companies to take action against harmful content that is legal which users do not want to see or advertisers do not want their products to be advertised alongside, but there is also a commercial incentive to ensure that they are upholding free speech and that there are platforms on which people can interact in a less popular manner, where advertisers that want to advertise products legally alongside that are able to do so. As with anything that involves the market, the majority has a louder voice, but there is room for innovation for companies to provide products that cater to minority tastes within the law.
My Lords, my noble friend has explained clearly how terms of service would normally work, which is that, as I said myself, a business might write its own terms of service to its own advantage but it cannot do so too egregiously or it will lose customers, and businesses may aim themselves at different customers. All this is part of normal commercial life, and that is understood. What my noble friend has not really addressed is the question of why uniquely and specifically in this case, especially given the egregious history of censorship by Silicon Valley, he has chosen to put that into statute rather than leave it as a commercial arrangement, and to make it enforceable by Ofcom. For example, when my right honourable friend David Davis was removed from YouTube for his remarks about Covid passes, it would have been Ofcom’s obligation not to vindicate his right to free speech but to cheer on YouTube and say how well it had done for its terms of service.
Our right honourable friend's content was reuploaded. This makes the point that the problem at the moment is the opacity of these terms and conditions; what platforms say they do and what they actually do are not always aligned. The Bill makes sure that users can hold them to account for the terms of service that they publish, so that people can know what to expect on platforms and have some form of redress when their experience does not match their expectations.
I was coming on to say a bit more about that after making some points about foreign jurisdictions and my noble friend’s Amendment 155. As I say, parts or versions of the service that are used in foreign jurisdictions but not in the UK are not covered by the duties in Clause 65. As such, the Bill does not require a provider to have systems and processes designed to enforce any terms of service not applicable in the UK.
In addition, the duties do not give powers to Ofcom to enforce a provider’s terms of service directly. Ofcom’s role will be focused on ensuring that platforms have systems and processes in place to enforce their own terms of service consistently rather than assessing individual pieces of content.
Requiring providers to set terms of service for specific types of content suggests that the Government view that type of content as harmful or risky. That would encourage providers to prohibit such content, which of course would have a negative impact on freedom of expression, which I am sure is not what my noble friend wants to see. Freedom of expression is essential to a democratic society. Throughout the passage of the Bill, the Government have always committed to ensuring that people can speak freely online. We are not in the business of indirectly telling companies what legal content they can and cannot allow online. Instead, the approach that we have taken will ensure that platforms are transparent and accountable to their users about what they will and will not allow on their services.
Clause 65 recognises that companies, as private entities, have the right to remove content that is legal from their services if they choose to do so. To prevent them doing so, by requiring them to balance this against other priorities, would have perverse consequences for their freedom of action and expression. It is right that people should know what to expect on platforms and that they are able to hold platforms to account when that does not happen. On that basis, I invite the noble Lords who have amendments in this group not to press them.
My Lords, in his opening remarks, the Minister referred to the fact that this debate began last Tuesday. Well, it did, in that I made a 10-minute opening speech and the noble Baroness, Lady Stowell, rather elegantly hopped out of this group of amendments; perhaps she saw what was coming.
How that made me feel is perhaps best summed up by what the noble Earl, Lord Howe, said earlier when he was justifying the business for tomorrow. He said that adjournments were never satisfactory. In that spirit, I wrote to the Leader of the House, expressing the grumbles I made in my opening remarks. He has written back in a very constructive and thoughtful way. I will not delay the Committee any longer, other than to say that I hope the Leader of the House would agree to make his reply available for other Members to read. It says some interesting things about how we manage business. It sounds like a small matter but if what happened on Tuesday had happened in other circumstances in the other place, business would probably have been delayed for at least an hour while the usual suspects picked holes in it. If the usual channels would look at this, we could avoid some car crashes in future.
I am pleased that this group of amendments has elicited such an interesting debate, with fire coming from all sides. In introducing the debate, I said that probably the only real advice I could give the Committee came from my experience of being on the pre-legislative scrutiny committee in 2003. That showed just how little we were prepared for the tsunami of new technology that was about to engulf us. My one pleasure was that we were part of forming Ofcom. I am pleased that the chairman of Ofcom, the noble Lord, Lord Grade, has assiduously sat through our debates. I suspect he is thinking that he had better hire some more lawyers.
We are trying to get this right. I have no doubt that all sides of the House want to get this legislation through in good shape and for it to play an important role. I am sure that the noble Lord, Lord Grade, never imagined that he would become a state regulator in the kind of ominous way in which the noble Baroness, Lady Fox, said it. Ofcom has done a good job and will do so in future.
There is a problem of getting definitions right. When I was at the Ministry of Justice, I once had to entertain a very distinguished American lawyer. As I usually did, I explained that I was not a lawyer. He looked at me and said, “Then I will speak very slowly”. There is a danger, particularly in this part of the Bill, of wandering into a kind of lawyer-fest. It is important that we are precise about what powers we are giving to whom. Just to chill the Minister’s soul, I remember being warned as well about Pepper v Hart. What he says at the Dispatch Box will be used to interpret what Parliament meant when it gave this or that power.
The debate we have had thus far has been fully justified in sending a few warning signals to the Minister that it is perhaps not quite right yet. It needs further work. There is a lot of good will on all sides of the House to get it right. For the moment, I beg leave to withdraw my amendment.
As ever, the noble Baroness is an important voice in bursting our bubble in the Chamber. I continue to respect her for that. It will not be perfect; there is no perfect answer to all this. I am siding with safety and caution rather than a bit of a free-for-all. Sometimes there might be overcaution and aspects of debate where the platforms, the regulator, the media, and discussion and debate in this Chamber would say, “The toggles have got it wrong”, but we just have to make a judgment about which side we are on. That is what I am looking forward to hearing from the Minister.
These amendments are supported on all sides and by a long list of organisations, as listed by the noble Baroness, Lady Morgan, and the noble Lord, Lord Clement-Jones. The Minister has not conceded very much at all so far to this Committee. We have heard compelling speeches, such as those from the noble Baroness, Lady Parminter, that have reinforced my sense that he needs to give in on this when we come to Report.
I will also speak to my Amendment 38A. I pay tribute to John Penrose MP, who was mentioned by the noble Baroness, Lady Harding, and his work in raising concerns about misinformation and in stimulating discussion outside the Chambers among parliamentarians and others. Following discussions with him and others in the other place, I propose that users of social media should have the option to filter out content the provenance of which cannot be authenticated.
As we know, social media platforms are often awash with content that is unverified, misleading or downright false. This can be particularly problematic when it comes to sensitive or controversial topics such as elections, health or public safety. In these instances, it can be difficult for users to know whether the information presented to them is accurate. Many noble Lords will be familiar with the deep-fake photograph of the Pope in a white puffa jacket that recently went viral, or the use of imagery for propaganda purposes following the Russian invasion of Ukraine.
The Content Authenticity Initiative has created an open industry standard for content authenticity and provenance. Right now, tools such as Adobe Photoshop allow users to turn on content credentials to securely attach provenance data to images and any edits then made to those images. That technology has now been adopted by camera manufacturers such as Leica and Nikon, so the technology is there to do some of this to help give us some reassurance.
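To illustrate the kind of check that content credentials enable, here is a minimal Python sketch in the spirit of that open standard; the manifest fields and the verify_signature helper are hypothetical stand-ins rather than any real SDK's API:

```python
from typing import Optional

def verify_signature(payload: bytes, signature: bytes, issuer: str) -> bool:
    # Stand-in: a real implementation validates a cryptographic signature
    # against the issuer's certificate chain.
    return bool(payload and signature and issuer)

def has_valid_provenance(manifest: Optional[dict]) -> bool:
    """True only if the asset carries a complete, signed provenance record."""
    if not manifest:
        return False  # no content credentials attached at all
    required = {"issuer", "capture_device", "edit_history", "signature"}
    if not required.issubset(manifest):
        return False  # manifest is missing mandatory fields
    return verify_signature(
        payload=repr(manifest["edit_history"]).encode(),
        signature=manifest["signature"],
        issuer=manifest["issuer"],
    )
```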
Amendment 38A would allow users to filter out unverified content and is designed to flag posts or articles that do not come from a reliable source or have not been independently verified by a reputable third party. Users could then choose to ignore or filter out such content, ensuring that they are exposed only to information that has been vetted and verified. This would not only help users to make more informed decisions but help to combat the spread of false information on social media platforms. By giving users the power to filter out unverified content, we can help to ensure that social media platforms are not used to spread harmful disinformation or misinformation.
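A hedged sketch of the user-side toggle the amendment envisages follows; Post, UserPrefs and the authentication check are invented names, and a real service would validate the provenance manifest cryptographically, as in the previous sketch:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Post:
    author: str
    text: str
    provenance_manifest: Optional[dict] = None

@dataclass
class UserPrefs:
    hide_unauthenticated: bool = False  # the opt-in toggle; off by default

def is_authenticated(post: Post) -> bool:
    # Placeholder: a real check would validate the manifest cryptographically.
    return post.provenance_manifest is not None

def filter_feed(feed: list[Post], prefs: UserPrefs) -> list[Post]:
    if not prefs.hide_unauthenticated:
        return feed  # default experience is unchanged
    return [p for p in feed if is_authenticated(p)]
```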
Amendments 42 and 45, in the name of my noble friend Lord Stevenson, are good and straightforward improvements to the Bill, suggesting that Clause 12(11) should include all the protected characteristics in the Equality Act 2010. I listened closely to the argument of the noble Baroness, Lady Buscombe, about listing the various characteristics, but I think it makes much more sense to do what my noble friend is suggesting and just reference the Equality Act. If we chose separately to amend the Equality Act to change protected characteristics, that change would then flow through into this Bill. That seems to be a much more sensible and pragmatic way to proceed. Similarly, in Amendment 45 my noble friend is suggesting that consideration should be given by the Secretary of State as to whether any of these tools should also be available to child users, and we have heard good arguments in favour of that when it comes to dealing with the cliff-edge effect, as described by the noble Baroness, Lady Kidron.
Lastly—and, perhaps the Committee will forgive me, slightly bizarrely—Amendment 283ZA, in the name of the noble Baroness, Lady Fox, to Clause 170(7) in Part 11 invites us to consider, among other things, whether a bot can act reasonably. Given that self-coding bots are now technically possible, I would be fascinated to hear the Minister’s view on whether an autonomous bot can have reason. I asked ChatGPT this question. For the help of the Committee, it replied:
“As an artificial intelligence language model, I do not have consciousness or emotions, and therefore, I do not have a capacity for ‘reason’ in the way that humans understand it. However, I am programmed to use advanced algorithms that allow me to process and analyze large amounts of data, recognize patterns, and provide reasoned responses based on the information available to me. This allows me to simulate a form of reasoning, but it is important to note that it is not the same as human reasoning, as I do not have subjective experiences or personal biases. Ultimately, my abilities are limited to the algorithms and data that have been programmed into my system, and I cannot generate my own subjective experiences or judgments.”
That is the view of the algorithm as to whether or not bots can have reason. I look forward to the Minister’s response.
My Lords, the Government recognise the objectives of the amendments in this group: to strengthen protections for adults online. I hope noble Lords will agree that the Bill will indeed significantly improve the safety of all adult users, particularly those who are more vulnerable.
The user empowerment content features will not be the only measures in the Bill that will protect adults. They will act as a final layer of protection, coming after the duties on illegal content and the requirement on category 1 providers to uphold their terms of service. However, as the Clause 12 duties apply to legal content, we need to tread carefully and not inadvertently restrict free expression.
Amendments 34 and 35 in the name of my noble friend Lady Morgan of Cotes and Amendments 36 and 37 in the name of the noble Lord, Lord Clement-Jones, seek to require category 1 services to have their user empowerment content features in operation by default for adult users. The Government share concerns about users who experience disproportionate levels of abuse online or those who are more susceptible to suicide, self-harm or eating disorder content, but these amendments encroach on users’ rights in two ways.
First, the amendments intend to make the decision on behalf of users about whether to have these features turned on. That is aimed especially at those who might not otherwise choose to use those features. The Government do not consider it appropriate to take that choice away from adults, who must be allowed to decide for themselves what legal content they see online. That debate was distilled in the exchange just now between the noble Lord, Lord Knight, and the noble Baroness, Lady Fox, when the noble Lord said he would err on the side of caution, even overcaution, while he characterised the other side as a free-for-all. I might say that it was erring on the side of freedom. That is the debate that we are having, and should have, when looking at these parts of the Bill.
Secondly, the amendments would amount to a government requirement to limit adults’ access to legal content. That presents real concerns about freedom of expression, which the Government cannot accept.
Does the Minister therefore think that the Government condone the current system, where we are inundated algorithmically with material that we do not want? Are the Government condoning that behaviour, in the way that he is saying they would condone a safety measure?
We will come to talk about algorithms and their risks later on. There is an important balance to strike here that we have debated, rightly, in this group. I remind noble Lords that there are a range of measures that providers can put in place—
Because of the importance of that point in relation to what the Minister is about to say, we should be clear about this point: is he ruling out the ability to prioritise the needs and requirements of those who are effectively unable to take the decisions themselves in favour of a broader consideration of freedom of expression? It would be helpful for the future of this debate to be clear on that point.
We will come in a moment to the provisions that are in the Bill to make sure that decisions can be taken by adults, including vulnerable adults, easily and clearly. If the noble Lord will allow, I will cover that point.
I was in the middle of reminding noble Lords that there are a range of measures that providers can put in place under these duties, some of which might have an impact on a user’s experience if they were required to be switched on by default. That may include, for example, restricting a user’s news feed to content from connected users, adding to the echo chamber and silos of social media, which I know many noble Lords would join me in decrying. We think it is right that that decision is for individual users to make.
The Bill sets out that the user empowerment content tools must be offered to all adult users and must be easy to access—to go to the point raised just now as well as by my noble friend Lady Harding and the noble Baroness, Lady Burt, and, as noble Lords were right to remind us, pushed by the noble Baroness, Lady Campbell of Surbiton, with whom I am pleased to say I have been able to have discussions separately from this Committee.
Providers will also be required to have clear and accessible terms of service about what tools are offered on their service and how users might take advantage of them. Ofcom will be able to require category 1 services to report on user empowerment tools in use through transparency reports. Ofcom is also bound by the Communications Act 2003 and the public sector equality duty, so it will need to take into account the ways that people with certain characteristics, including people with disabilities, may be affected when performing its duties, such as writing the codes of practice for the user empowerment duties.
I think the Minister is trying to answer the point raised by my noble friend about vulnerable adults. I am interested in the extent to which he is relying on the Equality Act duty on Ofcom then to impact the behaviour of the platforms that it is regulating in respect of how they are protecting vulnerable adults. My understanding is that the Equality Act duty will apply not to the platforms but only to Ofcom in the way that it regulates them. I am unclear how that is going to provide the protection that we want.
That is right. Platforms are not in the public sector, so the public sector equality duty does not apply to them. However, that duty applies to Ofcom, taking into account the ways in which people with certain characteristics can be affected through the codes of practice and the user empowerment duties that it is enforcing. So it suffuses the thinking there, but the duty is on Ofcom as a public sector body.
Clause 12(11), which we will come to later, lists some characteristics that are similar in approach to the protected characteristics in the Equality Act 2010. I will return to that shortly in response to points made by noble Lords.
I want to say a bit about the idea of there being a cliff edge at the age of 18. This was raised by a number of noble Lords, including the noble Lord, Lord Griffiths, my noble friends Lady Morgan and Lady Harding and the noble Baroness, Lady Kidron. The Bill’s protections recognise that, in law, people become adults when they turn 18—but it is not right to say that there are no protections for young adults. As noble Lords know, the Bill will provide a triple shield of protection, of which the user empowerment duties are the final element.
The Bill already protects young adults from illegal content and content that is prohibited in terms and conditions. As we discussed in the last group, platforms have strong commercial incentives to prohibit content that the majority of their users do not want to see. Our terms of service duties will make sure that they are transparent about and accountable for how they treat this type of content.
My Lords, what distinguishes young adults from older adults in what the Minister is saying?
In law, there is nothing. I am engaging with the point that there is no cliff edge. There are protections for people once they turn 18. People’s tastes and risk appetites may change over time, but there are protections in the Bill for people of all ages.
Surely, this is precisely the point that the noble Baroness, Lady Kidron, was making. As soon as you reach 18, there is no graduation at all. There is no accounting for vulnerable adults.
There is not this cliff edge which noble Lords have feared—that there are protections for children and then, at 18, a free-for-all. There are protections for adult users—young adults, older adults, adults of any age—through the means which I have just set out: namely, the triple shield and the illegal content provisions. I may have confused the noble Lord in my attempt to address the point. The protections are there.
There is an element of circularity to what the Minister is saying. This is precisely why we are arguing for the default option. It allows this vulnerability to be taken account of.
Perhaps it would help if the Minister wanted to just set out the difference for us. Clearly, this Committee has spent some time debating the protection for children, which has a higher bar than protection for adults. It is not possible to argue that there will be no difference at the age of 18, however effective the first two elements of the triple shield are. Maybe the Minister needs to think about coming at it from the point of view of a child becoming an adult, and talk us through what the difference will be.
Once somebody becomes an adult in law at the age of 18, they are protected through the triple shield in the Bill. The user empowerment duties are one element of this, along with the illegal content duties and the protection against content prohibited in terms and conditions and the redress through Ofcom.
The legislation delivers protection for adults in a way that preserves their choice. That is important. At the age of 18, you can choose to go into a bookshop and to encounter this content online if you want. It is not right for the Government to make decisions on behalf of adults about the legal content that they see. The Bill does not set a definition of a vulnerable adult because this would risk treating particular adults differently, or unfairly restricting their access to legal content or their ability to express themselves. There is no established basis on which to do that in relation to vulnerability.
Finally, we remain committed to introducing a new criminal offence to capture communications that intentionally encourage or assist serious self-harm, including eating disorders. This will provide another layer of protection on top of the regulatory framework for both adults and children.
I understand all of that—I think—but that is not the regime being applied to children. It is really clear that children have a safer, better experience. The difference between those experiences suddenly happening on an 18th birthday is what we are concerned about.
Before the Minister stands up—a new phrase—can he confirm that it is perfectly valid to have a choice to lift the user empowerment tool, just as it is to impose it? Choice would still be there if our amendments were accepted.
It would be, but we fear the chilling effect of having the choice imposed on people. As the noble Baroness, Lady Fox, rightly put it, one does not know what one has not encountered until one has engaged with the idea. At the age of 18, people are given the choice to decide what they encounter online. They are given the tools to ensure that they do not encounter it if they do not wish to do so. As the noble Lord has heard me say many times, the strongest protections in the Bill are for children. We have been very clear that the Bill has extra protections for people under the age of 18, and it preserves choice and freedom of expression online for adult users—young and old adults.
My noble friend Lady Buscombe asked about the list in Clause 12(11). We will keep it under constant review and may consider updating it should compelling evidence emerge. As the list covers content that is legal and designed for adults, it is right that it should be updated by primary legislation after a period of parliamentary scrutiny.
Amendments 42 and 38A, tabled by the noble Lords, Lord Stevenson of Balmacara and Lord Knight of Weymouth, respectively, seek to change the scope of user empowerment content features. Amendment 38A seeks to expand the user empowerment content features to include the restriction of content the provenance of which cannot be authenticated. Amendment 42 would apply features to content that is abusive on the basis of characteristics protected under the Equality Act 2010.
The user empowerment content list reflects areas where there is the greatest need for users to be offered choice about reducing their exposure to types of content. While I am sympathetic to the intention behind the amendments, I fear they risk unintended consequences for users' rights online. The Government's approach recognises the importance of having clear, enforceable and technically feasible duties that do not infringe users' rights to free expression. These amendments risk undermining this. For instance, Amendment 38A would require the authentication of the provenance of every piece of content present on a service. This could have severe implications for freedom of expression, given its all-encompassing scope. Faced with that burden, companies might choose not to host any content at all.
I will try to help the Minister. If the amendment has been poorly drafted, I apologise. It does not seek to require a platform to check the provenance of every piece of content, but content that is certified as having good provenance would have priority for me to be able to see it. In the Bill, I can see or not see verified users. In the same way, I could choose to see or not see verified content.
Thank you. I may be reading the noble Lord’s Amendment 38A excessively critically. I will look at it again. To try to reassure the noble Lord, the Bill already ensures that all services take steps to remove illegal manufactured or manipulated content when they become aware of it. Harmful and illegal misinformation and disinformation is covered in that way.
Amendment 42 would require providers to try to establish on a large scale what is a genuinely held belief that is more than an opinion. In response, I fear that providers would excessively apply the user empowerment features to manage that burden.
A number of noble Lords referred to the discrepancy between the list—
Several times in the Bill—but this is a clear example—the drafters have chosen to impose a different sequence of words from that which exists in statute. The obvious one here is the Equality Act, which we have touched on before. The noble Baroness, Lady Buscombe, made a number of serious points about that. Why have the Government chosen to list, separately and distinctively, the characteristics which we have also heard, through a different route, the regulator will be required to uphold in respect of the statute, while the companies will be looking to the text of the Bill, when enacted? Is that not just going to cause chaos?
The discrepancy comes from the point we touched on earlier. Ofcom, as a public body, is subject to the public sector equality duty and therefore the list set out in the Equality Act 2010. The list at Clause 12(11) relates to content which is abusive, and is therefore for providers to look at. While the Equality Act has established an understanding of characteristics which should be given special protection in law, it is not necessarily desirable to transpose those across. They too are susceptible to the point made by my noble friend Lady Buscombe about lists set out in statute. If I remember rightly, the Equality Act was part of a wash-up at the end of that Parliament, and whether Parliament debated that Bill as thoroughly as it is debating this one is a moot point.
The noble Lord made that point before, and I was going to pick him up on it. It really is not right to classify our legislation by whether it came through in a short or long period. We are spending an awfully long time on this but that is not going to make it any better. I was involved in the Equality Act, and I have the scars on my back to prove it. It is jolly good legislation and has stood the test of time. I do not think the point is answered properly by simply saying that this is a better way of doing it. The Minister said that Clause 12(11) was about abuse targets, but Clause 12(12) is about “hatred against people” and Clause 12(13) is a series of explanatory points. These provisions are all grist to the lawyers. They are not trying to clarify the way we operate this legislation, in my view, to the best benefit of those affected by it.
The content which we have added to Clause 12 is a targeted approach. It reflects input from a wide range of interested parties, with whom we have discussed this, on the areas of content that users are most concerned about. The other protected characteristics that do not appear are, for instance, somebody’s marriage or civil partnership status or whether they are pregnant. We have focused on the areas where there is the greatest need for users to be offered the choice about reducing their exposure to types of content because of the abuse they may get from it. This recognises the importance of clear, enforceable and technically feasible duties. As I said a moment ago in relation to the point made by my noble friend Lady Buscombe, we will keep it under review but it is right that these provisions be debated at length—greater length than I think the Equality Bill was, but that was long before my time in your Lordships’ House, so I defer to the noble Lord’s experience and I am grateful that we are debating them thoroughly today.
I will move now, if I may, to discuss Amendments 43 and 283ZA, tabled by the noble Baroness, Lady Fox of Buckley. Amendment 43 aims to ensure that the user empowerment content features do not capture legitimate debate and discussion, specifically relating to the characteristics set out in subsections (11) and (12). Similarly, her Amendment 283ZA aims to ensure that category 1 services apply the features to content only when they have reasonable grounds to infer that it is user empowerment content.
With regard to both amendments, I can reassure the noble Baroness that upholding users’ rights to free expression is an integral principle of the Bill and it has been accounted for in drafting these duties. We have taken steps to ensure that legitimate online discussion or criticism will not be affected, and that companies make an appropriate judgment on the nature of the content in question. We have done this by setting high thresholds for inclusion in the content categories and through further clarification in the Bill’s Explanatory Notes, which I know she has consulted as well. However, the definition here deliberately sets a high threshold. By targeting only abuse and incitement to hatred, it will avoid capturing content which is merely challenging or robust discussion on controversial topics. Further clarity on definitions will be provided by Ofcom through regulatory guidance, on which it will be required to consult. That will sit alongside Ofcom’s code of practice, which will set out the steps companies can take to fulfil their duties.
I appreciate the Minister’s comments but, as I have tried to indicate, incitement to hatred and abuse, despite people thinking they know what those words mean, is causing huge difficulty legally and in institutions throughout the land. Ofcom will have its work cut out, but it was entirely for that reason that I tabled this amendment. There needs to be an even higher threshold, and this needs to be carefully thought through.
But as I think the noble Baroness understands from that reference, this is a definition already in statute, and with which Parliament and the courts are already engaged.
The Bill’s overarching freedom of expression duties also apply to Clause 12. Subsections (4) to (7) of Clause 18 stipulate that category 1 service providers are required to assess the impact on free expression from their safety policies, including the user empowerment features. This is in addition to the duties in Clause 18(2), which requires all user-to-user services to have particular regard to the importance of protecting freedom of expression when complying with their duties. The noble Baroness’s Amendment 283ZA would require category 1 providers to make judgments on user empowerment content to a similar standard required for illegal content. That would be disproportionate. Clause 170 already specifies how providers must make judgments about whether content is of a particular kind, and therefore in scope of the user empowerment duties. This includes making their judgment based on “all relevant information”. As such, the Bill already ensures that the user empowerment content features will be applied in a proportionate way that will not undermine free speech or hinder legitimate debate online.
Amendment 45, tabled by the noble Lord, Lord Stevenson of Balmacara, would require the Secretary of State to lay a Statement before Parliament outlining whether any of the user empowerment duties should be applied to children. I recognise the significant interest that noble Lords have in applying the Clause 12 duties to children. The Bill already places comprehensive requirements on Part 3 services which children are likely to access. This includes undertaking regular risk assessments of such services, protecting children from harmful content and activity, and putting in place age-appropriate protections. If there is a risk that children will encounter harm, such as self-harm content or through unknown or unverified users contacting them, service providers will need to put in place age-appropriate safety measures. Applying the user empowerment duties to child users runs counter to the Bill's child safety objectives and may weaken the protections for children—for instance, by giving children an option to see content which is harmful to them or to engage with unknown, unverified users. While we recognise the concerns in this area, for the reasons I have set out, the Government do not agree with the need for this amendment.
I will resist the challenge of the noble Lord, Lord Knight, to talk about bots because I look forward to returning to that in discussing the amendments on future-proofing. With that, I invite noble Lords—
I noted the points made about the way information is pushed and, in particular, the speech of the right reverend Prelate. Nothing in the Government’s response has really dealt with that concern. Can the Minister say a few words about not the content but the way in which users are enveloped? On the idea that companies always act because they have a commercial imperative not to expose users to harmful material, actually, they have a commercial imperative to spread material and engage users. It is well recorded that a lot of that is in fact harmful material. Can the Minister speak a little more about the features rather than the content?
We will discuss this when it comes to the definition of content in the Bill, which covers features. I was struck by the speech by the right reverend Prelate about the difference between what people encounter online, and the analogy used by the noble Baroness, Lady Fox, about a bookshop. Social media is of a different scale and has different features which make that analogy not a clean or easy one. We will debate in other groups the accumulated threat of features such as algorithms, if the noble Baroness, Lady Kidron, will allow me to go into greater detail then, but I certainly take the points made by both the right reverend Prelate and the noble Baroness, Lady Fox, in their contributions.
My Lords, I thank my noble friend very much indeed, and thank all noble Lords who have taken part. As the noble Lord, Lord Knight, said, this has been an important debate—they are all important, of course—but I think this has really got to the heart of parts of the Bill, parts of why it has been proposed in the first place, and some choices the Government made in their drafting and the changes they have made to the Bill. The right reverend Prelate reminded us, as Bishops always do, of the bigger picture, and he was quite right to do so. There is no equality of arms, as he put it, between most of us as internet users and these enormous companies that are changing, and have changed, our society. My noble friend was right—and I was going to pick up on it too—that the bookshop example given by the noble Baroness, Lady Fox, is, I am afraid, totally misguided. I love bookshops; the point is that I can choose to walk into one or not. If I do not walk into a bookshop, I do not see the books promoting some of the content we have discussed today. If they spill out on to the street where I trip over them, I cannot ignore them. This would be even harder if I were a vulnerable person, as we are going to discuss.
Noble Lords said that this is not a debate about content or freedom of expression, but that it is about features; I think that is right. However, it is a debate about choice, as the noble Lord, Lord Clement-Jones, said. I am grateful to each of those noble Lords who supported my amendments; we have had a good debate on both sets of amendments, which are similar. But as the noble Lord, Lord Griffiths, said, some of the content we are discussing, particularly in subsection (10), relating to suicide, pro-self-harm and pro-anorexia content, has literal life or death repercussions. To those noble Lords, and those outside this House, who seem to think we should not worry and should allow a total free-for-all, I say that we are doing so, in that the Government, in choosing not to adopt such amendments, are making an active choice. I am afraid the Government are condoning the serving up of insidious, deliberately harmful and deliberately dangerous content to our society, to younger people and vulnerable adults. The Minister and the Government would be better off if they said, “That is the choice that we have made”. I find it a really troubling choice because, as many noble Lords will know, I was involved in this Bill a number of years ago—there has been a certain turnover of Culture Secretaries in the last couple of years, and I was one of them. I find the Government’s choice troubling, but it has been made. As the noble Lord, Lord Knight, said, we are treating children differently from how we are treating adults. As drafted, there is a cliff edge at the age of 18. As a society, we should say that there are vulnerabilities among adults, as we do in many walks of life; and exactly as the noble Baroness, Lady Parminter, so powerfully said, there are times when we as a House, as a Parliament, as a society and as a state, should say we want to protect people. There is an offer here in both sets of amendments—I am not precious about which ones we choose—to have that protection.
I will of course withdraw the amendment today, because that is the convention of the House, but I ask my noble friend to reflect on the strength of feeling expressed by the House on this today; I think the Whip on the Bench will report as well. I am certain we will return to this on Report, probably with a unified set of amendments. In the algorithmic debate we will return to, the Government will have to explain, in words of one syllable, to those outside this House who worry about the vulnerable they work with or look after, about the choice that the Government have made in not offering protections when they could have done, in relation to these enormously powerful platforms and the insidious content they serve up repeatedly.
(1 year, 6 months ago)
Lords Chamber
My Lords, before we continue this debate, I want to understand why we have changed the system so that we break partway through a group of amendments. I am sorry, but I think this is very poor. It is definitely a retrograde step. Why are we doing it? I have never experienced this before. I have sat here and waited for the amendment I have just spoken to. We have now had a break; it has broken the momentum of that group. It was even worse last week, because we broke for several days halfway through the debate on an amendment. This is unheard of in my memory of 25 years in this House. Can my noble friend the Minister explain who made this decision, and how this has changed?
I have not had as long in your Lordships’ House, but this is not unprecedented, in my experience. These decisions are taken by the usual channels; I will certainly feed that back through my noble friend. One of the difficulties, of course, is that because there are no speaking limits on legislation and we do not know how many people want to speak on each amendment, the length of each group can be variable, so I think this is for the easier arrangement of dinner-break business. Also, for the dietary planning of those of us who speak on every group, it is useful to have some certainty, but I do appreciate my noble friend’s point.
Okay; I thank my noble friend for his response. However, I would just say that we never would have broken like that, before 7.30 pm. I will leave it at that, but I will have a word with the usual channels.
My Lords, the range of the amendments in this group indicates the importance of the Government’s approach to user verification and non-verified user duties. The way these duties have been designed seeks to strike a careful balance between empowering adults while safeguarding privacy and anonymity.
Amendments 38, 39, 139 and 140 have been tabled by my noble friend Lord Moylan. Amendments 38 and 39 seek to remove subsections (6) and (7) of the non-verified users’ duties. These place a duty on category 1 platforms to give adult users the option of preventing non-verified users interacting with their content, reducing the likelihood that a user sees content from non-verified users. I want to be clear that these duties do not require the removal of legal content from a service and do not impinge on free speech.
In addition, there are already existing duties in the Bill to safeguard legitimate online debate. For example, category 1 services will be required to assess the impact on free expression of their safety policies, including the impact of their user empowerment tools. Removing subsections (6) and (7) of Clause 12 would undermine the Bill’s protection for adult users of category 1 services, especially the most vulnerable. It would be entirely at the service provider’s discretion to offer users the ability to minimise their exposure to anonymous and abusive users, sometimes known as trolls. In addition, instead of mandating that users verify their identity, the Bill gives adults the choice. On that basis, I am confident that the Bill already achieves the effect of Amendment 139.
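To make the shape of these tools concrete, here is a minimal, purely illustrative Python sketch of the two duties just described: an opt-in gate on interaction from non-verified users, and a reduced likelihood of seeing their content. The Bill mandates outcomes rather than any particular design, and all names here are invented:

```python
from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    verified: bool

@dataclass
class Tools:
    block_nonverified_interaction: bool = False  # opt-in, off by default
    hide_nonverified_content: bool = False       # opt-in, off by default

def may_interact(actor: Account, target_tools: Tools) -> bool:
    """Gate replies and mentions from non-verified accounts when opted in."""
    return actor.verified or not target_tools.block_nonverified_interaction

def feed_weight(author: Account, viewer_tools: Tools) -> float:
    """Reduce the likelihood of seeing non-verified content for users who
    opt in; the content itself stays on the service for everyone else."""
    if viewer_tools.hide_nonverified_content and not author.verified:
        return 0.0
    return 1.0
```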
Amendment 140 seeks to reduce the amount of personal data transacted as part of the verification process. Under subsection (3) of Clause 57, however, providers will be required to explain in their terms of service how the verification process works, empowering users to make an informed choice about whether they wish to verify their identity. In addition, the Bill does not alter the UK’s existing data protection laws, which provide people with specific rights and protections in relation to the processing of their personal data. Ofcom’s guidance in this area will reflect existing laws, ensuring that users’ data is protected where personal data is processed. I hope my noble friend will therefore be reassured that these duties reaffirm the concept of choice and uphold the importance of protecting personal data.
While I am speaking to the questions raised by my noble friend, I turn to those he asked about Wikipedia. I have nothing further to add to the comments I made previously, not least that it is impossible to pre-empt the assessments that will be made of which services fall into which category. Of course, assessments will be made at the time, based on what the services do at the time of the assessment, so if he will forgive me, I will not be drawn on particular services.
To speak in more general terms, category 1 services are those with the largest reach and the greatest influence over public discourse. The Bill sets out a clear process for determining category 1 providers, based on thresholds set by the Secretary of State in secondary legislation following advice from Ofcom. That is to ensure that the process is objective and evidence based. To deliver this advice, Ofcom will undertake research into the relationship between how quickly, easily and widely user-generated content is disseminated by that service, the number of users and functionalities it has and other relevant characteristics and factors.
Will my noble friend at least confirm what he said previously: namely, that it is the Government’s view—or at least his view—that Wikipedia will not qualify as a category 1 service? Those were the words I heard him use at the Dispatch Box.
That is my view, on the current state of play, but I cannot pre-empt an assessment made at a point in the future, particularly if services change. I stand by what I said previously, but I hope my noble friend will understand if I do not elaborate further on this, at the risk of undermining the reassurance I might have given him previously.
Amendments 40, 41, 141 and 303 have been tabled by the noble Lord, Lord Stevenson of Balmacara, and, as noble Lords have noted, I have added my name to Amendment 40. I am pleased to say that the Government are content to accept it. The noble Baroness, Lady Merron, should not minimise this, because the amendment involves splitting an infinitive, which I am normally loath to do; if that is a statement of intent, I have let it go in the spirit of consensus. Amendment 40 amends Clause 12(7) to ensure that the tools which will allow adult users to filter out content from non-verified users are effective, and I am pleased to add my name to it.
Amendment 41 seeks to ensure that users can see whether or not another user is verified. I am afraid we are not minded to accept it. While I appreciate the intent, forcing users to show whether they are verified may have unintended consequences for those who are unable to verify themselves for perfectly legitimate reasons. This risks creating a two-tier system online. Users will still be able to set a preference to reduce their interaction with non-verified users without making this change.
Amendment 141 seeks to prescribe a set of principles and standards in Ofcom’s guidance on user verification. It is, however, important that Ofcom has discretion to determine, in consultation with relevant persons, which principles will have the best outcomes for users, while ensuring compliance with the duties. Further areas of the Bill also address several issues raised in this amendment. For example, all companies in scope will have a specific legal duty to have effective user reporting and redress mechanisms.
Existing laws also ensure that Ofcom’s guidance will reflect high standards. For example, it is a general duty of Ofcom under Section 3 of the Communications Act 2003 to further the interests of consumers, including by promoting competition. This amendment would, in parts, duplicate existing duties and undermine Ofcom’s independence to set standards on areas it deems relevant after consultation with expert groups.
Amendment 303 would add a definition of user identity verification. The definition it proposes would result in users having to display their real name online if they decide to verify themselves. In answer to the noble Baroness’s question, the current requirements do not specify that users must display their real name. The amendment would have potential safety implications for vulnerable users, for example victims and survivors of domestic abuse, whistleblowers and others of whom noble Lords have given examples in their contributions. The proposed definition would also create reliance on official forms of identification. That would be contrary to the existing approach in Clause 57 which specifically sets out that verification need not require such forms of documentation.
The noble Baroness, Lady Kidron, talked about paid-for verification schemes. The user identity verification provisions were brought in to ensure that adult users of the largest services can verify their identity if they so wish. These provisions are different from the blue tick schemes and others currently in place, which focus on a user’s status rather than verifying their identity. Clause 57 specifically sets out that providers of category 1 services will be required to offer all adult users the option to verify their identity. Ofcom will provide guidance for user identity verification to assist providers in complying with these duties. In doing so, it will consult groups that represent the interests of vulnerable adult users. In setting out recommendations about user verification, Ofcom must have particular regard to ensuring that providers of category 1 services offer users a form of identity verification that is likely to be available to vulnerable adult users. Ofcom will also be subject to the public sector equality duty, so it will need to take into account the ways in which people with certain characteristics may be affected when it performs this and all its duties under the Bill.
A narrow definition of identity verification could limit the range of measures that service providers might offer their users in the future. Under the current approach, Ofcom will produce and publish guidance on identity verification after consulting those with technical expertise and groups which represent the interests of vulnerable adult users.
I am sorry to interrupt the noble Lord. Is the answer to my question that the blue tick and the current Meta system will not be considered as verification under the terms of the Bill? Is that the implication of what he said?
Yes. The blue tick is certainly not identity verification. I will write to confirm the position on Meta, but these schemes are separate and, as the example of blue ticks on Twitter shows, a changing feast. That is why I am talking in general terms about the approach, so as not to rely too much on examples that are changing even in the course of this Committee.
Government Amendment 43A stands in my name. This clarifies that “non-verified user” refers to users whether they are based in the UK or elsewhere. This ensures that, if a UK user decides he or she no longer wishes to interact with non-verified users, this will apply regardless of where they are based.
Finally, Amendment 106 in the name of my noble friend Lady Buscombe would make an addition to the online safety objectives for regulated user-to-user services. It would amend them to make it clear that one of the Bill’s objectives is to protect people from communications offences committed by anonymous users.
The Bill already imposes duties on services to tackle illegal content. Those duties apply across all areas of a service, including the way it is designed and operated. Platforms will be required to take measures—for instance, changing the design of functionalities, algorithms, and other features such as anonymity—to tackle illegal content.
Ofcom is also required to ensure that user-to-user services are designed and operated to protect people from harm, including with regard to functionalities and other features relating to the operation of their service. This will likely include the use of anonymous accounts to commit offences in the scope of the Bill. My noble friend’s amendment is therefore not needed. I hope she will be satisfied not to press it, along with the other noble Lords who have amendments in this group.
My Lords, I would like to say that that was a rewarding and fulfilling debate in which everyone heard very much what they wanted to hear from my noble friend the Minister. I am afraid I cannot say that. I think it has been one of the most frustrating debates I have been involved in since I came into your Lordships’ House. However, it gave us an opportunity to admire the loftiness of manner that the noble Lord, Lord Clement-Jones, brought to dismissing my concerns about Wikipedia—suggesting that I was really just overreading the whole thing, that I should not be too bothered with the words as they appear in the Bill because the noble Lord thinks that Wikipedia is rather a good thing, and asking why it is not happy with that as a level of assurance.
I would like to think that the Minister had dealt with the matter in the way that I hoped he would, but I do think, if I may say so, that it is vaguely irresponsible to come to the Dispatch Box and say, “I don’t think Wikipedia will qualify as a category 1 service”, and then refuse to say whether it will or will not and take refuge in the process the Bill sets up, when at least one Member of the House of Lords, and possibly a second in the shape of the noble Lord, Lord Clement-Jones, would like to know the answer to the question. I see a Minister from the business department sitting on the Front Bench with my noble friend. This is a bit like throwing a hand grenade into a business headquarters, walking away and saying, “It was nothing to do with me”. You have to imagine what the position is like for the business.
We had a very important amendment from my noble friend Lady Buscombe. I think we all sympathise with the type of abuse that she is talking about—not only its personal effects but its deliberate business effects, the deliberate attempt to destroy businesses. I say only that my reading of her Amendment 106 is that it seeks to impose on Ofcom an objective to prevent harm, essentially, arising from offences under Clauses 160 and 162 of the Bill committed by unverified or anonymous users. Surely what she would want to say is that, irrespective of verification and anonymity, one would want action taken against this sort of deliberate attempt to undermine and destroy businesses. While I have every sympathy with her amendment, I am not entirely sure that it relates to the question of anonymity and verification.
Apart from that, there were in a sense two debates going on in parallel in our deliberations. One was to do with anonymity. On that question, I think the noble Lord, Lord Clement-Jones, put the matter very well: in the end, you have to come down on one side or the other. My personal view, with some reluctance, is that I have come down on the same side as the Government, the noble Lord and others. I think we should not ban anonymity because there are costs and risks to doing so, however satisfying it would be to be able to expose and sue some of the people who say terrible and untrue things about one another on social media.
The more important debate was not about anonymity as such but about verification. We had the following questions, which I am afraid I do not think were satisfactorily answered. What is verification? What does it mean? Can we define what verification is? Is it too expensive? Implicitly, should it be available for free? Is there an obligation for it to be free or do the paid-for services count, and what happens if they are so expensive that one cannot reasonably afford them? Is it real, in the sense that the verification processes devised by the various platforms genuinely provide verification? Various other questions like that came up but I do not think that any of them was answered.
I hate to say this as it sounds a little harsh about a Government whom I so ardently support, but the truth is that the triple shield, also referred to as a three-legged stool in our debate, was hastily cobbled together to make up for the absence of “legal but harmful”, but it is wonky; it is not working, it is full of holes and it is not fit for purpose. Whatever the Minister says today, there has to be a rethink before he comes back to discuss these matters at the next stage of the Bill. In the meantime, I beg leave to withdraw my amendment.
Lawyers—don’t you love them? How on earth are we supposed to unscramble that at this time of night? It was good to have my kinsman, the noble and learned Lord, Lord Hope, back in our debates. We were remarking only a few days ago that we had not seen enough lawyers in the House in these debates. One appears, and light appears. It is a marvellous experience.
I thank the Committee for listening to my earlier introductory remarks; I hope they helped to untangle some of the issues. The noble Lord, Lord Black, made it clear that the press are happy with what is in the current draft. There could be some changes, and we have heard a number of examples of ways in which one might either top or tail what there is.
There was one question that perhaps he could have come back on, and maybe he will, as I have raised it separately with the department before. I agree with a lot of what he said, but it applies to a lot more than just news publishers. Quality journalism more generally enhances and restores our faith in public services in so many ways. Why is it only the news? Is there a way in which we could broaden that? If there is not this time round, perhaps that is something we need to pick up later.
As the noble Lord, Lord Clement-Jones, has said, the noble Viscount, Lord Colville, made a very strong and clear case for trying to think again about what journalism does in the public realm and making sure that the Bill at least carries that forward, even if it does not deal with some of the issues that he raised.
We have had a number of other good contributions about how to capture some of the good ideas that were flying around in this debate and keep them in the foreground so that the Bill is enhanced. But I think it is time that the Minister gave us his answers.
I join noble Lords who have sent good wishes for a speedy recovery to the noble Baroness, Lady Featherstone.
Amendments 46, 47 and 64, in the name of my noble friend Lady Stowell of Beeston, seek to require platforms to assess the risk of, and set terms for, content currently set out in Clause 12. Additionally, the amendments seek to place duties on services to assess risks to freedom of expression resulting from user empowerment tools. Category 1 platforms are already required to assess the impact on free expression of their safety policies, including user empowerment tools; to keep that assessment up to date; to publish it; and to demonstrate the positive steps they have taken in response to the impact assessment in a publicly available statement.
Amendments 48 and 100, in the name of the noble Lord, Lord Stevenson, seek to introduce a stand-alone duty on category 1 services to protect freedom of expression, with an accompanying code of practice. Amendments 49, 50, 53A, 61 and 156, in the name of the noble Baroness, Lady Fox, seek to amend the Bill’s Clause 17 and Clause 18 duties and clarify duties on content of democratic importance.
All in-scope services must already consider and implement safeguards for freedom of expression when fulfilling their duties. Category 1 services will need to be clear about what content is acceptable on their services and how they will treat it, including when removing or restricting access to it, and they will have to enforce those rules consistently. In setting these terms of service, they must adopt clear policies designed to protect journalistic and democratic content. That will ensure that the most important types of content benefit from additional protections while guarding against the arbitrary removal of any content. Users will be able to access effective appeal mechanisms if content is unfairly removed. That marks a considerable improvement on the status quo.
Requiring all user-to-user services to justify why they are removing or restricting each individual piece of content, as Amendment 53A would do, would be disproportionately burdensome on companies, particularly small and medium-sized ones. It would also duplicate some of the provisions I have previously outlined. Separately, as private entities, service providers have their own freedom of expression rights. This means that platforms are free to decide what content should or should not be on their website, within the bounds of the law. The Bill should not mandate providers to carry or to remove certain types of speech or content. Accordingly, we do not think it would be appropriate to require providers to ensure that free speech is not infringed, as suggested in Amendment 48.
Why would it not be possible for us to try to define what the public interest might be, and not leave it to the platforms to do so?
I ask the noble Viscount to bear with me. I will come on to this a bit later. I do not think it is for category 1 platforms to do so.
We have introduced Clause 15 to reduce the powers that the major technology companies have over what journalism is made available to UK users. Accordingly, Clause 15 requires category 1 providers to set clear terms of service which explain how they take the importance of journalistic content into account when making their moderation decisions. These duties will not stop platforms removing journalistic content. Platforms have the flexibility to set their own journalism policies, but they must enforce them consistently. They will not be able to remove journalistic content arbitrarily. This will ensure that platforms give all users of journalism due process when making content moderation decisions. Amendment 51 would mean that, where platforms subjectively reached a decision that journalism was not conducive to the public good, they would not have to give it due process. Platforms could continue to treat important journalistic content arbitrarily where they decided that this content was not in the public interest of the UK.
In his first remarks on this group the noble Lord, Lord Stevenson, engaged with the question of how companies will identify content of democratic importance, which is content that seeks to contribute to democratic political debate in the UK at a national and local level. It will be broad enough to cover all political debates, including grass-roots campaigns and smaller parties. While platforms will have some discretion about what their policies in this area are, the policies will need to ensure that platforms are balancing the importance of protecting democratic content with their safety duties. For example, platforms will need to consider whether the public interest in seeing some types of content outweighs the potential harm it could cause. This will require companies to set out in their terms of service how they will treat different types of content and the systems and processes they have in place to protect such content.
Amendments 57 and 62, in the name of my noble friend Lord Kamall, seek to impose new duties on companies to protect a broader range of users’ rights, as well as to pay particular attention to the freedom of expression of users with protected characteristics. As previously set out, services will have duties to safeguard the freedom of expression of all users, regardless of their characteristics. Moreover, UK providers have existing duties under the Equality Act 2010 not to discriminate against people with characteristics which are protected in that Act. Given the range of rights included in Amendment 57, it is not clear what this would require from service providers in practice, and the relevance of those rights to service providers would be likely to vary.
Amendment 60, in the name of the noble Lord, Lord Clement-Jones, and Amendment 88, in the name of the noble Lord, Lord Stevenson, probe whether references to privacy law in Clauses 18 and 28 include Article 8 of the European Convention on Human Rights. That convention applies to member states which are signatories. Article 8(1) requires signatories to ensure the right to respect for private and family life, home and correspondence, subject to limited derogations that must be in accordance with the law and necessary in a democratic society. The obligations flowing from Article 8 do not apply to individuals or to private companies and it would not make sense for these obligations to be applied in this way, given that states which are signatories will need to decide under Article 8(2) which restrictions on the Article 8(1) right they need to impose. It would not be appropriate or possible for private companies to make decisions on such restrictions.
Providers will, however, need to comply with all UK statutory and common-law provisions relating to privacy, and must therefore implement safeguards for user privacy when meeting their safety duties. More broadly, Ofcom is bound by the Human Rights Act 1998 and must therefore uphold Article 8 of the European Convention on Human Rights when implementing the Bill’s regime.
It is so complicated that the Minister is almost enticing me to stand up and ask about it. Let us just get that right: the reference to the Article 8 powers exists and applies to those bodies in the UK to which such equivalent legislation applies, so that ties us into Ofcom. Companies cannot be affected by it because it is a public duty, not a private duty, but am I then allowed to walk all the way around the circle? At the end, can Ofcom look back at the companies to establish whether, in Ofcom’s eyes, its requirements in relation to its obligations under Article 8 have or have not taken place? It is a sort of transparent, backward-reflecting view rather than a proactive proposition. That seems a complicated way of saying, “Why don’t you behave in accordance with Article 8?”
Yes, Ofcom, which is bound by it through the Human Rights Act 1998, can ask those questions and make that assessment of the companies, but it would not be right for private companies to be bound by something to which it is not appropriate for companies to be signatories. Ofcom will be looking at these questions but the duty rests on it, as bound by the Human Rights Act.
It is late at night and this is slightly tedious, but in the worst of all possible circumstances, Ofcom would be looking at what happened over the last year in relation to its codes of practice and assertions about a particular company. Ofcom is then in trouble because it has not discharged its Article 8 obligations, so who gets to exercise a whip on whom? Sorry, whips are probably the wrong things to use, but you see where I am coming from. All that is left is for the Secretary of State, but probably it would effectively be Parliament, to say to Ofcom, “You’ve failed”. That does not seem a very satisfactory solution.
Platforms will be guided by Ofcom in taking measures to comply with their duties which are recommended in Ofcom’s codes, and which contain safeguards for privacy, including ones based on the European Convention on Human Rights and the rights therein. Paragraph 10(2)(b) of Schedule 4 requires Ofcom to ensure that measures, which it describes in the code of practice, are designed in light of the importance of protecting the privacy of users. Clause 42(2) and (3) provides that platforms will be treated as complying with the privacy duties set out at Clause 18(2) and Clause 28(2), if they take the recommended measures that Ofcom sets out in the codes.
It worked. In seriousness, we will both consult the record and, if the noble Lord wants more, I am very happy to set it out in writing.
Amendment 63 in the name of the noble and learned Lord, Lord Hope of Craighead, seeks to clarify that “freedom of expression” in Clause 18 refers to the
“freedom to impart ideas, opinions or information”,
as referred to in Article 10 of the European Convention on Human Rights. I think I too have been guilty of using the phrases “freedom of speech” and “freedom of expression” as though they were interchangeable. Freedom of expression, within the law, is intended to encompass all the freedom of expression rights arising from UK law, including under common law. The rights to freedom of expression under Article 10 of the European Convention on Human Rights include not only the right to impart ideas, opinions and information but also the right to receive such ideas, opinions and information. Any revised definition of freedom of expression to be included in the Bill should refer to both aspects of the Article 10 definition, given the importance for both children and adults of receiving information via the internet. We recognise the importance of clarity in relation to the duties set out in Clauses 18 and 28, and we are very grateful to the noble and learned Lord for proposing this amendment, and for the experience he brings to bear on behalf of the Constitution Committee of your Lordships’ House. The Higher Education (Freedom of Speech) Bill and the Online Safety Bill serve very different purposes, but I am happy to say that the Bill team and I will consider this amendment closely between now and Report.
Amendments 101, 102, 109, 112, 116, 121, 191 and 220, in the name of my noble friend Lord Moylan, seek to require Ofcom to have special regard to the importance of protecting freedom of expression when exercising its enforcement duties, and when drafting or amending codes of practice or guidance. Ofcom must already ensure that it protects freedom of expression when overseeing the Bill, because it is bound by the Human Rights Act, as I say. It also has specific duties to ensure that it is clear about how it is protecting freedom of expression when exercising its duties, including when developing codes of practice.
My noble friend’s Amendment 294 seeks to remove “psychological” from the definition of harm in the Bill. It is worth being clear that the definition of harm is used in the Bill as part of the illegal and child safety duties. There is no definition of harm, psychological or otherwise, with regard to adults, given that the definition of content which is harmful to adults was removed from the Bill in another place. With regard to children, I agree with the points made by the noble Baroness, Lady Kidron. It is important that psychological harm is captured in the Bill’s child safety duties, given the significant impact that such content can have on young minds.
I invite my noble friend and others not to press their amendments in this group.
(1 year, 6 months ago)
Lords Chamber
My Lords, His Majesty’s Government are committed to defending the invaluable role of our free media. We are clear that our online safety legislation must protect the vital role of the press in providing people with reliable and accurate information. That is why this Bill includes strong protections for recognised news publishers. The Bill does not impose new duties on news publishers’ content, which is exempt from the Bill’s safety duties. In addition, the Bill includes strong safeguards for news publisher content, set out in Clause 14. In order to benefit from these protections, publishers will have to meet a set of stringent criteria, set out in Clause 50.
I am aware of concerns in your Lordships’ House and another place that the definition of news publishers is too broad and that these protections could therefore create a loophole to be exploited. That is why the Government are bringing forward amendments to the definition of “recognised news publisher” to ensure that sanctioned entities cannot benefit from these protections. I will shortly explain these protections in detail but I would like to be clear that narrowing the definition any further would pose a critical risk to our commitment to self-regulation of the press. We do not want to create requirements which would in effect put Ofcom in the position of a press regulator. We believe that the criteria set out in Clause 50 are already strong, and we have taken significant care to ensure that established news publishers are captured, while limiting the opportunity for bad actors to benefit.
Government Amendments 126A and 127A propose changes to the criteria for recognised news publishers. These criteria already exclude any entity that is a proscribed organisation under the Terrorism Act 2000 or the purpose of which is to support a proscribed organisation under that Act. We are clear that sanctioned news outlets such as RT, formerly Russia Today, must not benefit from these protections either. The amendments we are tabling today will therefore tighten the recognised news publisher criteria further by excluding entities that have been designated for sanctions imposed by both His Majesty’s Government and the United Nations Security Council. I hope noble Lords will accept these amendments, in order to ensure that content from publishers which pose a security threat to this country cannot benefit from protections designed to defend a free press.
In addition, the Government have also tabled Amendments 50B, 50C, 50D, 127B, 127C and 283A, which are aimed at ensuring that the protections for news publishers in Clause 14 are workable and do not have unforeseen consequences for the operation of category 1 services. Clause 14 gives category 1 platforms a duty to notify recognised news publishers and offer a right of appeal before taking action against any of their content or accounts.
Clause 14 sets out the circumstances in which companies must offer news publishers an appeal. As drafted, it states that platforms must offer this before they take down news publisher content, before they restrict users’ access to such content or where they propose to “take any other action” in relation to publisher content. Platforms must also offer an appeal if they propose to take action against a recognised news publisher’s account by giving them a warning, suspending or banning them from using a service or in any way restricting their ability to use a service.
These amendments provide greater clarity about what constitutes “taking action” in relation to news publisher content, and therefore when category 1 services must offer an appeal. They make it clear that a platform must offer this before they take down such content, add a warning label or take any other action against content in line with any terms of service that allow or prohibit content. This will ensure that platforms are not required to offer publishers a right of appeal every time they propose to carry out routine content curation and similar routine actions. That would be unworkable for platforms and would be likely to inhibit the effectiveness of the appeal process.
As noble Lords know, the Bill has a strong focus on user empowerment and enabling users to take control of their online experience. The Government have therefore tabled amendments to Clause 52 to ensure that providers are required only to offer publishers a right of appeal in relation to their own moderation decisions, not where a user has voluntarily chosen not to view certain types of content. For example, if a user has epilepsy and has opted not to view photo-sensitive content, platforms will not be required to offer publishers a right of appeal before restricting that content for the user in question.
In addition, to ensure that the Bill maintains strong protections for children, the amendments make it clear that platforms are not required to offer news publishers an appeal before applying warning labels to content viewed by children. Platforms would, however, be in breach of the legislation if they applied warning labels to content encountered by adults without first offering news publishers an appeal. I beg to move.
My Lords, I welcome the amendments the Government have tabled, but I ask the Minister to clarify the effect of Amendment 50E. I declare an interest as chair of the Communications and Digital Select Committee, which has discussed Amendment 50E and the labelling of content for children with the news media organisations. This is a very technical issue, but from what my noble friend was just saying, it seems that content that would qualify for labelling for child protection purposes, and which therefore does not qualify for a right of appeal before the content is so labelled, is not content that would normally be encountered by adults but might happen to appeal to children. I would like to be clear that we are not giving the platforms scope for adding labels to content that they ought not to be adding labels to. That aside, as I say, I am grateful to my noble friend for these amendments.
My Lords, I am sorry; in my enthusiasm to get this day of Committee off to a swift start, I perhaps rattled through that rather quickly. On Amendment 50E, which my noble friend Lady Stowell asked about, I make clear that platforms will be in breach of their duties if, without applying the protection, they add warning labels to news publishers’ content that they know will be seen by adult users, regardless of whether that content particularly appeals to children.
As the noble Lord, Lord Clement-Jones, and others noted, we will return to some of the underlying principles later on, but the Government have laid these amendments to clarify category 1 platforms’ duties to protect recognised news publishers’ content. They take some publishers out of scope of the protections and make it clearer that category 1 platforms will have only to offer news publishers an appeal before taking punitive actions against their content.
The noble Baroness, Lady Fox, asked about how we define “recognised news publisher”. I am conscious that we will debate this more in later groups, but Clause 50 sets out a range of criteria that an organisation must meet to qualify as a recognised news publisher. These include the organisation’s “principal purpose” being the publication of news, it being subject to a “standards code” and its content being “created by different persons”. The protections for organisations are focused on publishers whose primary purpose is reporting on news and current affairs, recognising the importance of that in a democratic society. I am grateful to noble Lords for their support.
What my noble friend said is absolutely fine with me, and I thank him very much for it. It might be worth letting the noble Baroness, Lady Fox, know that Amendment 127 has now been moved to the group that the noble Lord, Lord Clement-Jones, referred to. I thought it was worth offering that comfort to the noble Baroness.
My Lords, this debate has demonstrated the diversity of opinion regarding misinformation and disinformation—as the noble Lord said, the Joint Committee gave a lot of thought to this issue—as well as the difficulty of finding the truth of very complex issues while not shutting down legitimate debate. It is therefore important that we legislate in a way that takes a balanced approach to tackling this, keeping people safe online while protecting freedom of expression.
The Government take misinformation and disinformation very seriously. From Covid-19 to Russia’s use of disinformation as a tool in its illegal invasion of Ukraine, it is a pervasive threat, and I pay tribute to the work of my noble friend Lord Bethell and his colleagues in the Department of Health and Social Care during the pandemic to counter the cynical and exploitative forces that sought to undermine the heroic effort to get people vaccinated and to escape from the clutches of Covid-19.
We recognise that misinformation and disinformation come in many forms, and the Bill reflects this. Its focus is rightly on tackling the most egregious, illegal forms of misinformation and disinformation, such as content which amounts to the foreign interference offence or which is harmful to children—for instance, that which intersects with named categories of primary priority or priority content.
That is not the only way in which the Bill seeks to tackle it, however. The new terms of service duties for category 1 services will hold companies to account over how they say they treat misinformation and disinformation on their services. However, the Government are not in the business of telling companies what legal content they can and cannot allow online, and the Bill should not and will not prevent adults accessing legal content. In addition, the Bill will establish an advisory committee on misinformation and disinformation to provide advice to Ofcom on how they should be tackled online. Ofcom will be given the tools to understand how effectively misinformation and disinformation are being addressed by platforms through transparency reports and information-gathering powers.
Amendment 52 from the noble Baroness, Lady Merron, seeks to introduce a new duty on platforms in relation to health misinformation and disinformation for adult users, while Amendments 59 and 107 from my noble friend Lord Moylan aim to introduce new proportionality duties for platforms tackling misinformation and disinformation. The Bill already addresses the most egregious types of misinformation and disinformation in a proportionate way that respects freedom of expression by focusing on misinformation and disinformation that are illegal or harmful to children.
I am curious as to what the Bill says about misinformation and disinformation in relation to children. My understanding of primary priority and priority harms is that they concern issues such as self-harm and pornography, but do they say anything specific about misinformation of the kind we have been discussing and whether children will be protected from it?
I am sorry—I am not sure I follow the noble Baroness’s question.
Twice so far in his reply, the Minister has said that this measure will protect children from misinformation and disinformation. I was just curious because I have not seen any sight of that, either in discussions or in the Bill. I was making a distinction regarding harmful content that we know the shape of—for example, pornography and self-harm, which are not, in themselves, misinformation or disinformation of the kind we are discussing now. It is news to me that children are going to be protected from this, and I am delighted, but I was just checking.
Yes, that is what the measure does—for instance, where it intersects with the named categories of primary priority or priority content in the Bill, although that is not the only way the Bill does it. This will be covered by non-designated content that is harmful to children. As we have said, we will bring forward amendments on Report—which is perhaps why the noble Baroness has not seen them in the material in front of us—regarding material harms to children, and they will provide further detail and clarity.
Returning to the advisory committee that the Bill sets up and the amendments from the noble Baroness, Lady Merron, and my noble friend Lord Moylan, all regulated service providers will be forced to take action against illegal misinformation and disinformation in scope of the Bill. That includes the new false communication offences in the Bill that will capture communications where the sender knows the information to be false but sends it intending to cause harm—for example, hoax cures for a virus such as Covid-19. The noble Baroness is right to say that that is a slightly different approach from the one taken in her amendment, but we think it an appropriate and proportionate response to tackling damaging and illegal misinformation and disinformation. If a platform is likely to be accessed by children, it will have to protect them from encountering misinformation and disinformation content that meets the Bill’s threshold for content that is harmful to children. Again, that is an appropriate and proportionate response.
Turning to the points made by my noble friend Lord Moylan and the noble Baroness, Lady Fox, services will also need to have particular regard to freedom of expression when complying with their safety duties. Ofcom will be required to set out steps that providers can take when complying with their safety duties in the codes of practice, including what is proportionate for different providers and how freedom of expression can be protected.
This might be an appropriate moment for me to say—on the back of that—that, although my noble friend explained current government practice, he has not addressed my point on why there should not be an annual report to Parliament that describes what government has done on these various fronts. If the Government regularly meet newspaper publishers to discuss the quality of information in their newspapers, I for one would have entire confidence that the Government were doing so in the public interest, but I would still quite like—I think the Government would agree on this—a report on what was happening, making an exception for national security. That would still be a good thing to do. Will my noble friend explain why we cannot be told?
While I am happy to elaborate on the work of the counter-disinformation unit in the way I just have, the Government cannot share operational details about its work, as that would give malign actors insight into the scope and scale of our capabilities. As my noble friend notes, this is not in the public interest. Moreover, reporting representations made to platforms by the unit would also be unnecessary as this would overlook both the existing processes that govern engagements with external parties and the new protections that are introduced through the Bill.
In the first intervention, the noble Baroness, Lady Fox, gave a number of examples, some of which are debatable, contestable facts. Companies may well choose to keep them on their platforms within their terms of service. We have also seen deliberate misinformation and disinformation during the pandemic, including from foreign actors promoting more harmful disinformation. It is right that we take action against this.
I hope that I have given noble Lords some reassurance on the points raised about the amendments in this group. I invite them not to press the amendments.
My Lords, I am most grateful to noble Lords across the Committee for their consideration and for their contributions in this important area. As the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, both said, this was an area of struggle for the Joint Committee. The debate today shows exactly why that is so, but it is a struggle worth having.
The noble Lord, Lord Bethell, talked about there being a gap in the Bill as it stands. The amendments include the introduction of risk assessments and transparency and, fundamentally, explaining things in a way that people can actually understand. These are all tried and tested methods and can serve only to improve the Bill.
I am grateful to the Minister for his response and consideration of the amendments. I want to take us back to the words of the noble Baroness, Lady Kidron. She explained it beautifully—partly in response to the comments from the noble Baroness, Lady Fox. This is about tackling a system of amplification of misinformation and disinformation that moves the most marginal of views into the mainstream. It deals with restricting the damage that, as I said earlier, can produce the most dire circumstances. Amplification is the consideration that these amendments seek to tackle.
I am grateful to the noble Lord, Lord Moylan, for his comments, as well as for his amendments. I am sure the noble Lord has reflected that some of the previous amendments he brought before the House somewhat put the proverbial cat among the Committee pigeons. On this occasion, I think the noble Lord has nicely aligned the cats and the pigeons. He has managed to rally us all—with the exception of the Minister—behind these amendments.
My Lords, this has been a good debate. I am glad that a number of noble Lords mentioned Lord Puttnam and the committee that he chaired for your Lordships’ House on democracy and digital technologies. I responded to the debate that we had on that; sadly, it was after he had already retired from your Lordships’ House, but he participated from the steps of the Throne. I am mindful of that report and the lessons learned in it in the context of the debate that we have had today.
We recognise the intent behind the amendments in this group to strengthen the UK’s approach to media literacy in so far as it relates to services that will be regulated by the Bill. Ofcom has a broad duty to promote media literacy under the Communications Act 2003. That is an important responsibility for Ofcom, and it is right that the regulator is able to adapt its approach to support people in meeting the evolving challenges of the digital age.
Amendments 52A and 91 from the noble Lord, Lord Knight, and Amendment 91A from the noble Lord, Lord Holmes of Richmond, seek to introduce duties on in-scope services, requiring them to put in place measures that promote users’ media literacy, while Amendment 98 tabled by the noble Lord, Lord Knight, would require Ofcom to issue a code of practice in relation to the new duty proposed in his Amendment 91. While we agree that the industry has a role to play in promoting media literacy, the Government believe that these amendments could lead to unintended, negative consequences.
I shall address the role of the industry in media literacy, which the noble Baroness, Lady Kidron, dwelt on in her remarks. We welcome the programmes the industry runs in partnership with online safety experts such as Parent Zone and Internet Matters and hope they continue to thrive, with the added benefit of Ofcom’s recently published evaluation toolkit. However, we believe that platforms can go further to empower and educate their users. That is why media literacy has been included in the Bill’s risk assessment duties, meaning that regulated services will have to consider measures to promote media literacy to their users as part of the risk assessment process. Additionally, through work delivered under its existing media literacy duty, Ofcom is developing a set of best-practice design principles for platform-based media literacy measures. That work will build an evidence base of the most effective measures that platforms can take to build their users’ media literacy.
In response to the noble Baroness’s question, I say: no, platforms will not be able to avoid putting in place protections for children by using media literacy campaigns. Ofcom would be able to use its enforcement powers if a platform was not achieving appropriate safety outcomes. There are a range of ways in which platforms can mitigate risks, of which media literacy is but one, and Ofcom would expect platforms to consider them all in their risk assessments.
Let me say a bit about the unintended consequences we fear might arise from these amendments. First, the resource demands to create a code of practice and then to regulate firms’ compliance with this type of broad duty will place an undue burden on the regulator. It is also unclear how the proposed duties in Amendments 52A, 91 and 91A would interact with Ofcom’s existing media literacy duty. There is a risk, we fear, that these parallel duties could be discharged in conflicting ways. Amendment 91A is exposed to broad interpretation by platforms and could enable them to fulfil the duty in a way that lacked real impact on users’ media literacy.
The amendment in the name of my noble friend Lord Holmes proposes a duty to promote awareness of financial deception and fraud. The Government are already taking significant action to protect people from online fraud, including through their new fraud strategy and other provisions in this Bill. I know that my noble friends Lord Camrose, Lord Sharpe of Epsom and Lady Penn met noble Lords to talk about that earlier this week. We believe that measures such as prompts for users before they complete financial transactions sit more logically with financial service providers than with services in scope of this Bill.
Amendment 52A proposes a duty on carriers of journalistic content to promote media literacy to their users. We do not want to risk requiring platforms to act as de facto press regulators, assessing the quality of news publishers’ content. That would not be compatible with our commitment to press freedom. Under its existing media literacy duty, Ofcom is delivering positive work to support people to discern high-quality information online. It is also collaborating with the biggest platforms to design best practice principles for platform-based media literacy measures. It intends to publish these principles this year and will encourage platforms to adopt them.
It is right that Ofcom is given time to understand the benefits of these approaches. The Secretary of State’s post-implementation review will allow the Government and Parliament to establish the effectiveness of Ofcom’s current approach and to reconsider the role of platforms in enhancing users’ media literacy, if appropriate. In the meantime, the Bill introduces new transparency-reporting and information-gathering powers to enhance Ofcom’s visibility of platforms’ delivery and evaluation of media literacy activities. We would not want to see amendments that would inadvertently dissuade platforms from delivering these activities in favour of less costly and less effective measures.
My noble friend Lord Holmes asked about the Online Media Literacy Strategy, published in July 2021, which set out the Government’s vision for improving media literacy in the country. Alongside the strategy, we have committed to publishing annual action plans each financial year until 2024-25, setting out how we meet the ambition of the strategy. In April 2022 we published the Year 2 Action Plan, which included extending the reach of media literacy education to those who are currently disengaged, in consultation with the media literacy task force—a body of 17 cross-sector experts—expanding our grant funding programme to provide nearly £2.5 million across two years for organisations delivering innovative media literacy activities, and commissioning research to improve our understanding of the challenges faced by the sector. We intend to publish the research later this year, for the benefit of civil society organisations, technology platforms and policymakers.
The noble Lord, Lord Knight, in his Amendment 186, would stipulate that Ofcom must levy fees on regulated firms sufficient to fund the work of third parties involved in supporting it to meet its existing media literacy duties. The Bill already allows Ofcom to levy fees sufficient to fund the annual costs of exercising its online safety functions. This includes its existing media literacy duty as far as it relates to services regulated by this Bill. As such, the Bill already ensures that these media literacy activities, including those that Ofcom chooses to deliver through third parties, can be funded through fees levied on industry.
I turn to Amendments 188, 235, 236, 237 and 238. The Government recognise the intent behind these amendments, which is to help improve the media literacy of the general public. Ofcom already has a statutory duty to promote media literacy with regard to the publication of anything by means of electronic media, including services in scope of the Bill. These amendments propose rather prescriptive objectives, either as part of a new duty for Ofcom or through updating its existing duty. They reflect current challenges in the sector but run the risk of becoming obsolete over time, preventing Ofcom from adapting its work in response to emerging issues.
Ofcom has demonstrated flexibility in its existing duty through its renewed Approach to Online Media Literacy, launched in 2021. This presented an expanded media literacy programme, enabling it to achieve almost all the objectives specified in this group. The Government note the progress that Ofcom has already achieved under its renewed approach in the annual plan it produced last month. The Online Safety Bill strengthens Ofcom’s media literacy functions through its new transparency-reporting and information-gathering powers, which will give it enhanced oversight of industry activity by enabling it to require regulated services to share or publish information about the work they are doing on media literacy.
The noble Baroness, Lady Prashar, asked about the view expressed by the Joint Committee on minimum standards for media literacy training. We agree with the intention behind that, but, because of the broad and varied nature of media literacy, we do not believe that introducing minimum standards is the most effective way of achieving that outcome. Instead, we are focusing efforts on improving the evaluation practices of media literacy initiatives to identify which ones are most effective and to encourage their delivery. Ofcom has undertaken extensive work to produce a comprehensive toolkit to support practitioners to deliver robust evaluations of their programmes. This was published in February this year and has been met with praise from practitioners, including those who received grant funding from the Government’s non-legislative media literacy work programme. The post-implementation review of Ofcom’s online safety regime, which covers its existing media literacy duty in so far as it relates to regulated services, will provide a reasonable point at which to establish the effectiveness of Ofcom’s new work programme, after giving it time to take effect.
Noble Lords talked about the national curriculum and media literacy in schools. Media literacy is indeed a crucial skill for everyone in the digital age. Key media literacy skills are already taught through a number of compulsory subjects in the national curriculum. Digital literacy is included in the computing national curriculum in England, which equips pupils with the knowledge, understanding and skills to use information and communication technology creatively and purposefully. I can reassure noble Lords that people such as Monica are being taught not about historic things like floppy disks but about emerging and present challenges; the computing curriculum ensures that pupils are taught how to design and program systems to accomplish goals such as collecting, analysing, evaluating and presenting data.
Does the Minister know how many children are on computing courses?
I do not know, but I shall find out from the Department for Education and write. But those who are on them benefit from a curriculum that includes topics such as programming and algorithms, the responsible and safe use of technology, and other foundational knowledge that may support future study in fields such as artificial intelligence and data science.
This is not the only subject in which media literacy and critical thinking are taught. In citizenship education, pupils are taught about critical thinking and the proper functioning of a democracy. They learn to distinguish fact from opinion, as well as exploring freedom of speech and the role and responsibility of the media in informing and shaping public opinion. As Minister for Arts and Heritage, I will say a bit about subjects such as history, English and other arts subjects, in which pupils learn to ask questions about information, think critically and weigh up arguments, all of which are important skills for media literacy, as well as more broadly.
I am grateful to my noble friends for their amendments in this group, and for the useful debate that we have had. I am grateful also to my noble friend Lady Morgan of Cotes and the members of her committee, who have looked at fraud, and to the Joint Committee, which scrutinised the Bill in its earlier form, for its recommendations on strengthening the way it tackles fraud online. As the noble Lord, Lord Clement-Jones, said, following those recommendations, the Government have brought in new measures to strengthen the Bill’s provisions to tackle fraudulent activity on in-scope services. I am glad he was somewhat satisfied by that.
All in-scope services will be required to take proactive action to tackle fraud facilitated through user-generated content. In addition, the largest and most popular platforms will have a stand-alone duty to prevent fraudulent paid-for advertising appearing on their services. This represents a major step forward in ensuring that internet users are protected from scams, which have serious financial and psychological impacts, as noble Lords noted in our debate. Fully addressing the challenges of paid-for advertising is a wider task than is possible through the Bill alone. Advertising involves a broad range of actors not covered by the current legislative framework, such as advertising intermediaries. I am sympathetic to these concerns and the Government are taking action in this area. Through the online advertising programme, we will deliver a holistic review of the regulatory framework in relation to online advertising. The Government consulted on this work last year and aim to publish a response shortly. As the noble Lord, Lord Stevenson, and others noted, there are a number of Bills which look at this work. Earlier this week, there was a meeting hosted by my noble friends Lord Camrose, Lord Sharpe of Epsom and Lady Penn to try to avoid the cracks opening up between the Bills. I am grateful to my noble friend Lady Morgan for attending; I hope it was a useful discussion.
I turn to the amendments tabled by my noble friend. The existing duties on user reporting and user complaints have been designed for user-generated content and search content and are not easily applicable to paid-for advertising. The duties on reporting and complaints mechanisms require platforms to take action in relation to individual complaints, but many in-scope services do not have control over the paid-for advertising on their services. These amendments would therefore be difficult for many in-scope services to operate and would create a substantial burden for small businesses. I assure her and other noble Lords that the larger services, which have strong levers over paid-for advertising, will have to ensure that they have processes in place to enable users to report fraudulent advertising.
On transparency reporting, let me assure my noble friend and others that Ofcom can already require information about how companies comply with their fraudulent advertising duties through transparency reports. In addition, Ofcom will have the power to gather any information it requires for the purpose of exercising its online safety functions. These powers are extensive and will allow Ofcom to assess compliance with the fraudulent advertising duties.
The noble Viscount, Lord Colville of Culross, asked about the difficulty of identifying fraudulent advertising. Clauses 170 and 171 place a duty on Ofcom to give guidance to providers on making judgments about content, including fraudulent advertising. There will also be a code of practice on fraudulent advertising to provide further guidance on mechanisms to deal with this important issue.
My noble friend Lord Lucas’s Amendments 94 and 95 aim to require services to report information relating to fraudulent advertising to UK authorities. I am confident that the Bill’s duties will reduce the prevalence of online fraud, reducing the need for post hoc reporting in this way. If fraud does appear online, there are adequate systems in place for internet users to report this to the police.
People can report a scam to Action Fraud, the national reporting service for fraud and cybercrime. Reports submitted to Action Fraud are considered by the National Fraud Intelligence Bureau and can assist a police investigation. Additionally, the Advertising Standards Authority has a reporting service for reporting online scam adverts, and those reports are automatically shared with the National Cyber Security Centre.
The online advertising programme, which I mentioned earlier, builds on the Bill’s fraudulent advertising duty and looks at the wider online advertising system. That programme is considering measures to increase accountability and transparency across the supply chain, including proposals for all parties to enhance record keeping and information sharing.
My noble friend Lord Lucas was keen to meet to speak further. I will pass that request to my noble friend Lord Sharpe of Epsom, who I think would be the better person to talk to in relation to this on behalf of the Home Office—but I am sure that one of us will be very happy to talk with him.
I look forward to discussing this issue in more detail with my noble friend Lady Morgan and others between now and Report, but I hope that this provides sufficient reassurance on the work that the Government are doing in this Bill and in other ways. I invite my noble friends not to press their amendments.
(1 year, 6 months ago)
Lords Chamber
My Lords, the amendments in this group are concerned with complaints mechanisms. I turn first to Amendment 56 from the noble Lord, Lord Stevenson of Balmacara, which proposes introducing a requirement on Ofcom to produce an annual review of the effectiveness and efficiency of platforms’ complaints procedures. Were this review to find that regulated services were not complying effectively with their complaints procedure duties, the proposed new clause would provide for Ofcom to establish an ombudsman to provide a dispute resolution service in relation to complaints.
While I am of course sympathetic to the aims of this amendment, the Government remain confident that service providers are best placed to respond to individual user complaints, as they will be able to take appropriate action promptly. This could include removing content, sanctioning offending users, reversing wrongful content removal or changing their systems and processes. Accordingly, the Bill imposes a duty on regulated user-to-user and search services to establish and operate an easy-to-use, accessible and transparent complaints procedure. The complaints procedure must provide for appropriate action to be taken by the provider in relation to the complaint.
It is worth reminding ourselves that this duty is an enforceable requirement. Where a provider is failing to comply with its complaints procedure duties, Ofcom will be able to take enforcement action against the regulated service. Ofcom has a range of enforcement powers, including the power to impose significant penalties and confirmation decisions that can require the provider to take such steps as are required for compliance. In addition, the Bill includes strong super-complaints provisions that will allow for concerns about systemic issues to be raised with the regulator, which will be required to publish its response to the complaint. This process will help to ensure that Ofcom is made aware of issues that users are facing.
Separately, individuals will also be able to submit complaints to Ofcom. Given the likelihood of an overwhelming volume of complaints, as we have heard, Ofcom will not be able to investigate or arbitrate on individual cases. However, those complaints will be an essential part of Ofcom’s horizon-scanning, research, supervision and enforcement activity. They will guide Ofcom in deciding where to focus its attention. Ofcom will also have a statutory duty to conduct consumer research about users’ experiences in relation to regulated services and the handling of complaints made by users to providers of those services. Further, Ofcom can require that category 1, 2A and 2B providers set out in their annual transparency reports the measures taken to comply with their duties in relation to complaints. This will further ensure that Ofcom is aware of any issues facing users in relation to complaints processes.
At the same time, I share the desire expressed to ensure that the complaints mechanisms will be reviewed and assessed. That is why the Bill contains provisions for the Secretary of State to undertake a review of the efficacy of the entire regulatory framework. This will take place between two and five years after the Part 3 provisions come into force, which is a more appropriate interval for the efficacy of the duties around complaints procedures to be reviewed, as it will allow time for the regime to bed in and provide a sufficient evidence base to assess whether changes are needed.
Finally, I note that Amendment 56 assumes that the preferred solution following a review will be an ombudsman. There is probably not enough evidence to suggest that an ombudsman service would be effective for the online safety regime. It is unclear how an ombudsman service would function in support of the new online safety regime, because individual user complaints are likely to be complex and time-sensitive—and indeed, in many cases financial compensation would not be appropriate. So I fear that the noble Lord’s proposed new clause pre-empts the findings of a review with a solution that is resource-intensive and may be unsuitable for this sector.
Amendments 250A and 250B, tabled by my noble friend Lady Newlove, require that an independent appeals system is established and that Ofcom produces guidance to support this system. As I have set out, the Government believe that decisions on user redress and complaints are best dealt with by services. Regulated services will be required to operate an easy-to-use, accessible and transparent complaints procedure that enables users to make complaints. If services do not comply with these duties, Ofcom will be able to utilise its extensive enforcement powers to bring them into compliance.
The Government are not opposed to revisiting the approach to complaints once the regime is up and running. Indeed, the Bill provides for the review of the regulatory framework. However, it is important that the new approach, which will radically change the regulatory landscape by proactively requiring services to have effective systems and processes for complaints, has time to bed in before it is reassessed.
Turning specifically to the points made by my noble friend and by the noble Baroness, Lady Kidron, about the impartial out-of-court dispute resolution procedure in the VSP regime: the VSP regime and the Online Safety Bill are not directly comparable. The underlying principles of both regimes are of course the same, with the focus on systems regulation and protections for users, especially children. The key differences are regarding the online safety framework’s increased scope. The Bill covers a wider range of harms and introduces online safety duties on a wider range of platforms. Under the online safety regime, Ofcom will also have a more extensive suite of enforcement powers than under the UK’s VSP regime.
On user redress, the Bill goes further than the VSP regime as it will require services to offer an extensive and effective complaints process and will enable Ofcom to take stronger enforcement action where they fail to meet this requirement. That is why the Government have put the onus of the complaints procedure on the provider and set out a more robust approach which requires all in-scope regulated user-to-user and search services to offer an effective complaints process that provides for appropriate action to be taken in relation to the complaint. This will be an enforceable duty and will enable Ofcom to utilise its extensive online safety enforcement powers where services are not complying with their statutory duty to provide a usable, accessible and transparent complaints procedure.
At the same time, we want to ensure that the regime can develop and respond to new challenges. That is why we have included a power for the Secretary of State to review the regulatory framework once it is up and running. This will provide the correct mechanism to assess whether complaint handling mechanisms can be further strengthened once the new regulations have had time to bed in.
For these reasons, the Government are confident that the Online Safety Bill represents a significant step forward in keeping users safe online.
My Lords, could I just ask a question? This Bill has been in gestation for about five to six years, during which time the scale of the problems we are talking about has increased exponentially. The Government appear to be suggesting that they will, in three to five years, evaluate whether or not their approach is working effectively.
There was a lot of discussion in this Chamber yesterday about the will of the people and whether the Government were ignoring it. I gently suggest that the very large number of people who are having all sorts of problems, or who are fearful of harm from the online world, will not find, in the timescale the Government are proposing, the sort of remedy and speed of action I suspect they were hoping for. Certainly, the rhetoric the Government have used and continue to use at regular points in the Bill when they are slightly on the back foot seems to be designed to try to make the situation seem better than it is.
Will the Minister and the Bill team take on board that there are some very serious concerns that there will be a lot of lashing back at His Majesty’s Government if in three years’ time—which I fear may be the case—we still have a situation where a large body of complaints are not being dealt with? Ofcom is going to suffer from major ombudsman-like constipation trying to deal with this, and the harms will continue. I think I speak for the Committee when I say that the arguments the Minister and the government side are making really do not hold water.
I do not know about that last point. I was going to say that I am very happy to meet the noble Lord to discuss it. It seems to me to come down to a matter of timing and the timing of the first review. As I say, I am delighted to meet the noble Lord. By the way, the relevant shortest period is two years, not the three years he mentioned.
Following on from my friend, the noble Lord, Lord Russell, can I just say to the Minister that I would really welcome all of us having a meeting? As I am listening to this, I am thinking that three to five years is just horrific for the families. This Bill has gone on for so long to get where we are today. We are losing sight of humanity here and the moral compass of protecting human lives. For whichever Government is in place in three to five years to make the decision to say it does not work is absolutely shameful. Nobody in the Government will be accountable and yet for that family, that single person may commit suicide. We have met the bereaved families, so I say to the Minister that we need to go round the table and look at this again. I do not think it is acceptable to say that there is this timeline, this review, for the Secretary of State when we are dealing with young lives. It is in the public interest to get this Bill correct as it navigates its way back to the House of Commons in a far better state than how it arrived.
I would love the noble Viscount to answer my very specific question about who the Government think families should turn to when they have exhausted the complaints system in the next three to five years. I say that as someone who has witnessed successive Secretaries of State promising families that this Bill would sort this out. Yes?
I stress again that the period in question is two years, not three.
It is between two and five years. It can be two; it can be five. I am very happy to meet my noble friend and to carry on doing so. The complaints procedure set up for families is first to approach the service provider, whose duties are enforceable; should the provider fail to meet those duties, families can then turn to Ofcom and, beyond that, the courts.
I am sorry but that is exactly the issue at stake. The understanding of the Committee currently is that there is then nowhere to go if they have exhausted that process. I believe that complainants are not entitled to go to Ofcom in the way that the noble Viscount just suggested.
Considerably more rights are provided than they have today, with the service provider. Indeed, Ofcom would not necessarily deal with individual complaints—
They would go to the service provider in the first instance and then—
What recourse would they have, if Ofcom will not deal with individual complaints in those circumstances?
I am happy to meet and discuss this. We are expanding what they are able to receive today under the existing arrangements. I am happy to meet any noble Lords who wish to take this forward to help them understand this—that is probably best.
Amendments 287 and 289 from the noble Baroness, Lady Fox of Buckley, seek to remove the provision for super-complaints from the Bill. The super-complaints mechanism is an important part of the Bill’s overall redress mechanisms. It will enable entities to raise concerns with Ofcom about systemic issues in relation to regulated services, which Ofcom will be required to respond to. This includes concerns about the features of services or the conduct of providers creating a risk of significant harm to users or the public, as well as concerns about significant adverse impacts on the right to freedom of expression.
On who can make super-complaints, any organisation that meets the eligibility criteria set out in secondary legislation will be able to submit a super-complaint to Ofcom. Organisations will be required to submit evidence to Ofcom, setting out how they meet these criteria. Using this evidence, Ofcom will assess organisations against the criteria to ensure that they meet them. The assessment of evidence will be fair and objective, and the criteria will be intentionally strict to ensure that super-complaints focus on systemic issues and that the regulator is not overwhelmed by the number it receives.
To clarify and link up the two parts of this discussion, can the Minister perhaps reflect, when the meeting is being organised, on the fact that the organisations and the basis on which they can complain will be decided by secondary legislation? So we do not know which organisations or what the remit is, and we cannot assess how effective that will be. We know that the super-complainants will not want to overwhelm Ofcom, so things will be bundled into that. Individuals could be excluded from the super-complaints system in the way that I indicated, because super-complaints will not represent everyone, or even minority views; in other words, there is a gap here now. I want that bit gone, but that does not mean that we do not need a robust complaints system. Before Report at least—in the meetings in between—the Government need to advise on how you complain if something goes wrong. At the moment, the British public have no way to complain at all, unless someone sneaks it through in secondary legislation. This is not helpful.
As I said, we are happy to consider individual complaints and super-complaints further.
Again, I am just pulling this together—I am curious to understand this. We have been given a specific case—South West Grid for Learning raising a case based on an individual but that had more generic concerns—so could the noble Viscount clarify, now or in writing, whether that is the kind of thing that he imagines would constitute a super-complaint? If South West Grid for Learning went to a platform with a complaint like that—one based on an individual but brought by an organisation—would Ofcom find that complaint admissible under its super-complaints procedure, as imagined in the Bill?
Overall, the super-complaints mechanism is more for groupings of complaints and has a broader range than the individual complaints process, but I will consider that point going forward.
Many UK regulators have successful super-complaints mechanisms which allow them to identify and target emerging issues and effectively utilise resources. Alongside the Bill’s research functions, super-complaints will perform a vital role in ensuring that Ofcom is aware of the issues users are facing, helping them to target resources and to take action against systemic failings.
On the steps required after super-complaints, the regulator will be required to respond publicly to the super-complaint. Issues raised in the super-complaint may lead Ofcom to take steps to mitigate the issues raised in the complaint, where the issues raised can be addressed via the Bill’s duties and powers. In this way, they perform a vital role in Ofcom’s horizon-scanning powers, ensuring that it is aware of issues as they emerge. However, super-complaints are not linked to any specific enforcement process.
My Lords, it has just occurred to me what the answer is to the question, “Where does an individual actually get redress?” The only way they can get redress is by collaborating with another 100 people and raising a super-complaint. Is that the answer under the Bill?
No. The super-complaints mechanism is better thought of as part of a horizon-scanning mechanism. It is not—
So it is not really a complaints system; it is a horizon-scanning system. That is interesting.
The answer to the noble Lord’s question is that the super-complaint is not a mechanism for individuals to complain on an individual basis and seek redress.
This is getting worse and worse. I am tempted to suggest that we stop talking about this and try to, in a smaller group, bottom out what we are doing. I really think that the Committee deserves a better response on super-complaints than it has just heard.
As I understood it—I am sure that the noble Baroness, Lady Kidron, is about to make the same point—super-complaints are specifically designed to take away the pressure on vulnerable and younger persons to have responsibility only for themselves in bringing forward the complaint that needs to be resolved. They are a way of sharing that responsibility and taking away the pressure. Is the Minister now saying that that is a misunderstanding?
I have offered a meeting; I am very happy to host the meeting to bottom out these complaints.
I understand that the Minister has been given a sticky wicket of defending the indefensible. I welcome a meeting, as I think the whole Committee does, but it would be very helpful to hear the Government say that they have chosen to give individuals no recourse under the Bill—that this is the current situation, as it stands, and that there is no concession on the matter. I have been in meetings with people who have been promised such things, so it is really important, from now on in Committee, that we actually state at the Dispatch Box what the situation is. I spent quite a lot of the weekend reading circular arguments, and we now need to get to an understanding of what the situation is. We can then decide, as a Committee, what we do in relation to that.
As I said, I am very happy to hold the meeting. We are giving users greater protection through the Bill, and, as agreed, we can discuss individual routes to recourse.
I hope that, on the basis of what I have said and the future meeting, noble Lords have some reassurance that the Bill’s complaint mechanisms will, eventually, be effective and proportionate, and feel able not to press their amendments.
I am very sorry that I did not realise that the Minister was responding to this group of amendments; I should have welcomed him to his first appearance in Committee. I hope he will come back—although he may have to spend a bit of time in hospital, having received a pass to speak on this issue from his noble friend.
This is a very complicated Bill. The Minister and I have actually talked about that over tea, and he is now learning the hard lessons of what he took as light badinage before coming to the Chamber today. However, we are in a bit of a mess here. I was genuinely trying to get an amendment that would encourage the department to move forward on this issue, because it is quite clear from the mood around the Committee that something needs to be resolved here. The way the Government are approaching this is by heading towards a brick wall, and I do not think it is the right way forward.
My Lords, it is a pity that we have not had the benefit of hearing from the Minister, because a lot of his amendments in this group seem to bear on some of the more generic points made in the very good speech by the noble Baroness, Lady Fraser. I assume he will cover them, but I wonder whether he would at least be prepared to answer any questions people might come back with—not in any aggressive sense; we are not trying to scare the pants off him before he starts. For example, the points made by the noble Lord, Lord Clement-Jones, intrigue me.
I used to have responsibility for devolved issues when I worked at No. 10 for a short period. It was a bit of a joke, really. Whenever anything Welsh happened, I was immediately summoned down to Cardiff and hauled over the coals. You knew when you were in trouble when they all stopped speaking English and started speaking Welsh; then, you knew there really was an issue, whereas before I just had to listen, go back and report. In Scotland, nobody came to me anyway, because they knew that the then Prime Minister was a much more interesting person to talk to about these things. They just went to him instead, so I did not really learn very much.
I noticed some issues in the Marshalled List that I had not picked up on when I worked on this before. I do not know whether the Minister wishes to address this—I do not want to delay the Committee too much—but are we saying that to apply a provision in the Bill to the Bailiwick of Guernsey or the Isle of Man, an Order in Council is required to bypass Parliament? Is that a common way of proceeding in these places? I suspect that the noble and learned Lord, Lord Hope, knows much more about this than I do—he shakes his head—but this is a new one on me. Does it mean that this Parliament has no responsibility for how its laws are applied in those territories, or are there other procedures of which we are unaware?
My second point again picks up what the noble Lord, Lord Clement-Jones, was saying. Could the Minister go through in some detail the process by which a devolved authority would apply to the Secretary of State—presumably for DSIT—to seek consent for a devolved offence to be included in the Online Safety Bill regime? If this is correct, who grants what to whom? Does this come to the House as a statutory instrument? Is just the Secretary of State involved, or does it go to the Privy Council? Are there other ways that we are yet to know about? It would be interesting to know.
To echo the noble Lord, Lord Clement-Jones, we probably do need a letter from the Minister, if he ever gets this cleared, setting out exactly how the variation in powers would operate across the four territories. If there are variations, we would like to know about them.
My Lords, I am very grateful to my noble friend Lady Fraser of Craigmaddie for her vigilance in this area and for the discussion she had with the Bill team, which they and I found useful. Given the tenor of this short but important debate, I think it may be helpful if we have a meeting for other noble Lords who also want to benefit from discussing some of these things in detail, and particularly to talk about some of the issues the noble Lord, Lord Stevenson of Balmacara, just raised. It would be useful for us to talk in detail about general questions on the operation of the law before we look at this again on Report.
In a moment, I will say a bit about the government amendments which stand in my name. I am sure that noble Lords will not be shy in taking the opportunity to interject if questions arise, as they have not been shy on previous groups.
I will start with the amendments tabled by my noble friend Lady Fraser. Her Amendment 58 seeks to add reference to the Human Rights Act 1998 to Clause 18. That Act places obligations on public authorities to act compatibly with the European Convention on Human Rights. It does not place obligations on private individuals and companies, so it would not make sense for such a duty on internet services to refer to the Human Rights Act.
Under that Act, Ofcom has obligations to act in accordance with the right to freedom of expression under Article 10 of the European Convention on Human Rights. As a result, the codes that Ofcom draws up will need to comply with the Article 10 right to freedom of expression. Schedule 4 to the Bill requires Ofcom to ensure that measures which it describes in a code of practice are designed in light of the importance of protecting the right of users’
“freedom of expression within the law”.
Clauses 44(2) and (3) provide that platforms will be treated as complying with their freedom of expression duty if they take the recommended measures that Ofcom sets out in the codes. Platforms will therefore be guided by Ofcom in taking measures to comply with its duties, including safeguards for freedom of expression through codes of practice.
My noble friend’s Amendment 136 seeks to add offences under the Hate Crime and Public Order (Scotland) Act 2021 to Schedule 7. Public order offences are already listed in Schedule 7 to the Bill, which will apply across the whole United Kingdom. This means that all services in scope will need proactively to tackle content that amounts to an offence under the Public Order Act 1986, regardless of where the content originates or where in the UK it can be accessed.
The priority offences list has been developed with the devolved Administrations, and Clause 194 outlines the parliamentary procedures for updating it. The requirements for consent will be set out in the specific subordinate legislation that may apply to the particular offence being made by the devolved authorities—that is to say, they will be laid down by the enabling statutes that Parliament will have approved.
Amendment 228 seeks to require the inclusion of separate analyses of users’ online experiences in England, Wales, Scotland and Northern Ireland in Ofcom’s transparency reports. These transparency reports are based on the information requested from category 1, 2A and 2B service providers through transparency reporting. I assure my noble friend that Ofcom is already able to request country-specific information from providers in its transparency reports. The legislation sets out high-level categories of information that category 1, 2A and 2B services may be required to include in their transparency reports. The regulator will set out in a notice the information to be requested from the provider, the format of that information and the manner in which it should be published. If appropriate, Ofcom may request specific information in relation to each country in the UK, such as the number of users encountering illegal content and the incidence of such content.
Ofcom is also required to undertake consultation before producing guidance about transparency reporting. In order to ensure that the framework is proportionate and future-proofed, however, it is vital to allow the regulator sufficient flexibility to request the types of information that it sees as relevant, and for that information to be presented by providers in a manner that Ofcom has deemed to be appropriate.
Similarly, Amendment 225A would require separate analyses of users’ online experiences in England, Wales, Scotland and Northern Ireland in Ofcom’s research about users’ experiences of regulated services. Clause 141 requires that Ofcom make arrangements to undertake consumer research to ascertain public opinion and the experiences of UK users of regulated services. Ofcom will already be able to undertake this research on a country-specific basis. Indeed, in undertaking its research and reporting duties, as my noble friend alluded to, Ofcom has previously adopted such an approach. For instance, it is required by the Communications Act 2003 to undertake consumer research. While the legislation does not mandate that Ofcom conduct and publish nation-specific research, Ofcom has done so, for instance through its publications Media Nations and Connected Nations. I hope that gives noble Lords some reassurance of its approach in this regard. Ensuring that Ofcom has flexibility in carrying out its research functions will enable us to future-proof the regulatory framework, and will mean that its research activity is efficient, relevant and appropriate.
I will now say a bit about the government amendments standing in my name. I should, in doing so, highlight that I have withdrawn Amendments 304C and 304D, previously in the Marshalled List, which will be replaced with new amendments to ensure that all the communications offences, including the new self-harm offence, have the appropriate territorial extent when they are brought forward. They will be brought forward as soon as possible once the self-harm offence has been tabled.
Amendments 267A, 267B, 267C, 268A, 268B to 268G, 271A to 271D, 304A, 304B and 304E are amendments to Clauses 160, 162, 164 to 166, 168 and 210 and Schedule 14, relating to the extension of the false and threatening communications offences and the associated liability of corporate officers in Clause 166 to Northern Ireland.
This group also includes some technical and consequential amendments to the false and threatening communications offences and technical changes to the Malicious Communications (Northern Ireland) Order 1988 and Section 127 of the Communications Act 2003. This will minimise overlap between these existing laws and the new false and threatening communications offences in this Bill. Importantly, they mirror the approach taken for England and Wales, providing consistency in the criminal law.
This group also contains technical amendments to update the extent of the epilepsy trolling offence to reflect that it applies to England, Wales and Northern Ireland.
Yes, that would be a sensible way to view it. We will work on that and allow noble Lords to see it before they come to talk to us about it.
I put on record that the withdrawal of Part 3 of the Digital Economy Act 2017 will be greeted with happiness only should the full schedule of AV and harms be put into the Bill. I must say that because the noble Baroness, Lady Benjamin, is not in her place. She worked very hard for that piece of legislation.
This has been a very good debate indeed. I have good days and bad days in Committee. Good days are when I feel that the Bill is going to make a difference and things are going to improve and the sun will shine. Bad days are a bit like today, where we have had a couple of groups, and this is one of them, where I am a bit worried about where we are and whether we have enough—I was going to use that terrible word “ammunition” but I do not mean that—of the powers that are necessary in the right place and with the right focus to get us through some of the very difficult questions that come in. I know that bad cases make bad law, but they can also illustrate why the law is not good enough. As the noble Baroness, Lady Kidron, was saying, this is possibly one of the areas we are in.
The speeches in the debate have made the case well and I do not need to go back over it. We have got ourselves into a situation where we want to reduce the harm that we see around us but do not want to impact freedom of expression. Both of those are so important and we have to hold on to them, but we find ourselves struggling. What do we do about that? We have to think through what we will end up with once this Bill is on the statute book, along with the codes of practice made under it. This looks as though it is heading towards the question of whether the terms of service that will be in place will be sufficient and able to restrict the harms we will see affecting people who should not be affected by them. But I recognise that the freedom of expression arguments have won the day and we have to live with that.
The noble Baroness, Lady Kidron, mentioned the riskiness of the smaller sites—categories 2A and 2B and the ones that are not even going to be categorised as high as that. Why are we leaving those to cause the damage that they are? There is something not working here in the structure of the Bill and I hope the Minister will be able to provide some information on that when he comes to speak.
Obviously, if we could find a way of expressing the issues that are raised by the measures in these amendments as being illegal in the real world, they would be illegal online as well. That would at least be a solution that we could rely on. Whether it could be policed and enforced is another matter, but it certainly would be there. But we are probably not going to get there, are we? I am not looking at the Minister in any hope but he has a slight downward turn to his lips. I am not sure about this.
How can we approach a legal but harmful issue with the sort of sensitivity that does not make us feel that we have reduced people’s ability to cope with these issues and to engage with them in an adult way? I do not have an answer to that.
Is this another amplification issue or is it deeper and worse than that? Is this just the internet because of its ability to focus on things to keep people engaged, to make people stay online when they should not, to make them reach out and receive material that they ought not to get in a properly regulated world? Is it something that we can deal with because we have a sense of what is moral and appropriate and want to act because society wants us to do it? I do not have a solution to that, and I am interested to hear what the Minister will say, but I think it is something we will need to come back to.
My Lords, like everyone who spoke, I and the Government recognise the tragic consequences of suicide and self-harm, and how so many lives and families have been devastated by it. I am grateful to the noble Baroness and all noble Lords, as well as the bereaved families who campaigned so bravely and for so long to spare others that heartache and to create a safer online environment for everyone. I am grateful to the noble Baroness, Lady Finlay of Llandaff, who raised these issues in her Private Member’s Bill, on which we had exchanges. My noble friend Lady Morgan is right to raise the case of Frankie Thomas and her parents, and to call that to mind as we debate these issues.
Amendments 96 and 296, tabled by the noble Baroness, Lady Finlay, would, in effect, reintroduce the former adult safety duties whereby category 1 companies were required to assess the risk of harm associated with legal content accessed by adults, and to set and enforce terms of service in relation to it. As noble Lords will know, those duties were removed in another place after extensive consideration. Those provisions risked creating incentives for the excessive removal of legal content, which would unduly interfere with adults’ free expression.
However, the new transparency, accountability and freedom of expression duties in Part 4, combined with the illegal and child safety duties in Part 3, will provide a robust approach that will hold companies to account for the way they deal with this content. Under the Part 4 duties, category 1 services will need to have appropriate systems and processes in place to deal with content or activity that is banned or restricted by their terms of service.
Many platforms—such as Twitter, Facebook and TikTok, which the noble Baroness raised—say in their terms of service that they restrict suicide and self-harm content, but they do not always enforce these policies effectively. The Bill will require category 1 companies—the largest platforms—fully to enforce their terms of service for this content, which will be a significant improvement for users’ safety. Where companies allow this content, the user-empowerment duties will give adults tools to limit their exposure to it, if they wish to do so.
The noble Baroness is right to raise the issue of algorithms. As the noble Lord, Lord Stevenson, said, amplification lies at the heart of many cases. The Bill will require providers specifically to consider as part of their risk assessments how algorithms could affect children’s and adults’ exposure to illegal content, and content that is harmful to children, on their services. Providers will need to take steps to mitigate and effectively manage any risks, and to consider the design of functionalities, algorithms and other features to meet the illegal content and child safety duties in the Bill.
Following our earlier discussion, we were due a response on super-complaints. I am curious to understand this: if there were a pattern of complaints—such as those the noble Baroness, Lady Kidron, and others received—about a platform saying, under its terms of service, that it would remove suicide and self-harm content but failing to do so, does the Minister think that is precisely the kind of thing that could be substantive material for an organisation to bring as a super-complaint to Ofcom?
My initial response is, yes, I think so, but it is the role of Ofcom to look at whether those terms of service are enforced and to act on behalf of internet users. The noble Lord is right to point to the complexity of some marginal cases with which companies have to deal, but the whole framework of the Bill is to make sure that terms of service are being enforced. If they are not, people can turn to Ofcom.
I am sorry to enter the fray again on complaints, but how will anyone know that they have failed in this way if there is no complaints system?
I refer to the meeting my noble friend Lord Camrose offered; we will be able to go through and unpick the issues raised in that group of amendments, rather than looping back to that debate now.
The Minister is going through the structure of the Bill and saying that what is in it is adequate to prevent the kinds of harms to vulnerable adults that we talked about during this debate. Essentially, it is a combination of adherence to terms of service and user-empowerment tools. Is he saying that those two aspects are adequate to prevent the kinds of harms we have talked about?
Yes, they are—with the addition of what I am coming to. In addition to the duty for companies to consider the role of algorithms, which I talked about, Ofcom will have a range of powers at its disposal to help it assess whether providers are fulfilling their duties, including the power to require information from providers about the operation of their algorithms. The regulator will be able to hold senior executives criminally liable if they fail to ensure that their company is providing Ofcom with the information it requests.
However, we must not restrict users’ right to see legal content and speech. These amendments would prescribe specific approaches for companies’ treatment of legal content accessed by adults, which would give the Government undue influence in choosing, on adult users’ behalf, what content they see—
I wanted to give the Minister time to get on to this. Can we now drill down a little on the terms of service issue? If the noble Baroness, Lady Kidron, is right, are we talking about terms of service having the sort of power the Government suggest in cases where they are category 1 and category 2A but not search? There will be a limit, but an awful lot of other bodies about which we are concerned will not fall into that situation.
Also, I thought we had established, much to our regret, that the terms of service were what they were, and that Ofcom’s powers—I paraphrase to make the point—were those of exposure and transparency, not setting minimum standards. But even if we are talking only about the very large and far-reaching companies, should there not be a power somewhere to engage with that, with a view to getting that redress, if the terms of service do not specify it?
The Bill will ensure that companies adhere to their terms of service. If they choose to allow content that is legal but harmful on their services and they tell people that beforehand—and adults are able and empowered to decide what they see online, with the protections of the triple shield—we think that that strikes the right balance. This is at the heart of the whole “legal but harmful” debate in another place, and it is clearly reflected throughout the approach in the Bill and in my responses to all of these groups of amendments. But there are duties to tackle illegal content and to make sure that people know the terms of service for the sites they choose to interact with. If they feel that they are not being adhered to—as they currently are not in relation to suicide and self-harm content on many of the services—users will have the recourse of the regulator to turn to.
I will plant a flag in reference to the new offences, which I know we will come back to again. It is always helpful to look at real-world examples. There is a lot of meme-based self-harm content. Two examples are the Tide Pods challenge—the eating of detergent capsules—and choking games, both of which have been very common and widespread. It would be helpful, ahead of our debate on the new offences, to understand whether they are below or above the threshold of serious self-harm and what the Government’s intention is. There are arguments both ways: obviously, criminalising children for being foolish carries certain consequences, but we also want to stop the spread of the content. So, when we come to that offence, it would be helpful if the Minister could use specific examples, such as the meme-based self-harm content, which is quite common.
I thank the noble Lord for the advance notice to think about that; it is helpful. It is difficult to talk in general terms about this issue, so, if I can, I will give examples that do, and do not, meet the threshold.
The Bill goes even further for children than it does for adults. In addition to the protections from illegal material, the Government have indicated, as I said, that we plan to designate content promoting suicide, self-harm or eating disorders as categories of primary priority content. That means that providers will need to put in place systems designed to prevent children of any age encountering this type of content. Providers will also need specifically to assess the risk of children encountering it. Platforms will no longer be able to recommend such material to children through harmful algorithms. If they do, Ofcom will hold them accountable and will take enforcement action if they break their promises.
It is right that the Bill takes a different approach for children than for adults, but it does not mean that the Bill does not recognise that young adults are at risk or that it does not have protections for them. My noble friend Lady Morgan was right to raise the issue of young adults once they turn 18. The triple shield of protection in the Bill will significantly improve the status quo by protecting adults, including young adults, from illegal suicide content and legal suicide or self-harm content that is prohibited in major platforms’ terms and conditions. Platforms also have strong commercial incentives, as we discussed in previous groups, to address harmful content that the majority of their users do not want to see, such as legal suicide, eating disorder or self-harm content. That is why they currently claim to prohibit it in their terms and conditions, and why we want to make sure that those terms and conditions are transparently and accountably enforced. So, while I sympathise with the intention from the noble Baroness, Lady Finlay, her amendments raise some wider concerns about mandating how providers should deal with legal material, which would interfere with the careful balance the Bill seeks to strike in ensuring that users are safer online without compromising their right to free expression.
The noble Baroness’s Amendment 240, alongside Amendment 225 in the name of the noble Lord, Lord Stevenson, would place new duties on Ofcom in relation to suicide and self-harm content. The Bill already has provisions to provide Ofcom with broad and effective information-gathering powers to understand how this content affects users and how providers are dealing with it. For example, under Clause 147, Ofcom can already publish reports about suicide and self-harm content, and Clauses 68 and 69 empower Ofcom to require the largest providers to publish annual transparency reports.
Ofcom may require those reports to include information on the systems and processes that providers use to deal with illegal suicide or self-harm content, with content that is harmful to children, or with content which providers’ own terms of service prohibit. Those measures sit alongside Ofcom’s extensive information-gathering powers. It will have the ability to access the information it needs to understand how companies are fulfilling their duties, particularly in taking action against this type of content. Furthermore, the Bill is designed to provide Ofcom with the flexibility it needs to respond to harms—including in the areas of suicide, self-harm and eating disorders—as they develop over time, in the way that the noble Baroness envisaged in her remarks about the metaverse and new emerging threats. So we are confident that these provisions will enable Ofcom to assess this type of content and ensure that platforms deal with it appropriately. I hope that this has provided sufficient reassurance for the noble Baroness not to move her amendment.
I asked a number of questions on specific scenarios. If the Minister cannot answer them straight away, perhaps he could write to me. They all rather called for “yes/no” answers.
The noble Baroness threw me off with her subsequent question. She was broadly right, but I will write to her after I refresh my memory about what she said when I look at the Official Report.
My Lords, protecting women and girls is a priority for His Majesty’s Government, at home, on our streets and online. This Bill will provide vital protections for women and girls, ensuring that companies take action to improve their safety online and protect their freedom of expression so that they can continue to play their part online, as well as offline, in our society.
On Amendments 94 and 304, tabled by my noble friend Lady Morgan of Cotes, I want to be unequivocal: all service providers must understand the systemic risks facing women and girls through their illegal content and child safety risk assessments. They must then put in place measures that manage and mitigate these risks. Ofcom’s codes of practice will set out how companies can comply with their duties in the Bill.
I assure noble Lords that the codes will cover protections against violence against women and girls. In accordance with the safety duties, the codes will set out how companies should tackle illegal content and activity confronting women and girls online. This includes the several crimes that we have listed as priority offences, which we know are predominantly perpetrated against women and girls. The codes will also cover how companies should tackle harmful online behaviour and content towards girls.
Companies will be required to implement systems and processes designed to prevent people encountering priority illegal content and minimise the length of time for which any such content is present. In addition, Ofcom will be required to carry out broad consultation when drafting codes of practice to harness expert opinions on how companies can address the most serious online risks, including those facing women and girls. Many of the examples that noble Lords gave in their speeches are indeed reprehensible. The noble Baroness, Lady Kidron, talked about rape threats and threats of violence. These, of course, are examples of priority illegal content and companies will have to remove and prevent them.
My noble friend Lady Morgan suggested that the Bill misses out the specific course of conduct that offences in this area can have. Clause 9 contains provisions to ensure that services
“mitigate and manage the risk of the service being used for the commission or facilitation of”
an offence. This would capture patterns of behaviour. In addition, Schedule 7 contains several course of conduct offences, including controlling and coercive behaviour, and harassment. The codes will set out how companies must tackle these offences where this content contributes to a course of conduct that might lead to these offences.
To ensure that women’s and girls’ voices are heard in all this, the Bill will, as the right reverend Prelate noted, make it a statutory requirement for Ofcom to consult the Victims’ Commissioner and the domestic abuse commissioner about the formation of the codes of practice. As outlined, the existing illegal content, child safety and child sexual abuse and exploitation codes will already cover protections for women and girls. Creating a separate code dealing specifically with violence against women and girls would mean transposing or duplicating measures from those existing codes.
In its recent communication to your Lordships, Ofcom stated that it will be consulting quickly on the draft illegal content and child sexual abuse and exploitation codes, and has been clear that it has already started the preparatory work for these. If Ofcom were required to create a separate code on violence against women and girls, this preparatory work would need to be revised, with the inevitable consequence of slowing down the implementation of these vital protections.
An additional stand-alone code would also be duplicative and could cause problems with interpretation and uncertainty for Ofcom and providers. Linked to this, the simpler the approach to the codes, the higher the rates of compliance are likely to be. The more codes there are covering specific single duties, the more complicated it will be for providers, which will have to refer to multiple different codes, and the harder for businesses to put in place the right protections for users. Noble Lords have said repeatedly that this is a complex Bill, and this is an area where I suggest we should not make it more complex still.
As the Bill is currently drafted, Ofcom is able to draft codes in a way that addresses a range of interrelated risks affecting different groups of users, such as people affected in more than one way; a number of noble Lords dealt with that in their contributions. For example, combining the measures that companies can take to tackle illegal content targeting women and girls with the measures they can take to tackle racist abuse online could ensure a more comprehensive and effective approach that recognises the point, which a number of noble Lords made, that people with more than one protected characteristic under the Equality Act may be at compound risk of harm. If the Bill stipulated that Ofcom separate the offences that disproportionately affect women and girls from other offences in Schedule 7, this comprehensive approach to tackling violence against women and girls online could be lost.
Could my noble friend the Minister confirm something? I am getting rather confused by what he is saying. Is it the case that there will be just one mega code of practice to deal with every single problem, or will there be lots of different codes of practice to deal with the problems? I am sure the tech platforms will have sufficient people to be able to deal with them. My understanding is that Ofcom said that, while the Bill might not mandate a code of practice on violence against women and girls, it would in due course be happy to look at it. Is that right, or is my noble friend the Minister saying that Ofcom will never produce a code of practice on violence against women and girls?
It is up to Ofcom to decide how to set the codes out. What I am saying is that the codes deal with specific categories of threat or problem—illegal content, child safety content, child sexual abuse and exploitation—rather than with specific audiences who are affected by these sorts of problems. There is a circularity in some of the criticism: we are told both that we are not reflecting the compound harms to people affected in more than one way, and that we should have a separate code dealing with one particular group of people because of one particular characteristic. We are trying to deal with categories of harm that we know disproportionately affect women and girls but which of course could affect others, as the noble Baroness rightly noted. Amendment 304—
I thank the Minister for giving way. There is a bit of a problem that I would like to raise. I think the Minister is saying that there should not be a code of practice in respect of violence against women and girls. That sounds to me like there will be no code of practice in this one particular area, which seems rather harsh. It also does not tackle the issue on which I thought we were all agreed, even if we do not agree the way forward: namely, that women and girls are disproportionately affected. If it is indeed the case that the Minister feels that way, how does he suggest this is dealt with?
There are no codes designed for Jewish people, Muslim people or people of colour, even though we know that they are disproportionately affected by some of these harms as well. The approach taken is to tackle the problems, which we know disproportionately affect all of those groups of people and many more, by focusing on the harms rather than the recipients of the harm.
Can I check something with my noble friend? This is where the illogicality is. The Government have mandated in the Strategic Policing Requirement that violence against women and girls is a national threat. I do not disagree with him that other groups of people will absolutely suffer abuse and online violence, but the Government themselves have said that violence against women and girls is a national threat. I understand that my noble friend has the speaking notes, the brief and everything else, so I am not sure how far we will get on this tonight, but, given the Home Office stance on it, I think that to say that this is not a specific threat would be a mistake.
With respect, I do not think that that is a perfect comparison. The Strategic Policing Requirement is an operational policing document intended for chief constables and police and crime commissioners in the important work that they do, to make sure they have due regard for national threats as identified by the Home Secretary. It is not something designed for commercial technology companies. The approach we are taking in the Bill is to address harms that can affect all people and which we know disproportionately affect women and girls, and harms that we know disproportionately affect other groups of people as well.
We have made changes to the Bill: the consultation with the Victims’ Commissioner and the domestic abuse commissioner, the introduction of specific offences to deal with cyber-flashing and other sorts of particular harms, which we know disproportionately affect women and girls. We are taking an approach throughout the work of the Bill to reflect those harms and to deal with them. Because of that, respectfully, I do not think we need a specific code of practice for any particular group of people, however large and however disproportionately they are affected. I will say a bit more about our approach. I have said throughout, including at Second Reading, and my right honourable friend the Secretary of State has been very clear in another place as well, that the voices of women and girls have been heard very strongly and have influenced the approach that we have taken in the Bill. I am very happy to keep talking to noble Lords about it, but I do not think that the code my noble friend sets out is the right way to go about solving this issue.
Amendment 304 seeks to adopt the Istanbul convention definition of violence against women and girls. The Government are already compliant with the Convention on Preventing and Combating Violence Against Women and Domestic Violence, which was ratified last year. However, we are unable to include the convention’s definition of violence against women and girls in the Bill, as it extends to legal content and activity that is not in scope of the Bill as drafted. Using that definition would therefore cause legal uncertainty for companies. It would not be appropriate for the Government to require companies to remove legal content accessed by adults who choose to access it. Instead, as noble Lords know, the Government have brought in new duties to improve services’ transparency and accountability.
Amendment 104 in the name of the noble Lord, Lord Stevenson, seeks to require user-to-user services to provide a higher standard of protection for women, girls and vulnerable adults than for other adults. The Bill already places duties on service providers and Ofcom to prioritise responding to content and activity that presents the highest risk of harm to users. This includes users who are particularly affected by online abuse, such as women, girls and vulnerable adults. In overseeing the framework, Ofcom must ensure that there are adequate protections for those who are most vulnerable to harm online. In doing so, Ofcom will be guided by its existing duties under the Communications Act, which requires it to have regard when performing its duties to the
“vulnerability of children and of others whose circumstances appear to OFCOM to put them in need of special protection”.
The Bill also amends Ofcom’s general duties under the Communications Act to require that Ofcom, when carrying out its functions, considers the risks that all members of the public face online, and ensures that they are adequately protected from harm. This will form part of Ofcom’s principal duty and will apply to the way that Ofcom performs all its functions, including when producing codes of practice.
In addition, providers’ illegal content and child safety risk assessment duties, as well as Ofcom’s sectoral risk assessment duties, require them to understand the risk of harm to users on their services. In doing so, they must consider the user base. This will ensure that services identify any specific risks facing women, girls or other vulnerable groups of people.
As I have mentioned, the Bill will require companies to prioritise responding to online activity that poses the greatest risk of harm, including where this is linked to vulnerability. Vulnerability is very broad. The threshold at which somebody may arguably become vulnerable is subjective, context-dependent and may be temporary. The majority of UK adult users could be defined as vulnerable in particular circumstances. In practice, this would be very challenging for Ofcom to interpret if it were added to the safety objectives in this way. The existing approach allows greater flexibility so that companies and Ofcom can focus on the greatest threats to different groups of people at any given time. This allows the Bill to adapt to and keep pace with changing risk patterns that may affect different groups of people.
(1 year, 6 months ago)
Lords Chamber
My Lords, the amendments in this group consider the role of collaboration and consultation in Ofcom’s approach. The proposals range in their intent, and include mandating additional roles for young people in the framework, adding new formal consultation requirements, and creating powers for Ofcom to work with other organisations.
I reassure noble Lords that the Government take these concerns extremely seriously. That is why the Bill already places the voices of experts, users and victims at the heart of the regime it establishes. In fact, the intent of many of the amendments in this group will already be delivered. That includes Ofcom working with others effectively to deliver the legislation, consulting on draft codes of practice, and having the ability to designate specific regulatory functions to other bodies where appropriate. Where we can strengthen the voices of users, victims or experts—without undermining existing processes, reducing the regulator’s independence or causing unacceptable delays—the Government are open to this. That is why I am moving the amendment today. However, as we have heard in previous debates, this is already a complex regulatory framework, and there is a widespread desire for it to be implemented quickly. Therefore, it is right that we guard against creating additional or redundant requirements which could complicate the regime or unduly delay implementation.
I turn to the amendment in my name. As noble Lords know, Ofcom will develop codes of practice setting out recommended measures for companies to fulfil their duties under the Bill. When developing those codes, Ofcom must consult various persons and organisations who have specific knowledge or expertise related to online harms. This process will ensure that the voices of users, experts and others are reflected in the codes, and, in turn, that the codes contain appropriate and effective measures.
One of the most important goals of the Bill, as noble Lords have heard me say many times, is the protection of children. It is also critical that the codes reflect the views of victims of online abuse, as well as the expertise of those who have experience in managing them. Therefore, the government amendment seeks to name the Commissioner for Victims and Witnesses, the domestic abuse commissioner and the Children’s Commissioner as statutory consultees under Clause 36(6). Ofcom will be required to consult those commissioners when preparing or amending a code of practice.
Listing these commissioners as statutory consultees will guarantee that the voices of victims and those who are disproportionately affected by online abuse are represented when developing codes of practice. This includes, in particular, women and girls—following on from our debate on the previous group—as well as children and vulnerable adults. This will ensure that Ofcom’s codes propose specific and targeted measures, such as on illegal content and content that is harmful to children, that platforms can take to address abuse effectively. I therefore hope that noble Lords will accept it.
I will say a little about some of the other amendments in this group before noble Lords speak to them. I look forward to hearing how they introduce them.
I appreciate the intent of Amendment 220E, tabled by the noble Lord, Lord Clement-Jones, and my noble friend Lady Morgan of Cotes, to address the seriousness of the issue of child sexual exploitation and abuse online. This amendment would allow Ofcom to designate an expert body to tackle such content. Where appropriate and effective, Section 1(7) of the Communications Act 2003 and Part II of the Deregulation and Contracting Out Act 1994 provide a route for Ofcom to enter into co-regulatory arrangements under the online safety framework.
There are a number of organisations that could play a role in the future regulatory framework, given their significant experience and expertise on the complex and important issue of tackling online child sexual exploitation and abuse. This includes the Internet Watch Foundation, which plays a pivotal role in the detection and removal of child sexual abuse material and provides vital tools to support its members to detect this abhorrent content.
A key difference from the proposed amendment is that the existing route, following consultation with Ofcom, requires an order to be made by a Minister, under the Deregulation and Contracting Out Act 1994, before Ofcom can authorise a co-regulator to carry out regulatory functions. Allowing Ofcom to do this, without the need for secondary legislation, would allow Ofcom to bypass existing parliamentary scrutiny when contracting out its regulatory functions under the Bill. By contrast, the existing route requires a draft order to be laid before, and approved by, each House of Parliament.
The noble Lord, Lord Knight of Weymouth, tabled Amendment 226, which proposes a child user advocacy body. The Government are committed to the interests of child users being represented and protected, but we believe that this is already achieved through the Bill’s existing provisions. There is a wealth of experienced and committed representative groups who are engaged with the regulatory framework. As the regulator, Ofcom will also continue to consult widely with a range of interested parties to ensure that it understands the experience of, and risks affecting, children online. Further placing children’s experiences at the centre of the framework, the Government’s Amendment 98A would name the Children’s Commissioner as a statutory consultee for the codes of practice. The child user advocacy body proposed in the noble Lord’s Amendment 226 may duplicate the Children’s Commissioner’s existing functions, which would create uncertainty, undermining the effectiveness of the Children’s Commissioner’s Office. The Government are confident that the Children’s Commissioner will effectively use her statutory duties and powers to understand children’s experiences of the digital realm.
For the reasons that I have set out, I am confident that children’s voices will be placed at the heart of the regime, with their interests defended and advocated for by the regulator, the Children’s Commissioner, and through ongoing engagement with civil society groups.
Similarly, Amendment 256, tabled by the noble Baroness, Lady Bennett of Manor Castle, seeks to require that any Ofcom advisory committees established by direction from the Secretary of State under Clause 155 include at least two young people. Ofcom has considerable experience in setting up committees of this kind. While there is nothing that would preclude committee membership from including at least two young people, predetermining the composition of any committee would not give Ofcom the necessary space and independence to run a transparent process. We feel that candidates should be appointed based on relevant understanding and technical knowledge of the issue in question. Where a committee is examining issues with specific relevance to the interests of children, we would expect its membership to reflect that appropriately.
I turn to the statement of strategic priorities. As I hope noble Lords will agree, future changes in technology will likely have an impact on the experience people have online, including the nature of online harms. As provided for by Clause 153, the statement of strategic priorities will allow the Secretary of State to set out a statement of the Government’s strategic priorities in relation to online safety. This ensures that the Government can respond to changes in the digital and regulatory landscape at a strategic level. A similar power exists for telecommunications, the management of the radio spectrum, and postal services.
Amendments 251 to 253 seek to place additional requirements on the preparation of a statement before it can be designated. I reassure noble Lords that the existing consultation and parliamentary approval requirements allow for an extensive process before a statement can be designated. These amendments would introduce unnecessary steps and would move beyond the existing precedent in the Communications Act when making such a statement for telecommunications, the management of the radio spectrum, and postal services.
Finally, Amendment 284, tabled by the noble Lord, Lord Stevenson of Balmacara, proposes changes to Clause 171 on Ofcom’s guidance on illegal content judgments. Ofcom is already required to consult persons it considers appropriate before producing or revising the guidance, which could include the groups named in the noble Lord’s amendment. This amendment would oblige Ofcom to run formal public consultations on the illegal content guidance at two different stages: first, at a formative stage in the drafting process, and then before publishing a final version. These consultations would have to be repeated before subsequently amending or updating the guidance in any way. This would impose duplicative, time-consuming requirements on the regulator to consult, which are excessive when looking at other comparable guidance. The proposed consultations under this amendment would ultimately delay the publication of this instrumental guidance.
I will listen to what noble Lords have to say when they speak to their amendments, but these are the reasons why, upon first reading, we are unpersuaded by them.
My Lords, I thank the Minister for opening the group. This is a slightly novel procedure: he has rebutted our arguments before we have even had a chance to put them—what is new? I hope he has another speech lined up for the end which accepts some of the arguments we put, to demonstrate that he has listened to all the arguments made in the debate.
I will speak mainly to Amendments 220E and 226, ahead of the noble Baroness, Lady Kidron; I understand that the noble Baroness, Lady Merron, will be speaking at the end of the group to Amendment 226. I am very grateful to the noble Baroness, Lady Morgan, for signing Amendment 220E; I know she feels very strongly about this issue as well.
As the Minister said, this amendment is designed to confirm the IWF’s role as the recognised body for dealing with notice and take-down procedures for child sexual abuse imagery in the UK and to ensure that its long experience and expertise continues to be put to best use. In our view, any delay in establishing the roles and responsibilities of expert organisations such as the IWF in working with Ofcom under the new regulatory regime risks leaving a vacuum in which the risks to children from this hateful form of abuse will only increase. I heard what the Minister said about the parliamentary procedure, but that is a much slower procedure than a designation by Ofcom, so I think that is going to be one of the bones of contention between us.
The Internet Watch Foundation is a co-regulatory body with over 25 years of experience working with the internet industry, law enforcement and government to prevent the uploading of, and to disable public access to, known child sexual abuse, and to secure the removal of indecent images and videos of children from the internet. The organisation has had some considerable success over the last 25 years, despite the problem appearing to be getting worse globally.
In 2022, it succeeded in removing a record 255,000 web pages containing child sexual abuse. It has also amassed a database of more than 1.6 million unique hashes of child sexual abuse material, which has been provided to the internet industry to keep its platforms free from such material. In 2020, the Independent Inquiry into Child Sexual Abuse concluded that, in the UK, the IWF
“sits at the heart of the national response to combating the proliferation of indecent images of children. It is an organisation that deserves to be acknowledged publicly as a vital part of how, and why, comparatively little child sexual abuse material is hosted in the UK”.
I am grateful to noble Lords who have spoken to their amendments. Regarding the lead amendment in the group, I take on board what was said about its inevitable pre-emption—something that I know all too well from when the boot is on the other foot in other groups. However, I have listened to the points that were made and will of course respond.
I join the tributes rightly paid by noble Lords to the Internet Watch Foundation. The Government value its work extremely highly and would support the use of its expertise and experience in helping to deliver the aims of the Bill. My noble friend Lady Morgan of Cotes is right to say that it is on the front line of this work and to remind us that it encounters some of the most horrific and abhorrent content in the darkest recesses of the internet—something that I know well from my time as an adviser at the Home Office, as well as in this capacity now. Both the Secretary of State for Science, Innovation and Technology and the Minister for Safeguarding at the Home Office recently provided a foreword to the foundation’s latest annual report.
Clearly, Ofcom will need a wide variety of relationships with a range of organisations. Ofcom has been in regular contact with the Internet Watch Foundation, recognising its significant role in supporting the objectives of online safety regulation, and is discussing a range of options to make the best use of its expertise. The noble Lord, Lord Clement-Jones, asked what consultation and discussion has taken place. We support the continuation of that engagement and are in discussions with the Internet Watch Foundation ourselves to understand how it envisages its role in supporting the regulatory environment. No decisions have been made on the co-regulatory role that other organisations may play. The Government will work with Ofcom to understand where such a role may be effective and beneficial in delivering the regulatory framework. Careful assessment of the governance, independence and funding of any organisations would be needed if co-designation were to be considered, but officials from the Department for Science, Innovation and Technology and the Home Office are in discussion with the IWF in relation to a memorandum of understanding to support ongoing collaboration.
On the designation of regulatory functions, we are satisfied that the powers under the Communications Act and the Deregulation and Contracting Out Act are sufficient, should other bodies be required to deliver specific aspects of the regime, so we do not see a need to amend the Bill in the way the amendments in this group suggest. Those Acts require an order from the Minister in order to designate any functions. The Minister has to consult Ofcom before making the order, and that is the mechanism that was used to appoint the Advertising Standards Authority to regulate broadcast advertising. It remains appropriate for Parliament to scrutinise the delivery of these important regulatory functions; accordingly, such an order cannot be made unless a draft of the order has been laid before, and approved by a resolution of, each House of Parliament.
The noble Baroness, Lady Merron, dwelt on the decision not to include a child user advocacy body. As I said in my earlier remarks and in relation to other groups, the Bill ensures that children’s voices will be heard and that what they say will be acted on. Ofcom will have statutory duties requiring it to understand the opinions and experiences of users, including children, by consulting widely when developing its codes. Ofcom will also have the flexibility to establish other mechanisms for conducting research about users’ experience. Additionally, the super-complaints process, which we began discussing this afternoon, will make sure that entities, including those that represent the interests of children, will have their voices heard and will help Ofcom recognise and eliminate systemic failings.
We are also naming the Children’s Commissioner as a statutory consultee for Ofcom in developing its codes of practice. A further new child user advocacy body would encroach on the wider statutory functions of the Children’s Commissioner. Both bodies would have similar responsibilities and powers to represent the interests of child users of regulated services, to protect and promote the interests of child users of regulated services, and to be a statutory consultee for the drafting and amendment of Ofcom’s codes of practice.
The noble Baroness, Lady Kidron, when discussing the input of the Children’s Commissioner into the regulatory framework, suggested that it was a here and now issue. She is right: the Children’s Commissioner will represent children’s views to Ofcom in preparing the codes of practice to ensure that they are fully informing the regime, but the commissioner will also have a continuing role, as they will be the statutory consultee on any later amendments to the codes of practice relating to children. That will ensure that they can engage in the ongoing development of the regime and can continue to feed in insights on emerging risks identified through the commissioner’s statutory duty to understand children’s experiences.
The Bill further ensures that new harms and risks to children are proactively identified by requiring that Ofcom make arrangements to undertake research about users’ experiences on regulated services. This will build on the significant amount of research that Ofcom already does, better to understand children’s experience online, particularly their experiences of online harms.
The super-complaints process will enable an eligible entity to make a complaint to Ofcom regarding a provider or providers that cause significant harm or significant adverse impact on users, including children. This will help Ofcom to recognise and eliminate systemic failings, including those relating to children, and will ensure that children’s views and voices continue to inform the regime as it is developed.
The Bill will also require that Ofcom undertake consumer consultation in relation to regulated services. This will, in effect, expand the scope of the Communications Consumer Panel to online safety matters, and will ensure that the needs of users, including children, are at the heart of Ofcom’s regulatory approach.
I draw noble Lords’ attention to the provisions of Clause 141(2), which states that Ofcom must make arrangements to ascertain
“the experiences of United Kingdom users of regulated services”.
That, of course, includes children. I hope, therefore, that noble Lords will be satisfied that the voices of children are indeed being listened to throughout the operation of the Bill. However, we have high regard for the work of the Internet Watch Foundation. I hope that noble Lords will be willing not to press their amendments—after the noble Lord, Lord Clement-Jones, asks his question.
My Lords, I am in the slightly strange position of not having moved the amendment, but I want to quickly respond. I was slightly encouraged by what the Minister said about Ofcom having been in regular contact with the IWF. I am not sure that that is mutual; maybe Ofcom thinks it is in good contact with the IWF, but I am not sure the IWF thinks it is in good contact with Ofcom. However, I am encouraged that the Minister at least thinks that that has been the case and that he is encouraging consultation and the continuation of engagement.
If I might follow up that comment, I agree entirely with what the noble Baroness has just said. It is very tricky for an independent charity to have the sort of relationship addressed in some of the language in this debate. Before the Minister completes his comments and sits down again, I ask him: if Ofcom were to negotiate a contracted set of duties with the IWF—indeed, with many other charities or others who are interested in assisting with this important work—could that be done directly by Ofcom, with powers that it already has? I think I am right to say that it would not require parliamentary approval. It is only if we are talking about co-regulation, which again raises other issues, that we would go through a process that requires what sounded like the affirmative procedure—the one that was used, for example, with the Advertising Standards Authority. Is that right?
Yes, I think it is. I am happy to confirm that in writing. I am grateful to my noble friend Lady Stowell, who of course is a former chairman of the Charity Commission, for making the point about the charitable status of the foundation. I should clarify that officials from the Department for Science, Innovation and Technology and the Home Office are in touch with the IWF about its role.
Speedily moving on, Ofcom is in discussion with the foundation about a memorandum of understanding. I hope that reassures the noble Lord, Lord Clement-Jones, that they are in reciprocal contact. Obviously, I cannot pre-empt where their discussions are taking them in relation to that MoU, but it is between Ofcom and the foundation. Careful consideration of governance, funding and issues of charity, as my noble friend raised, would have to be thought about if co-designation were being considered.
(1 year, 6 months ago)
Lords Chamber
My Lords, it is a pleasure to follow the noble Lord, Lord Bethell, who is clearly passionate about this aspect. As the noble Baroness, Lady Harding, said, this is one of the most important groups of amendments that we have to debate on the Bill, even though we are on day eight of Committee. As she said, it is about the right assignment of responsibilities, so it is fundamental to the way that the Bill will operate.
My noble friend Lord Allan brilliantly summed up many of the arguments, and he has graphically described the problem of ministerial overreach, as did the noble Baroness, Lady Harding. We on these Benches strongly support the amendments put forward by the noble Lord, Lord Stevenson, and those put forward by the noble Baroness, Lady Stowell. Obviously, there is some difference of emphasis. They each follow the trail of the different committees of which their proposers were members, which is entirely understandable. I recall that the noble Lord, Lord Gilbert, was the hinge between the two committees—and brilliantly he did that. I very much hope that, when we come back at the next stage, if the Minister has not moved very far, we will find a way to combine those two strands. I think they are extremely close—many noble Lords have set out where we are on accountability and oversight.
Strangely, we are not trying to get out of the frying pan of the Secretary of State being overbearing and move to where we have no parliamentary oversight. Both the noble Baroness, Lady Stowell, and the noble Lord, Lord Stevenson, are clearly in favour of greater oversight of Ofcom. The question is whether it is oversight of the codes and regulation or of Ofcom itself. I think we can find a way to combine those two strands. In that respect, I entirely agree with the noble Baroness, Lady Fox: it is all about making sure that we have the right kind of oversight.
I add my thanks to Carnegie UK. The noble Lord, Lord Stevenson, and the noble Baroness, Lady Stowell, set out the arguments, and we have the benefit of the noble Baroness’s letter to the Secretary of State of 30 January, which she mentioned in her speech. They have set out very clearly where speakers in this debate unanimously want to go.
The Government have suggested some compromise on Clause 39. As the noble Lord, Lord Stevenson, said, we have not seen any wording for that, but I think it is highly unlikely that that, by itself, will satisfy the House when we come to Report.
There are many amendments here which deal with the Secretary of State’s powers, but I believe that the key ones are the product of both committees, which is about the Joint Committee. If noble Lords read the Government’s response to our Joint Committee on the draft Bill, they will see that the arguments given by the Government are extremely weak. I think it was the noble Baroness, Lady Stowell, who used the phrase “democratic deficit”. That is exactly what we are not seeking: we are trying to open this out and make sure we have better oversight and accountability. That is the goal of the amendments today. We have heard from the noble Viscount, Lord Colville, about the power of lobbying by companies. Equally, we have heard about how the Secretary of State can be overbearing. That is the risk we are trying to avoid. I very much hope that the Minister sees his way to taking on board at least some of whichever set of amendments he prefers.
My Lords, the amendments concern the independence of Ofcom and the role of parliamentary scrutiny. They are therefore indeed an important group, as those things will be vital to the success of the regime that the Bill sets up. Introducing a new, ground-breaking regime means balancing the need for regulatory independence with a transparent system of checks and balances. The Bill therefore gives powers to the Secretary of State comprising a power to direct Ofcom to modify a code of practice, a power to issue a statement of strategic priorities and a power to issue non-binding guidance to the regulator.
These powers are important but not novel; they have precedent in the Communications Act 2003, which allows the Secretary of State to direct Ofcom in respect of its network and spectrum functions, and the Housing and Regeneration Act 2008, which allows the Secretary of State to make directions to the Regulator of Social Housing to amend its standards. At the same time, I agree that it is important that we have proportionate safeguards in place for the use of these powers, and I am very happy to continue to have discussions with noble Lords to make sure that we do.
Amendment 110, from the noble Lord, Lord Stevenson, seeks to introduce a lengthier process regarding parliamentary approval of codes of practice, requiring a number of additional steps before they are laid in Parliament. It proposes that each code may not come into force unless accompanied by an impact assessment covering a range of factors. Let me reassure noble Lords that Ofcom is already required to consider these factors; it is bound by the public sector equality duty under the Equality Act 2010 and the Human Rights Act 1998 and must ensure that the regime and the codes of practice are compliant with rights under the European Convention on Human Rights. It must also consult experts on matters of equality and human rights when producing its codes.
Amendment 110 also proposes that any designated Select Committee in either House has to report on each code and impact assessment before they can be made. Under the existing process, all codes must already undergo scrutiny by both Houses before coming into effect. The amendment would also introduce a new role for the devolved Administrations. Let me reassure noble Lords that the Government are working closely with them already and will continue to do so over the coming months. As set out in Schedule 5 to the Scotland Act 1998, however, telecommunications and thereby internet law and regulation is a reserved policy area, so input from the devolved Administrations may be more appropriately sought through other means.
Amendments 111, 113, 114, 115, and 117 to 120 seek to restrict or remove the ability of the Secretary of State to issue directions to Ofcom to modify draft codes of practice. Ofcom has great expertise as a regulator, as noble Lords noted in this debate, but there may be situations where a topic outside its remit needs to be reflected in a code of practice. In those situations, it is right for the Government to be able to direct Ofcom to modify a draft code. This could, for example, be to ensure that a code reflects advice from the security services, to which Ofcom does not have access. Indeed, it is particularly important that the Secretary of State be able to direct Ofcom on matters of national security and public safety, where the Government will have access to information which Ofcom will not.
I have, however, heard the concerns raised by many in your Lordships’ House, both today and on previous occasions, that these powers could allow for too much executive control. I can assure your Lordships that His Majesty’s Government are committed to protecting the regulatory independence of Ofcom, which is vital to the success of the framework. With this in mind, we have built a number of safeguards into the use of the powers, to ensure that they do not impinge on regulatory independence and are used only in limited circumstances and for the appropriate reasons.
I have heard the strong feelings expressed that this power must not unduly restrict regulatory independence, and indeed share that feeling. In July, as noble Lords noted, the Government announced our intention to make substantive changes to the power; these changes will make it clear that the power is for use only in exceptional circumstances and will replace the “public policy” wording in Clause 39 with a defined list of reasons for which a direction can be made. I am happy to reiterate that commitment today, and to say that we will be making these changes on Report when, as the noble Lord, Lord Clement-Jones, rightly said, noble Lords will be able to see the wording and interrogate it properly.
Additionally, in light of the debate we have just had today—
Can my noble friend the Minister clarify what he has just said? When he appeared in front of the Communications and Digital Committee, I think he might have been road-testing some of that language. In the specific words used, he would still have allowed the Secretary of State to direct Ofcom for economic reasons. Is that likely to remain the case? If it is, I feel it will not actually meet what I have heard is the will of the Committee.
When we publish the wording, we will rightly have an opportunity to discuss it before the debate on Report. I will be happy to discuss it with noble Lords then. On the broader points about economic policy, that is a competency of His Majesty’s Government, not an area of focus for Ofcom. If the Government had access to additional information that led them to believe that a code of practice as drafted could have a significant, disproportionate and adverse effect on the livelihoods of the British people or on the broader economy, and if it met the test for exceptional circumstances, taking action via a direction from the Secretary of State could be warranted. I will happily discuss that when my noble friend and others see the wording of the changes we will bring on Report. I am sure we will scrutinise that properly, as we should.
I was about to say that, in addition to the commitment we have already made, in the light of the debate today we will also consider whether transparency about the use of this power could be increased further, while retaining the important need for government oversight of issues that are genuinely beyond Ofcom’s remit. I am conscious that, as my noble friend Lady Stowell politely said, I did not convince her or your Lordships’ committee when I appeared before it with my honourable friend Paul Scully. I am happy to continue our discussions and I hope that we may reach some understanding on this important area.
I am sorry to interrupt, but may I clarify what my noble friend just said? I think he said that, although he is open to increasing the transparency of the procedure, he does not concede a change—from direction to a letter about guidance which Ofcom should take account of. Is he willing to consider that as well?
I am happy to continue to discuss it, and I will say a bit more about the other amendments in this group, but I am not able to say much more at this point. I will happily follow this up in discussion with my noble friend, as I know it is an issue of interest to her and other members of your Lordships’ committee.
The noble Lord, Lord Stevenson, asked about our international obligations. As noble Lords noted, the Government have recognised the importance of regulatory independence in our work with international partners, such as the Council of Europe’s declaration on the independence of regulators. That is why we are bringing forward the amendments previously announced in another place. Ensuring that powers of direction can be issued only in exceptional circumstances and for a set of reasons defined in the Bill will ensure that the operational independence of Ofcom is not put at risk. That said, we must strike a balance between parliamentary oversight and being able to act quickly where necessary.
Regarding the amendment tabled by my noble friend Lady Stowell, which calls for all codes that have been altered by a direction to go through the affirmative procedure: as the Bill is drafted, the negative procedure is used only if a direction is made to a code of practice relating to terrorism or child sexual exploitation or abuse, for reasons of national security or public safety. It is important that the parliamentary process be proportionate, particularly in cases involving national security or public safety, where a code might need to be amended quickly to protect people from harm. We therefore think that, in these cases, the negative procedure is more appropriate.
On timing, the Government are committed to ensuring that the framework is implemented quickly, and this includes ensuring that the codes of practice are in force. The threshold of exceptional circumstances for the power to direct can lead to a delay only in situations where there would otherwise be significant consequences for national security or public safety, or for the other reasons outlined today.
My noble friend Lord Moylan was not able to be here for the beginning of the debate on this group, but he is here now. Let me say a little about his Amendment 254. Under Clause 153, the Secretary of State can set out a statement of the Government’s strategic priorities in relation to matters of online safety. This power is necessary, as future technological changes are likely to shape online harms, and the Government must be able to state their strategic priorities in relation to them. My noble friend’s amendment would go beyond the existing precedent for the statement of strategic priorities in relation to telecommunications, management of the radio spectrum, and postal services outlined in the Communications Act. The Secretary of State must consult Ofcom and other appropriate persons when preparing this statement. This provides the opportunity for widespread scrutiny of a draft statement before it can be designated through a negative parliamentary procedure. We consider that the negative procedure is appropriate, in line with comparable existing arrangements.
Amendment 257 from the noble Lord, Lord Stevenson, seeks to remove the Secretary of State’s power to issue guidance to Ofcom about the exercise of its online safety functions. Issuing guidance of this kind, with appropriate safeguards, including consultation and limitations on its frequency, is an important part of future-proofing the regime. New information—for example, resulting from parliamentary scrutiny or technological developments—may require the Government to clarify the intent of the legislation.
Amendments 258 to 260 would require the guidance to be subject to the affirmative procedure in Parliament. Currently, Ofcom must be consulted, and any guidance must be laid before Parliament. The Bill does not subject the guidance to a parliamentary procedure because the guidance does not create any statutory requirements, and Ofcom is required only to have had regard to it. We think that remains the right approach.
The noble Lord, Lord Stevenson, has made clear his intention to question Clause 156, which grants the Secretary of State the power to direct Ofcom’s media literacy activity only in special circumstances. This ensures that the regulatory framework is equipped to respond to significant future threats—for example, to the health or safety of the public, or to national security. I have already set out, in relation to other amendments, why we think it is right that the Secretary of State can direct Ofcom in these circumstances.
The delegated powers in the Bill are crucial to ensuring that the regulatory regime keeps pace with changes in this area. Amendment 290 from the noble Lord, Lord Stevenson, would go beyond the existing legislative process for these powers, by potentially providing for additional committees to be, in effect, inserted into the secondary legislative process. Established committees themselves are able to decide whether to scrutinise parts of a regime in more detail, so I do not think they need a Parkinson rule to do that.
Noble Lords have expressed a common desire to see this legislation implemented as swiftly as possible, so I hope they share our wariness of any amendments which could slow that process down. The process as envisaged in this amendment is an open-ended one, which could delay implementation. Of course, however, it is important that Parliament is able to scrutinise the work of the regulator. Like most other regulators, Ofcom is accountable to Parliament on how it exercises its functions. The Secretary of State is required to present its annual report and accounts before both Houses. Ministers from Scotland, Wales and Northern Ireland must also lay a copy of the report before their respective Parliament or Assembly. Moreover, the officers of Ofcom can be required to appear before Select Committees to answer questions about its operations on an annual basis. Parliament will also have a role in approving a number of aspects of the regulatory framework through its scrutiny of both the primary and secondary legislation. This will include the priority categories for harms and Ofcom’s codes of practice.
More broadly, we want to ensure that this ground-breaking legislation has the impact we intend. Ongoing parliamentary scrutiny of it will be crucial to help to ensure that. There is so much expertise in both Houses, and it has already helped to improve this legislation, through the Joint Committee on the draft Bill, the DCMS Select Committee in another place and, of course, your Lordships’ Communications and Digital Committee.
As my noble friend Lady Stowell said, we must guard against fragmentation and duplication, which we are very mindful of. Although we do not intend to legislate for a new committee—as I set out on previous occasions, including at Second Reading and before the Communications and Digital Committee—we remain happy to discuss possible mechanisms for oversight to ensure that we make best use of the expertise in both Houses of Parliament so that the Bill delivers what we want. With that, I hope that Members of the Committee will be happy to continue the discussions in this area and not press their amendments.
I am grateful to the noble Lord for his comprehensive response and for the welcome change in tone and the openness to further debate and discussions. I thank all those who spoke in the debate. The noble Baroness, Lady Harding, was right: we are getting into a routine where we know roughly where our places are and, if we have contributions to make, we make them in the right order and make them comprehensive. We did our bit quite well, but I am afraid that the Minister’s response made me a bit confused. As I said, I welcome the change of tone, the sense of engagement with some of the issues and the ability to meet to discuss ways forward in some of those areas. But he then systematically and rather depressingly shut off just about everything that I thought we were going to discuss. I may be overstating that, so I will read Hansard carefully to make sure that there are still chinks of light in his hitherto impenetrable armour. I really must stop using these metaphors—I thought that the noble Baroness, Lady Harding, had managed to get me off the hook with her question about whether we were an island of concrete rock, and about whether the boat was going to end up in the stormy sea that we were creating. I decided that I could not follow that, so I will not.
We ought to take forward and address three things, which I will briefly go through in the response. One that we did not nail down was the good point made by the noble Baroness, Lady Kidron, that we had focused on regulatory structures in the form of set bodies relating—or not relating—to parliamentary procedures and to Ministers and their operations. She pointed out that, actually, the whole system has a possible drag effect that we also need to think about. I note that good point because we probably need a bit of time to think about how that would work in the structures that come forward.
The noble Lord, Lord Allan, said that we are trying to look at the changing of the accountability model. I disagree with the word “changing” because we are not trying to change anything; we have a model that works, but the new factor that we are trying to accommodate is the intensity of interaction and, as we said, the amplification that comes from the internet. I worry that this was not being picked up enough in the Minister’s response, but we will pick it up later and see if we can get through it.
The three points I wanted to make sure of were as follows. Following the line taken by the noble Baroness, Lady Stowell, one point is on trying to find a proper balance between the independence of the regulator; the Secretary of State’s right, as an elected leader of this aspect of the Government, to make recommendations and proposals to that regulator on how the system can be better; and Parliament’s ability to find a place in that structure, which is still eluding us a little, so we will need to spend more time on it. There is enough there to be reassured that we will find a way of balancing the independence of the regulator and the role of the Secretary of State. It does not need as many mentions in the legislation as it currently has. There is clearly a need for the Secretary of State to be able to issue direction in cases of national security et cetera—but it is the “et cetera” that I worry about: what are these instances? Until they are nailed down and in the Bill, there has to be a question about that.
As the noble Baroness, Lady Kidron, set out at the beginning of this debate, the amendments in this group have involved extensive discussions among Members in both Houses of Parliament, who sit on all sides of both Houses. I am very grateful for the way noble Lords and Members in another place have done that. They have had those preliminary discussions so that our discussions in the debate today and in preparation for it could be focused and detailed. I pay particular tribute to the noble Baroness, Lady Kidron, and my noble friends Lord Bethell and Lady Harding, who have been involved in extensive discussions with others and then with us in government. These have been very helpful indeed; they continue, and I am happy to commit to their continuing.
Age-assurance technologies will play an important role in supporting the child safety duties in this Bill. This is why reference is made to them on the face of the Bill—to make it clear that the Government expect these measures to be used for complying with the duties to protect children from harmful content and activity online. Guidance under Clause 48 will already cover pornographic content. While this is not currently set out in the legislation, the Government intend, as noble Lords know, to designate pornographic content as a category of primary priority content which is harmful to children. As I set out to your Lordships’ House during our debate on harms to children, we will amend the Bill on Report to list the categories of primary and primary priority content on the face of the Bill.
I am very grateful to noble Lords for the engagement we have had on some of the points raised in Amendments 142 and 306 in recent weeks. As we have been saying in those discussions, the Government are confident that the Bill already largely achieves the outcomes sought here, either through existing provisions in it or through duties in other legislation, including data protection legislation, the Human Rights Act 1998 and the Equality Act 2010. That is why we think that re-stating duties on providers which are already set out in the Bill, or repeating duties set out in other legislation, risks causing uncertainty, and why we need to be careful about imposing specific timelines on Ofcom by which it must produce age-assurance guidance. It is essential that we protect Ofcom’s ability robustly to fulfil its consultation duties for the codes of practice. If Ofcom is given insufficient time to fulfil these duties, the risk of legal challenge being successful is increased.
I welcome Ofcom’s recent letter to your Lordships, outlining its implementation road map, which I hope provides some reassurance directly from the regulator on this point. Ofcom will prioritise protecting children from pornography and other harmful content. It intends to publish, this autumn, draft guidance for Part 5 pornography duties and draft codes of practice for Part 3 illegal content duties, including for child sexual exploitation and abuse content. Draft codes of practice for children’s safety duties will follow next summer. These elements of the regime are being prioritised ahead of others, such as the category 1 duties, to reflect the critical importance of protecting children.
Although we believe that the Bill already largely achieves the outcomes sought, we acknowledge the importance of ensuring that there are clear principles for Ofcom to apply when recommending or requiring the use of age-assurance technologies. I am happy to reassure noble Lords that the Government will continue to consider this further and are happy to continue our engagement on this issue, although any amendment must be made in a way that sits alongside existing legislation and within the framework of the Bill.
I turn to Amendments 161 and 183. First, I will take the opportunity to address some confusion about the requirements in Parts 3 and 5 of the Bill. The Bill ensures that companies must prevent children accessing online pornography, regardless of whether it is regulated in Part 3 or Part 5. The Government are absolutely clear on this point; anything less would be unacceptable. The most effective approach to achieving this is to focus on the outcome of preventing children accessing harmful content, which is what the Bill does. If providers do not prevent children accessing harmful content, Ofcom will be able to bring enforcement action against them.
I will address the point raised by my noble friend Lord Bethell about introducing a standard of “beyond reasonable doubt” for age verification for pornography. As my noble friend knows, we think this a legally unsuitable test which would require Ofcom to determine the state of mind of the provider, which would be extremely hard to prove and would therefore risk allowing providers to evade their duties. A clear, objective duty is the best way to ensure that Ofcom can enforce compliance effectively. The Bill sets clear outcomes which Ofcom will be able to take action on if these are not achieved by providers. A provider will be compliant only if it puts in place systems and processes which meet the objective requirements of the child safety duties.
The provisions in the Bill on proportionality are important to ensure that the requirements in the child safety duties are tailored to the size and capacity of providers. Smaller providers or providers with less capacity are still required to meet the child safety duties where their services pose a risk to children. They will need to put in place sufficiently stringent systems and processes that reflect the level of risk on their services and will need to make sure these systems and processes achieve the required outcomes of the child safety duties.
The Government expect companies to use age-verification technologies to prevent children accessing services which pose the highest risk of harm to children, such as online pornography. However, companies may use another approach if it is proportionate to the findings of the child safety risk assessment and a provider’s size and capacity. This is an important element to ensure that the regulatory framework remains risk-based and proportionate.
Age verification may not always be the most appropriate or effective approach for user-to-user companies to comply with their duties. For example, if a user-to-user service such as a social media platform does not allow—
I am sorry to interrupt. The Minister said that he would bear in mind proportionality in relation to size and capacity. Is that not exactly the point that the noble Baroness, Lady Harding, was trying to make? In relation to children, why will that be proportionate? A single child being damaged in this way is too much.
The issue was proportionality in relation to a provider’s size and capacity; it is a matter of making sure that the duties are effective, enforceable and proportionate to the size of the service in question. Age verification may also not be the most effective approach for companies to follow to comply with their duties. If a user-to-user service, such as a social media platform, says that it does not allow pornography under its terms of service, measures such as content moderation and user reporting might be more appropriate and effective for protecting children than age verification in those settings. That would allow content to be better detected and taken down, while—
I understand that, but it is an important point to try to get on the record. It is an outcome-based solution that we are looking for, is it not? We are looking for zero activity where there are risks to children. Clearly, if the risk assessment is that there is no risk that children can be on that site, age verification may not be required—I am extending it to make a point—but, if there is a risk, we need to know that the outcome of that process will be zero. That is my point, and I think we should reflect on that.
I am very happy to, and the noble Lord is right that we must be focused on the outcomes here. I am very sympathetic to the desire to make sure that providers are held to the highest standards, to keep children protected from harmful content online.
I know the Minister said that outcomes are detailed in the Bill already; I wonder whether he could just write to us and describe where in the Bill those outcomes are outlined.
I shall happily do that, and will happily continue discussions with my noble friend and others on this point and on the appropriate alternative to the language we have discussed.
On the matter of Ofcom independently auditing age-assurance technologies, which my noble friend also raised, the regulator already has the power to require a company to undertake and pay for a report from a skilled person about a regulated service. This will assist Ofcom in identifying and assessing non-compliance, and will develop its understanding of the risk of failure to comply. We believe that this is therefore already provided for.
I reassure noble Lords that the existing definition of pornographic content in the Bill already captures the same content that Amendment 183ZA, in the name of the noble Baroness, Lady Ritchie of Downpatrick, intends to capture. The definition in the Bill shares the key element of the approach Ofcom is taking for pornography on UK-established video-sharing platforms. This means that the industry will be familiar with this definition and that Ofcom will have experience in regulating content which meets it.
The definition is also aligned with that used in existing legislation. I take on board the point she made about her trawl of the statute book for it, but the definition is aligned elsewhere in statute, such as in the Coroners and Justice Act 2009. This means that, in interpreting the existing definition in the Bill, the courts may be able to draw on precedent from the criminal context, giving greater certainty about its meaning. The definition of pornography in Part 5 is also consistent with the British Board of Film Classification’s guidelines for the definition of sex works, which is
“works whose primary purpose is sexual arousal or stimulation”
and the BBFC’s definition of R18. We therefore think it is not necessary to refer to BBFC standards in this legislation. Including the definition in the Bill also retains Parliament’s control of the definition, and therefore also which content is subject to the duties in Part 5. That is why we believe that the definition as outlined in the Bill is more straightforward for both service providers and Ofcom to apply.
I turn to Amendments 184 and 185. The Government share the concerns raised in today’s debate about the wider regulation of online pornography. It is important to be clear that extreme pornography, so-called revenge pornography and child sexual exploitation and abuse are already illegal and are listed as priority offences in the Bill. This means that under the illegal content duties, Part 3 providers, which will include some of the most popular commercial pornography services, must take proactive, preventive measures to limit people’s exposure to this criminal content and behaviour.
Does my noble friend the Minister recognise that those laws have been in place throughout the 30 years of the internet but have not successfully been used to protect the rights of those whose images are wrongly used, particularly children whose images have appeared on pornographic sites? Does he have any reflections on how that performance could be improved?
I would want to take advice and see some statistics, but I am happy to do that and to respond to my noble friend’s point. I was about to say that my noble friend Lady Jenkin of Kennington asked a number of questions, but she is not here for me to answer them.
I turn to Amendment 232 tabled by the noble Lord, Lord Allan of Hallam. Because of the rapid development of age-assurance technologies, it is right that they should be carefully assessed to ensure that they are used effectively to achieve the outcomes required. I am therefore sympathetic to the spirit of his amendment, but must say that Ofcom will undertake ongoing research into the effectiveness of age-assurance technologies for its various codes and guidance, which will be published. Moreover, when preparing or updating the codes of practice, including those that refer to age-assurance technologies, Ofcom is required by the Bill to consult a broad range of people and organisations. Parliament will also have the opportunity to scrutinise the codes before they come into effect, including any recommendations regarding age assurance. We do not think, therefore, that a requirement for Ofcom to produce a separate report into age-assurance technologies is a necessary extra burden to impose on the regulator.
In relation to this and all the amendments in this group, as I say, I am happy to carry on the discussions that we have been having with a number of noble Lords, recognising that they speak for a large number of people in your Lordships’ House and beyond. I reiterate my thanks, and the Government’s thanks, to them for the way in which they have been going about that. With that, I encourage them not to press their amendments.
(1 year, 6 months ago)
Lords Chamber
My Lords, I regret that my noble friend Lord Lipsey is unable to be here. I wish him and the noble Lord, Lord McNally, well. I also regret that my noble friend Lord Stevenson is not here to wind up this debate and introduce his Amendment 127. Our inability to future-proof these proceedings means that, rather than talking to the next group, I am talking to this one.
I want to make four principal points. First, the principle of press freedom, as discussed by the noble Lords, Lord Black and Lord Faulks, in particular, is an important one. We do not think that this is the right Bill to reopen those issues. We look forward to the media Bill as the opportunity to discuss these things more fully across the House.
Secondly, I have some concerns about the news publisher exemption. In essence, as the noble Lord, Lord Clement-Jones, set out, as long as you have a standards code, a complaints process, a UK address and a team of contributors, the exemption applies. That feels a bit loose to me, and it opens up the regime to some abuse. I hear what the noble Baronesses, Lady Gohir and Lady Grey-Thompson, said about how we already see pretty dodgy outfits allowing racist and abusive content to proliferate. I look forward to the Minister’s comments on whether the bar we have at the moment is too low and whether there is some reflection to be done on that.
The third point is on my noble friend Lord Stevenson’s Amendment 127, which essentially says that we should set a threshold around whether complaints are dealt with in a timely manner. In tabling that amendment, my noble friend wanted to probe. The noble Lord, Lord Faulks, is here, so this is a good chance to have him listen to me say that we think that complaints should be dealt with more swiftly and that the organisation that he chairs could do better at dealing with that.
My fourth comment is about comments, particularly after listening to the speech of the noble Baroness, Lady Grey-Thompson, about some of the hateful content hidden away in the comment sections that news publishers carry. I was very much struck by what she said in respect of some of the systems of virality that are now being adopted by those platforms. There, I think Amendment 227 is tempting. I heard what the noble Baroness, Lady Stowell, said, and I think I agree that this is better addressed by Parliament.
For me, that just reinforces the need for this Bill, more than any other that I have ever worked on in this place, to have post-legislative scrutiny by Parliament so that we, as a Parliament, can review whether the regime we are setting up is running appropriately. It is such a novel regime, in particular around regulating algorithms and artificial intelligence. It would be an opportunity to see whether, in this case, the systems of virality were amplifying harm beyond the editorial control that the news publishers are able to exercise over the comments.
On that basis, and given the hour, I am happy to listen with care to the wise words of the Minister.
My Lords, I join noble Lords who have sent their best wishes to the noble Lords, Lord Lipsey and Lord McNally.
His Majesty’s Government are committed to defending the invaluable role of a free media. We are clear that our online safety legislation must protect the vital role of the press in providing people with reliable and accurate information.
We have included strong protections for news publishers’ and journalistic content in the Bill, which extends to the exemption from the Bill’s safety duties for users’ comments and reviews on news publishers’ sites. This reflects a wider exemption for comments and reviews on provider content more generally. For example, reviews of products on retailers’ sites are also exempt from regulation. This is designed to avoid disproportionate regulatory burden on low-risk services.
Amendment 124 intends to modify that exemption, so that the largest news websites no longer benefit and are subject to the Bill’s regulatory regime. Below-the-line comments are crucial for enabling reader engagement with the news and encouraging public debate, as well as for the sustainability—and, as the noble Baroness, Lady Fox, put it, the accountability—of the news media. We do not consider it proportionate, necessary or compatible with our commitment to press freedom to subject these comment sections to oversight by Ofcom.
We recognise that there can sometimes be unpleasant or abusive below-the-line comments. We have carefully considered the risks of this exemption against the need to protect freedom of speech and media freedoms on matters of public interest. Although comment functions will not be subject to online regulation, I reassure the Members of the Committee who raised concerns about some of the comments which have attracted particular attention that sites hosting such comments can, in some circumstances, be held liable for any illegal content appearing on them, where they have actual knowledge of the content in question and fail to remove it expeditiously.
The strong protections for recognised news publishers in the Bill include exempting their content from the Bill’s safety duties, requiring category 1 platforms to notify recognised news publishers and to offer a right of appeal before removing or moderating any of their content. Clause 50 stipulates the clear criteria that publishers will have to meet to be considered a “recognised news publisher” and to benefit from those protections. When drafting these criteria, the Government have been careful to ensure that established news publishers are captured, while limiting the opportunity for bad actors to qualify.
Amendment 126 seeks to restrict the criteria for recognised news publishers in the Bill, so that only members of an approved regulator within the meaning of Section 42 of the Crime and Courts Act 2013 benefit from the protections offered by the Bill. This would create strong incentives for publishers to join specific press regulators. We do not consider this to be compatible with our commitment to a free press. We will repeal existing legislation that could have that effect, specifically Section 40 of the Crime and Courts Act 2013, through the media Bill, as noble Lords have noted, which has recently been published. Without wanting to make a rod for my own back when we come to that Bill, I agree with my noble friend Lord Black of Brentwood that it would be the opportunity to have this debate, if your Lordships so wished.
The current effect of this amendment would be to force all news publishers to join a single press regulator—namely Impress, the only UK regulator which has sought approval by the Press Recognition Panel—if they were to benefit from the exclusion for recognised news publishers. Requiring a publisher to join specific regulators is, in the view of His Majesty’s Government, not only incompatible with protecting press freedom in the UK but unnecessary given the range of detailed criteria which a publisher must meet to qualify for the additional protections, as set out in Clause 50 of the Bill.
As part of our commitment to media freedom, we are committed to independent self-regulation of the press. As I have indicated, Clause 50 stipulates the clear criteria which publishers will have to meet to be considered a “recognised news publisher” and to benefit from the protections in the Bill. One of those criteria is for entities to have policies and procedures for handling and resolving complaints. Amendment 127 from the noble Lord, Lord Stevenson, adds a requirement that these policies and procedures must cover handling and resolving complaints “in a timely manner”. Including such a requirement would place the responsibility on Ofcom to decide what constitutes “timely” and, in effect, put it in the position of a press regulator. That is not something that we would like. We believe that the criteria set out in Clause 50 are already strong, and we have taken significant care to ensure that established news publishers are captured, while limiting the opportunity for bad actors to benefit.
I turn now to Amendment 227. We recognise that, as legislation comes into force, it will be necessary to ensure that the protections we have put in place for journalistic and news publisher content are effective. We need to ensure that the regulatory framework does not hinder access to such content, particularly in the light of the fact that, in the past, news content has sometimes been removed or made less visible by social media moderators or algorithms for unclear reasons, often at the height of news cycles. That is why we have required Ofcom to produce a specific report, under Clause 144, assessing the impact of the Bill on the availability and treatment of news publisher and journalistic content on category 1 services.
Before the Minister closes his folder and sits down, perhaps I could say that I listened carefully and would just like him to reflect a little more for us on my question of whether the bar is set too low and there is too much wriggle room in the exemption around news publishers. A tighter definition might be something that would benefit and improve the Bill when we come back to it on Report.
Looking at the length of Clause 50—and I note that the noble Lord, Lord Allan of Hallam, made much the same point in his speech—I think the definitions set out in Clause 50 are extensive. Clause 50(1) sets out a number of recognised news publishers, obviously including
“the British Broadcasting Corporation, Sianel Pedwar Cymru”—
self-evidently, as well as
“the holder of a licence under the Broadcasting Act 1990 or 1996”
or
“any other entity which … meets all of the conditions in subsection (2), and … is not an excluded entity”
as set out in subsection (3). Subsection (2) sets out a number of specific criteria which I think capture the recognised news publishers we want to see.
Noble Lords will be aware of the further provisions we have brought forward to make sure that entities that are subject to a sanction are not able to qualify, such as—
I think it is actually quite important that there is—to use the language of the Bill—a risk assessment around the notion that people might game it. I thought the noble Baroness, Lady Gohir, made a very good point. People are very inventive and, if you have ever engaged with the people who run some of those big US misinformation sites—let us just call them that—you will know that they have very inventive, very clever people. They will be looking at this legislation and if they figure out that by opening a UK office and ticking all the boxes they will now get some sorts of privileges in terms of distributing their misinformation around the world, they will do it. They will try it, so I certainly think it is worth there being at least some kind of risk assessment against that happening.
In two years’ time we will be able to see whether the bad thing happened, but, whether or not that means the Minister having a conversation with Ofcom now, I just think that forewarned is forearmed. We know that it is a possibility, and it would be helpful for some work to be done now to make sure that a loophole none of us wants is not left open.
I am mindful of the examples the noble Lord gave in his speech. Looking at some of the provisions set out in subsection (2) about a body being
“subject to a standards code”
or having
“policies and procedures for handling and resolving complaints”,
I think on first response that those examples he gave would be covered. But I will certainly take on board the comments he made and those the noble Baroness, Lady Gohir, made as well and reflect on them. I hope—
On a final point of clarification, in contrast, I think the exemption may be too narrow, not too broad. With the emergence of blogs and different kinds of news organisations—I think the noble Lord, Lord Allan, described well the complexity of what we have—and some of the grimmer, grosser examples of people who might play the system, does the Minister acknowledge that that might be dealt with by the kind of exclusions that have been used for RT? When somebody is really an extremist representative of, I do not know, ISIS, pretending to be a media organisation, the sensible thing to do would be to exclude them, rather than to overtighten the exemption, so that new, burgeoning, widely read online publications can have press freedom protection.
I will certainly take on board the points the noble Baroness raises. Hearing representations in both directions on the point would, on first consideration, reassure me that we have it right, but I will certainly take on board the points which the noble Baroness, the noble Lord and others have raised in our debate on this. As the noble Lord, Lord Allan, suggests, I will take the opportunity to discuss it with Ofcom, as we will do on many of the issues which we are discussing in this Committee, to make sure that its views are taken on board before we return to these and other issues on Report.
(1 year, 6 months ago)
Lords Chamber
My Lords, this has been a grim but important debate to open the Committee’s proceedings today. As my noble friend Lady Harding of Winscombe and others have set out, some of the issues and materials about which we are talking are abhorrent indeed. I join other noble Lords in thanking my noble friend Lord Harlech for his vigilance and consideration for those who are watching our proceedings today, allowing us to talk about these issues in the way that we must in order to tackle them, while ensuring that we do so sensitively. I thank noble Lords for the way they have done that.
I pay tribute also to those who work in this dark corner of the internet to tackle these harms. I am pleased to reassure noble Lords that the Bill has been designed in a way that responds to emerging and new technologies that may pose a risk of harm. In our previous debates, we have touched on explicitly naming certain technologies and user groups or making aspects of the legislation more specific. However, one key reason why the Government have been resistant to such specificity is to ensure that the legislation remains flexible and future-proofed.
The Bill has been designed to be technology-neutral in order to capture new services that may arise in this rapidly evolving sector. It confers duties on any service that enables users to interact with each other, as well as search services, meaning that any new internet service that enables user interaction will be caught by it.
Amendment 125, tabled by the noble Baroness, Lady Kidron—whose watchful eye I certainly feel on me even as she takes a rare but well-earned break today—seeks to ensure that machine-generated content, virtual reality content and augmented reality content are regulated content under the Bill. I am happy to confirm to her and to my noble friend Lady Harding who moved the amendment on her behalf that the Bill is designed to regulate providers of user-to-user services, regardless of the specific technologies they use to deliver their service, including virtual reality and augmented reality content. This is because any service that allows its users to encounter content generated, uploaded or shared by other users is in scope unless exempt. “Content” is defined very broadly in Clause 207(1) as
“anything communicated by means of an internet service”.
This includes virtual or augmented reality. The Bill’s duties therefore cover all user-generated content present on the service, regardless of the form this content takes, including virtual reality and augmented reality content. To state it plainly: platforms that allow such content—for example, the metaverse—are firmly in scope of the Bill.
The Bill also ensures that machine-generated content on user-to-user services created by automated tools or bots will be regulated where appropriate. Specifically, Clause 49(4)(b) means that machine-generated content is regulated unless the bot or automated tool producing the content is controlled by the provider of the service. This approach ensures that the Bill covers scenarios such as malicious bots on a social media platform abusing users, or users sharing content produced by new tools such as ChatGPT, while excluding functions such as customer service chatbots, which are low risk. Content generated by an artificial intelligence bot and then placed by a user on a regulated service will be regulated by the Bill. Content generated by an AI bot which interacts with user-generated content, such as bots on Twitter, will be regulated by the Bill. A bot that is controlled by the service provider, such as a customer service chatbot, is out of scope; as I have said, that is low risk and regulation would therefore be disproportionate. Search services using AI-powered features will be in scope of the search duties.
The Government recognise the need to act both to unlock the opportunities and to address the potential risks of this technology. Our AI regulation White Paper sets out the principles for the responsible development of AI in the UK. These principles, such as safety and accountability, are at the heart of our approach to ensuring the responsible development and use of artificial intelligence. We are creating a horizon-scanning function and a central risk function which will enable the Government to monitor future risks.
The Bill does not distinguish between the format of content present on a service. Any service that allows its users to encounter content generated, uploaded or shared by other users is in scope unless exempt, regardless of the format of that content. This includes virtual and augmented reality material. Platforms that allow such content, such as the metaverse, are firmly in scope of the Bill and must take the required steps to protect their users from harm. I hope that gives the clarity that my noble friend and others were seeking and reassurance that the intent of Amendment 125 is satisfied.
The Bill will require companies to take proactive steps to tackle all forms of online child sexual abuse, including grooming, live streaming, child sexual abuse material and prohibited images of children. If AI-generated content amounts to a child sexual exploitation or abuse offence under the Bill, it will be subject to the illegal content duties. Regulated providers will need to take steps to remove this content. We will shortly bring forward, and have the opportunity to debate in Committee, a government amendment to address concerns relating to the sending of intimate images. This will cover the non-consensual sharing of manufactured images—more commonly known as deepfakes. The possession and distribution of altered images that appear to be indecent photographs of children is already covered by the indecent images of children offences, which are very serious offences with robust punishment in law.
Will the review also cover an understanding of what has been happening in criminal cases where, in some of the examples that have been described, people have tried to take online activity to court? We will at that point understand whether the judges believe that existing offences cover some of these novel forms of activity. I hope the review will extend not just to what Ofcom does as a regulator but to what the courts are doing with the definitions of criminal activity and whether those definitions are proving effective in the new online spaces.
I believe it will. Certainly, both government and Parliament will take into account judgments in the courts on this Bill and in related areas of law and will, I am sure, want to respond.
It is not just the judgments of the courts; it is about how the criminal law itself has been framed at the most basic level. I invite my noble friend the Minister to please meet the Dawes Centre, because its focus is future crime. We could end up with a situation in which more and more violence, particularly against women and girls, is committed in this space, and although the Bill may bring that space into regulation, it may not fall within the province of the criminal law. That would be a very difficult situation for our law to end up in. Can my noble friend the Minister please meet the Dawes Centre to talk about that point?
I am happy to reassure my noble friend that the director of the Dawes Centre for Future Crime sits on the Home Office’s Science Advisory Council, whose work is very usefully fed into the work being done at the Home Office. Colleagues at the Ministry of Justice keep criminal law under constant review, in light of research by such bodies and what we see in the courts and society. I hope that reassures my noble friend that the points she raised, which are covered by organisations such as the Dawes Centre, are very much in the mind of government.
The noble Lord, Lord Allan of Hallam, explained very effectively the nuances of how behaviour translates to the virtual world. He is right that we will need to keep both offences and the framework under review. My noble friend Lady Berridge asked a good and clear question, to which I am afraid I do not have a similarly concise answer. I can reassure her that generated child sexual abuse and exploitation material is certainly illegal, but she asked about sexual harassment via a haptic suit; that would depend on the specific circumstances. I hope she will allow me to respond in writing, at greater length and more helpfully, to the very good question she asked.
Under Clause 56, Ofcom will also be required to undertake periodic reviews into the incidence and severity of content that is harmful to children on the in-scope services, and to recommend to the Secretary of State any appropriate changes to regulations based on its findings. Clause 141 also requires Ofcom to carry out research into users’ experiences of regulated services, which will likely include experiences of services such as the metaverse and other online spaces that allow user interaction. Under Clause 147, Ofcom may also publish reports on other online safety matters.
The questions posed by the noble Lord, Lord Russell of Liverpool, about international engagement are best addressed in a group covering regulatory co-operation, which I hope we will reach later today. I can tell him that we have introduced a new information-sharing gateway for the purpose of sharing information with overseas regulators, to ensure that Ofcom can collaborate effectively with its international counterparts. That builds on existing arrangements for sharing information that underpin Ofcom’s existing regulatory regimes.
The amendments tabled by the noble Lord, Lord Knight of Weymouth, relate to providers’ judgments about when content produced by bots is illegal content, or a fraudulent advertisement, under the Bill. Clause 170 sets out that providers will need to take into account all reasonably available relevant information about content when making a judgment about its illegality. As we discussed in the group about illegal content, providers will need to treat content as illegal when this information gives reasonable grounds for inferring that an offence was committed. Content produced by bots is in scope of providers’ duties under the Bill. This includes the illegal content duties, and the same principles for assessing illegal content will apply to bot-produced content. Rather than drawing inferences about the conduct and intent of the user who generated the content, the Bill specifies that providers should consider the conduct and the intent of the person who can be assumed to have controlled the bot at the point it created the content in question.
The noble Lord’s amendment would set out that providers could make judgments about whether bot-produced content is illegal, either by reference to the conduct or mental state of the person who owns the bot or, alternatively, by reference to the person who controls it. As he set out in his explanatory statement and outlined in his speech, I understand he has brought this forward because he is concerned that providers will sometimes not be able to identify the controller of a bot, and that this will impede providers’ duties to take action against illegal content produced by them. Even when the provider does not know the identity of the person controlling the bot, however, in many cases there will still be evidence from which providers can draw inferences about the conduct and intent of that person, so we are satisfied that the current drafting of the Bill ensures that providers will be able to make a judgment on illegality.
My concern is also whether or not the bot is out of control. Can the Minister clarify that issue?
It depends on what the noble Lord means by “out of control” and what content the bot is producing. If he does not mind, this may be an issue which we should go through in technical detail, with a more free-flowing conversation and examples that we can work through.
This is a very interesting discussion; the noble Lord, Lord Knight, has hit on something really important. When somebody does something that we believe is criminal, we can interrogate them and ask how they came to do it and how they reached the conclusion that they did. The difficulty is that those of us who are not super-techy do not understand how you can interrogate a bot or an AI that appears to be out of control to find out how it reached the conclusion that it did. It may be drawing from lots of different places, and there may be ownership of lots of different sources of information. I wonder whether that is why we find the question of how this will be monitored in future so concerning. I am reassured that the noble Lord, Lord Knight of Weymouth, is nodding; does the Minister concur that this may be a looming problem for us?
I certainly concur that we should discuss the issue in greater detail. I am very happy to do so with the noble Lord, the noble Baroness and others who want to do so, along with officials. If we can bring some worked examples of what “in control” and “out of control” bots may be, that would be helpful.
I hope the points I have set out in relation to the other issues raised in this group and the amendments before us are satisfactory to noble Lords and that they will at this point be content not to press their amendments.
My Lords, I thank all noble Lords who have contributed to a thought-provoking and, I suspect, longer debate than we had anticipated. At Second Reading, I think we were all taken aback when this issue was opened up by my noble friend Lord Sarfraz; once again, we are realising that this requires really careful thought. I thank my noble friend the Minister for his also quite long and thoughtful response to this debate.
I feel that I owe the Committee a small apology. I am very conscious that I talked in quite graphic detail at the beginning when there were still children in the Gallery. I hope that I did not cause any harm, but it shows how serious this is that we have all had to think so carefully about what we have been saying—only in words, without any images. We should not underestimate how much this has demonstrated the importance of our debates.
On the comments of the noble Baroness, Lady Fox, I am a huge enthusiast, like the noble Lord, Lord Knight, for the wonders of the tech world and what it can bring. We are managing the balance in this Bill to make sure that this country can continue to benefit from and lead the opportunities of tech while recognising its real and genuine harms. I suggest that today’s debate has demonstrated the potential harm that the digital world can bring.
I listened carefully—as I am certain the noble Baroness, Lady Kidron, has been doing in the digital world—to my noble friend’s words. I am encouraged by what he has put on the record on Amendment 125, but there are some specific issues that it would be helpful for us to talk about, as he alluded to, after this debate and before Report. Let me highlight a couple of those.
First, I do not really understand the technical difference between a customer service bot and other bots. I am slightly worried that we are specifically defining one type of bot that would not be captured by this Bill. I suspect that there might be others in future. We must think carefully about whether we are getting too much into the specifics of the technology and not staying general enough to make sure we capture where it could go. That is one example.
Secondly, as my noble friend Lady Berridge would say, I am not sure that we have got to the bottom of whether this Bill, coupled with the existing body of criminal law, will really enable law enforcement officers to progress the cases as they see fit and protect vulnerable women—and men—in the digital world. I very much hope we can extend the conversation there. We perhaps risk getting too close to the technical specifics if we are thinking about whether a haptic suit is in or out of scope of the Bill; I am certain that there will be other technologies that we have not even thought about yet that we will want to make sure that the Bill can capture.
I very much welcome the spirit in which this debate has been held. When I said that I would do this for the noble Baroness, Lady Kidron, I did not realise quite what a huge debate we were opening up, but I thank everyone who has contributed and beg leave to withdraw the amendment.
(1 year, 6 months ago)
Lords Chamber
My Lords, I strongly support the amendment in the names of the noble Lords, Lord Knight and Lord Stevenson, as well as my noble friend Lady Featherstone. The essence of the message from the noble Lord, Lord Knight, about the need for trust and the fact that you can gain trust through greater transparency is fundamental to this group.
The Joint Committee’s report is now a historical document. That is partly the passage of time, but it was an extraordinary way to work through some of the issues, as we did. We were greatly affected by the evidence given by Frances Haugen, and by the fact that certain things came to light only as a result of her sharing information with the Securities and Exchange Commission. We said at the time that:
“Lack of transparency of service providers also means that people do not have insight into the prevalence and nature of activity that creates a risk of harm on the services that they use”.
That is very much the sense that the noble Lord, Lord Stevenson, is trying to get to by adding scope as well.
We were very clear about our intentions at the time. The Government accepted the recommendation that we made and said that they agreed with the committee that
“services with transparency reporting requirements should be required to publish their transparency reports in full, and in an accessible and public place”.
So what we are really trying to do is to get the Government to agree to what they have already agreed to, which we would have thought would be a relatively straightforward process.
There are some other useful aspects, such as the review of effectiveness of the transparency requirements. I very much appreciate what my noble friend just said about not reading transparency reports. I read the oversight reports but not necessarily the transparency reports. I am not sure that Frances Haugen was a great advert for transparency reports at the time, but that is a mere aside in the circumstances.
I commend my noble friend Lady Featherstone’s Amendment 171, which is very consistent with what we were trying to achieve with the code of practice about violence against women and girls. That would fit very easily within that. One of the key points that my noble friend Lord Allan made is that this is for the benefit of the platforms as well. It is not purely for the users. Of course it is useful for the users, but not exclusively, and this could be a way of platforms engaging with the users more clearly, inserting more fresh air into this. In these circumstances it is pretty conclusive that the Government should adhere to what they agreed to in their response to the Joint Committee’s report.
As ever, I thank all noble Lords who have spoken. I absolutely take, accept and embrace the point that transparency is wholly critical to what we are trying to achieve with the Bill. Indeed, the chandelier of transparency reports should be our shared aim—a greenhouse maybe. I am grateful for everyone’s contributions to the debate. I agree entirely with the views expressed. Transparency is vital in holding companies to account for keeping their users safe online. As has been pointed out, it is also to the benefit of the platforms themselves. Confident as I am that we share the same objectives, I would like to try to reassure noble Lords on a number of issues that have been raised.
Amendments 160A, 160B and 181A in the name of the noble Lord, Lord Knight of Weymouth, seek to require providers to make their transparency reports publicly available, subject to appropriate redactions, and to allow Ofcom to prevent their publication where it deems that the risks posed by drawing attention to illegal content outweigh the benefit to the public of the transparency report. Let me reassure the noble Lord that the framework, we strongly believe, already achieves the aim of those amendments. As set out in Clause 68, Ofcom will specify a range of requirements in relation to transparency reporting in a notice to category 1, 2A and 2B services. This will include the kind of information that is required in the transparency report and the manner in which it should be published. Given the requirement to publish the information, this already achieves the intention of Amendment 160A.
The specific information requested for inclusion within the transparency report will be determined by Ofcom. Therefore, the regulator will be able to ensure that the information requested is appropriate for publication. Ofcom will take into account any risks arising from making the information public before issuing the transparency notice. Ofcom will have separate information-gathering powers, which will enable the regulator to access information that is not suitable to be published in the public domain. This achieves the intention of Amendment 160B. There is also a risk of reducing trust in transparency reporting if there is a mechanism for Ofcom to prevent providers publishing their transparency reports.
Amendment 181A would require Ofcom to issue guidance on what information should be redacted and how this should be done. However, Ofcom is already required to produce guidance about transparency reports, which may include guidance about what information should be redacted and how to do this. It is important to provide the regulator with the flexibility to develop appropriate guidance.
Amendment 165 seeks to expand the information within the transparency reporting requirements to cover the scope of the terms of service set out by user-to-user providers. I very much agree with the noble Lord that it is important that Ofcom can request information about the scope of terms of service, as well as about their application. Our view is that the Bill already achieves this. Schedule 8 sets out the high-level matters about which information may be required. This includes information about how platforms are complying with their duties. The Bill will place duties on user-to-user providers to ensure that any required terms of service are clear and accessible. This will require platforms to set out what the terms of service cover—or, in other words, the scope. While I hope that this provides reassurance on the matter, if there are still concerns in spite of what I have said, I am very happy to look at this. Any opportunity to strengthen the Bill through that kind of clarity is worth looking at.
I welcome the Minister’s comments. I am interrupting just because this is my amendment rather than my noble friend Lord Knight’s. The word “scope” caused us some disquiet on this Bench when we were trying to work out what we meant by it. It has been fleshed out in slightly different ways around the Chamber, to advantage.
I go back to the original intention—I am sorry for the extensive introduction, but it is to make sure that I focus the question correctly—which was to make sure that we are not merely looking historically at the terms of service that have been issued, and whether they are working in a transparent way, but addressing the question of what is missing or is perhaps not addressed properly. Does the Minister agree that that would be taken in by the word “scope”?
I think I probably would agree, but I would welcome a chance to discuss it further.
Finally, Amendment 229 intends to probe how Ofcom will review the effectiveness of transparency requirements in the Bill. It would require Ofcom to produce reports reviewing the effectiveness of transparency reports and would give the Secretary of State powers to implement any recommendations made by the regulator. While I of course agree with the sentiment of this amendment, as I have outlined, the transparency reporting power is designed to ensure that Ofcom can continuously review the effectiveness of transparency reports and make adjustments as necessary. This is why the Bill requires Ofcom to set out in annual transparency notices what each provider should include in its reports and the format and manner in which it should be presented, rather than putting prescriptive or static requirements in the Bill. That means that Ofcom will be able to learn, year on year, what will be most effective.
Under Clause 145, Ofcom is required to produce its own annual transparency report, which must include a summary of conclusions drawn from providers’ transparency reports, along with the regulator’s view on industry best practice and other appropriate information—I hope and think that goes to some of the points raised by the noble Lord, Lord Allan of Hallam.
My Lords, just before the Minister moves on—and possibly to save me finding and reading it—can he let us know whether those annual reports by Ofcom will be laid before Parliament and whether Parliament will have a chance to debate them?
I believe so, but I will have to confirm that in writing. I am sorry not to be able to give a rapid answer.
Clause 159 requires the Secretary of State to review the operation of the regulatory framework as a whole to ensure that it is effective. In that review, Ofcom will be a statutory consultee. The review will specifically require an assessment of the effectiveness of the regulatory framework in ensuring that the systems and processes used by services provide transparency and accountability to users.
The Bill will create what we are all after, which is a new culture of transparency and accountability in the tech sector. For the reasons I have laid out, we are confident that the existing provisions are sufficiently broad and robust to provide that. As such, I hope the noble Lord feels sufficiently reassured to withdraw the amendment.
My Lords, that was a good, quick debate and an opportunity for the noble Viscount to put some things on the record, and explain some others, which is helpful. It is always good to get endorsement around what we are doing from both the noble Lord, Lord Allan, and the noble Baroness, Lady Fox. That is a great spread of opinion. I loved the sense of the challenge as to whether anyone ever reads the transparency reports whenever they are published; I imagine AI will be reading and summarising them, and making sure they are not written as gobbledygook.
On the basis of what we have heard and if we can get some reassurance that strong transparency is accompanied by strong parliamentary scrutiny, then I am happy to withdraw the amendment.
My Lords, this has been a miscellany indeed. We must be making progress if we are picking up amendments such as these. I thank noble Lords who have spoken to the amendments and the issues covered in them.
I turn first to Amendment 185A brought to us by the noble Lord, Lord Bassam of Brighton, which seeks to add duties on online marketplaces to limit children’s access to the sale of knives, and proactively to identify and remove listings which appear to encourage the sale of knives for the purposes of violence or self-harm. Tackling knife crime is a priority for His Majesty’s Government; we are determined to crack down on this violent scourge, which is devastating our communities. I hope that he will forgive me for not drawing on the case he mentioned, as it is still sub judice. However, I certainly take the point he makes; we are all too aware of cases like it up and down the country. I received an email recently from Amanda and Stuart Stephens, whose son, Olly, was murdered by two boys, one of whom was armed with a knife. All these cases are very much in our minds as we debate the Bill.
Let me try to reassure them and the noble Lord as well as other Members of the Committee that the Bill, through its existing duties and other laws on the statute book, already achieves what the noble Lord seeks with his amendment. The sale of offensive weapons and of knives to people under the age of 18 are criminal offences. Any online retailer which directly sells these prohibited items can already be held criminally liable. Once in force, the Bill will ensure that technology platforms, including online marketplaces, prevent third parties from using their platform to sell offensive weapons or knives to people under the age of 18. The Bill lists both these offences as priority offences, meaning that user-to-user services, including online marketplaces, will have a statutory obligation proactively to prevent these offences taking place on their services.
I am sorry to interrupt. The Minister has twice given a positive response, but he limited it to child sexual exploitation; he did not mention terrorism, which is in fact the bigger issue. Could he confirm that it is both?
Yes, and as I say, I am happy to talk with the noble Lord about this in greater detail. Under the Bill, category 1 companies will have a new duty to safeguard all journalistic content on their platform, which includes citizen journalism. But I will have to take all these points forward with him in our further discussions.
My noble friend Lord Bethell is not here to move his Amendment 220D, which would allow Ofcom to delegate online safety regulatory duties under this legislation to other bodies. We have previously discussed a similar issue relating to the Internet Watch Foundation, so I shall not repeat the points that we have already made.
On the amendments on supposedly gendered language in relation to Ofcom advisory committees in Clauses 139 and 155, I appreciate the intention to make it clear that a person of either sex should be able to perform the role of chairman. The Bill uses the term “chairman” to be consistent with the terminology in the Office of Communications Act 2002, and we are confident that this will have no bearing on Ofcom’s decision-making on who will chair the advisory committees that it must establish, just as, I am sure, the noble Lord’s Amendment 56 does not seek to be restrictive about who might be an “ombudsman”.
I appreciate the intention of Amendment 262 from the noble Baroness, Lady Bennett of Manor Castle. It is indeed vital that the review reflects the experience of young people. Clause 159 provides for a review to be undertaken by the Secretary of State, and published and laid before Parliament, to assess the effectiveness of the regulatory framework. There is nothing in the existing legislation that would preclude seeking the views of young people either as part of an advisory group or in other ways. Moreover, the Secretary of State is required to consult Ofcom and other persons she considers appropriate. In relation to young people specifically, it may be that a number of different approaches will be effective—for example, consulting experts or representative groups on children’s experiences online. That could include people of all ages. The regulatory framework is designed to protect all users online, and it is right that we take into account the full spectrum of views from people who experience harms, whatever their age and background, through a consultation process that balances all their interests.
Amendment 268AA from the noble Lord, Lord Bassam, relates to reporting requirements for online abuse and harassment, including where this is racially motivated—an issue we have discussed in Questions and particularly in relation to sport. His amendment would place an additional requirement on all service providers, even those not in scope of the Bill. The Bill’s scope extends only to user-to-user and search services. It has been designed in this way to tackle the risk of harm to users where it is highest. Bringing additional companies in scope would dilute the efforts of the legislation in this important regard.
Clauses 16 and 26 already require companies to set up systems and processes that allow users easily to report illegal content, including illegal online abuse and harassment. This amendment would therefore duplicate this existing requirement. It also seeks to create an additional requirement for companies to report illegal online abuse and harassment to the Crown Prosecution Service. The Bill does not place requirements on in-scope companies to report their investigations into crimes that occur online, other than child exploitation and abuse. This is because the Bill aims to prevent and reduce the proliferation of illegal material and the resulting harm it causes to so many. Additionally, Ofcom will be able to require companies to report on the incidence of illegal content on their platforms in its transparency reports, as well as the steps they are taking to tackle that content.
I hope that reassures the noble Lord that the Bill intends to address the problems he has outlined and those explored in the exchange with the noble Lord, Lord Clement-Jones. With that, I hope that noble Lords will support the government amendments in this group and be satisfied not to press theirs at this point.
My Lords, I listened very carefully to the Minister’s response to both my amendments. He has gone some way to satisfying my concerns. I listened carefully to the concerns of the noble Baroness, Lady Fox, and noble Lords on the Lib Dem Benches. I am obviously content to withdraw my amendment.
I do not quite agree with the Minister’s point about dilution on the last amendment—I see it as strengthening—but I accept that the amendments themselves slightly stretch the purport of this element of the legislation. I shall review the Minister’s comments and I suspect that I shall be satisfied with what he said.
I am grateful to noble Lords for helping us to reach our target for the first time in this Committee, especially to do so in a way which has given us a good debate to send us off into the Whitsun Recess. I am off to the Isle of Skye, so I will make a special detour to Balmacara in honour of the noble Lord.
The noble Lord does not believe anything that I say at this Dispatch Box, but I will send a postcard.
As noble Lords are by now well aware, all services in scope of the Bill, regardless of their size, will be required to take action against illegal content and all services likely to be accessed by children must put in place protections for children. Companies designated as category 1 providers have significant additional duties. These include the overarching transparency, accountability and freedom of expression duties, as well as duties on content of democratic importance, news publishers’ content, journalistic content and fraudulent advertising. It is right to put such duties only on the largest platforms with features enabling the greatest reach, as they have the most significant influence over public discourse online.
I turn first to Amendment 192 in the name of my noble friend Lady Morgan of Cotes and Amendment 192A from the noble Lord, Lord Griffiths of Burry Port, which are designed to widen category 1 definitions to include services that pose a risk of harm, regardless of their number of users. Following the removal of the legal but harmful provisions in another place, the Bill no longer includes the concept of risk of harm in category 1 designation. As we set out, it would not be right for the Government to define what legal content they consider harmful to adults, and it follows that it would not be appropriate for the Government to categorise providers and to require them to carry out duties based on this definition.
In addition, requiring all companies to comply with the full range of Category 1 duties would pose a disproportionate burden on services which do not exert the same influence over public discourse online. I appreciate the point made by the noble Baroness, Lady Bull, with regard to regulatory burden. There is a practical element to this as well. Services, particularly smaller ones, have finite resources. Imposing additional duties on them would divert them from complying with their illegal and child safety duties, which address the most serious online harms. We do not want to weaken their ability to tackle criminal activity or to protect children.
As we discussed in detail in a previous debate, the Bill tackles suicide and self-harm content in a number of ways. The most robust protections in the Bill are for children, while those for adults strike a balance between adults being protected from illegal content and given more choice over what legal content they see. The noble Lord, Lord Stevenson, asked why we do not start with the highest risk rather than thinking about the largest services, but we do. We start with the most severe harms—illegal activity and harm to children. We are focusing on the topics of greatest risk and then, for other categories, allowing adults to make decisions about the content with which they interact online.
A number of noble Lords referred to suicide websites and fora. We are concerned about the widespread availability of content online which promotes and advertises methods of suicide and self-harm, which can be easily accessed by young or vulnerable people. Under the Bill, where suicide and self-harm websites host user-generated content, they will be in scope of the legislation. These sites will need proactively to prevent users from being exposed to priority illegal content, including content which encourages or assists suicide under the terms of the Suicide Act 1961. Additionally, it is an offence under Section 4(3) of the Misuse of Drugs Act 1971 for a website to offer to sell controlled drugs to consumers in England and Wales. Posting advice on how to obtain such drugs in England and Wales is also likely to be an offence, regardless of where the person providing the advice is located.
The Bill also limits the availability of such content by placing illegal content duties on search services, including harmful content which affects children or where this content is shared on user-to-user services. This will play a key role in reducing traffic that directs people to websites which encourage or assist suicide, and reduce the likelihood of users encountering such content. The noble Baroness, Lady Bull, asked about starvation. Encouraging people to starve themselves or not to take prescribed medication will be covered.
Amendment 194 tabled by the noble Lord, Lord Stevenson of Balmacara, seeks to ensure that Ofcom can designate companies as category 1, 2A or 2B on a provisional basis, when it considers that they are likely to meet the relevant thresholds. This would mean that the relevant duties can be applied to them, pending a full assessment by Ofcom. The Government recognise the concern highlighted by the noble Lord, Lord Allan, about the rapid pace of change in the technology sector and how that can make it challenging to keep the register of the largest and most influential services up to date. I assure noble Lords that the Bill addresses this with a duty which the Government introduced during the Bill’s recommittal in another place. This duty, at Clause 88, requires Ofcom proactively to identify and publish a list of companies which are close to category 1 thresholds. This will reduce any delays in Ofcom adding additional obligations on companies which grow rapidly, or which introduce new high-risk features. It will also ensure that the regime remains agile and adaptable to emerging threats.
Platforms with the largest reach and greatest influence over public discourse will be designated as category 1. The Bill sets out a clear process for determining category 1 providers, based on thresholds relating to these criteria, which will be set by the Secretary of State in secondary legislation. The process has been designed to ensure that it is transparent and evidence-based. We expect the main social media platforms and possibly some others to be designated as category 1 services, but we do not wish to prejudge the process set out above by indicating which specific services are likely to be designated, as I have set out on previous groups.
The amendment would enable Ofcom to place new duties on companies without due process. Under the approach that we take in the Bill, Ofcom can designate companies as belonging to each category based only on an objective assessment of evidence against thresholds approved by Parliament. The Government’s approach also provides greater certainty for companies than is proposed in this amendment. We have heard concerns in previous debates about when companies will have the certainty of knowing their category designation. These amendments would introduce continuous uncertainty and subjectivity into the designation process and would give Ofcom significant discretion over which companies should be subject to which duties. That would create a very uncertain operating environment for businesses and could reduce the attractiveness of the UK as a place to do business.
I hope that explains why we are not taken by these amendments but, in the spirit of the Whitsun Recess, I will certainly think about them on the train as I head north. I am very happy to discuss them with noble Lords and others between now and our return.
Before the Minister sits down, he did let slip that he was going on the sleeper, so I do not think that there will be much thinking going on—although I did not sleep a wink the last time I went, so I am sure that he will have plenty of time.
I am sure that the noble Baroness, Lady Morgan, will want to come in—but could he repeat that again? Risk assessment drives us, but a company that does not meet the categorisation thresholds will not be regarded as a category 1 provider even though it may be higher risk than perhaps even some of the category 1 companies. It will therefore not be subject to the requirements to pick up the particular issues raised by the noble Baroness and the noble Lord, and their concerns about those issues, which are clearly social harms, will not really be considered on a par.
In the response I gave, I said that we are making the risk assessment that the riskiest behaviour is illegal content and content which presents a harm to children. That is the assessment and the approach taken in the Bill. In relation to other content which is legal and for adults to choose how they encounter it, there are protections in the Bill to enforce terms of service and empower users to curate their own experience online, but that assessment is made by adult users within the law.
I thank all noble Lords who spoke in this short but important debate. As we heard, some issues relating to risk and harm have been returned to and will no doubt be again, and we note the impact of the absence of “legal but harmful” as a concept. As the noble Baroness, Lady Bull, said, I know that the noble Baroness, Lady Parminter, was very sad that she could not be here this afternoon due to another engagement.
I will not keep the House much longer. I particularly noted the noble Baroness’s point that there should not be, and is not, a direct relationship between the size of the platform and its ability to cause harm. There is a balance to be struck between the regulatory burden placed on platforms and the health and well-being of those who are using them. As I have said before, I am not sure that we have always got that particular balance right in the Bill.
The noble Lord, Lord Allan, was very constructive: it has to be a good thing if we are now beginning to think about the Bill’s implementation, although we have not quite reached the end and I do not want to prejudge any further stages, in the sense that we are now thinking about how this would work. Of course, he is right to say that some of these platforms have no intention of complying with these rules at all. Ofcom and the Government will have to work out what to do about that.
Ultimately, the Government of the day—whoever it might be—will want the powers to be able to say that a small platform is deeply harmful in terms of its content and reach. When the Bill has been passed, there will be pressure at some point in the future on a platform that is broadcasting, distributing or amplifying content that is deeply harmful. Although I will withdraw the amendment today, my noble friend’s offer of further conversations, and of more detail on categorisation and on any review of the platforms categorised as category 1, 2 and beyond, would be very helpful in due course. I beg leave to withdraw.
(1 year, 5 months ago)
Lords Chamber
My Lords, I am all that is left between us and hearing from the Minister with his good news, so I will constrain my comments accordingly.
The noble Baroness, Lady Kidron, began by paying tribute to the parents of Olly, Breck, Molly, Frankie and Sophie. I very much join her in doing that; to continually have to come to this place and share their trauma and experience comes at a great emotional cost. We are all very grateful to them for doing it and for continuing to inform and motivate us in trying to do the right thing. I am grateful to my noble friend Lady Healy and in particular to the noble Baroness, Lady Newlove, for amplifying that voice and talking about the lost opportunity, to an extent, of our failure to find a way of imposing a general duty of care on the platforms, as was the original intention when the noble Baroness, Lady Morgan, was the Secretary of State.
I also pay a big tribute to the noble Baroness, Lady Kidron. She has done the whole House, the country and the world a huge service in her campaigning around this and in her influence on Governments—not just this one—on these issues. We would not be here without her tireless efforts, and it is important that we acknowledge that.
We need to ensure that coroners can access the information they need to do their job, and to have proper sanctions available to them when they are frustrated in being able to do it. This issue is not without complication, and I very much welcome the Government’s engagement in trying to find a way through it. I too look forward to the good news that has been trailed; I hope that the Minister will be able to live up to his billing. Like the noble Baroness, Lady Harding, I would love to see him embrace, at the appropriate time, the “safety by design” amendments and some others that could complete this picture. I also look forward to his answers on issues such as data preservation, which the noble Lord, Lord Allan, covered among the many other things in his typically fine speech.
I very much agree that we should have a helpline and do more about that. Some years ago, when my brother-in-law sadly died in his 30s, it fell to me to try to sort out his social media accounts. I was perplexed that the only way I could do it was by fax to these technology companies in California. That was very odd, so to have proper support for bereaved families going through their own grief at that moment seems highly appropriate.
As we have discussed in the debates on the Bill, a digital footprint is an asset that is exploited by these companies. But it is an asset that should be regarded as part of one’s estate that can be bequeathed to one’s family; then some of these issues would perhaps be lessened. On that basis, and in welcoming a really strong and moving debate, I look forward to the Minister’s comments.
My Lords, this has been a strong and moving debate, and I am grateful to the noble Baroness, Lady Kidron, for bringing forward these amendments and for the way she began it. I also echo the thanks that the noble Baroness and others have given to the families of Breck Bednar, Sophie Parkinson, Molly Russell, Olly Stephens, Frankie Thomas and all the young people whose names she rightly held in remembrance at the beginning of this debate. There are too many others who find themselves in the same position. The noble Lord, Lord Knight, is right to pay tribute to their tirelessness in campaigning, given the emotional toll that we know it has on them. I know that they have followed the sometimes arcane processes of legislation and, as my noble friend Lady Morgan said, we all look forward to the Bill becoming an Act of Parliament so that it can make a difference to families who we wish to spare from the heartache they have had.
Every death is sorrowful, but the death of a child is especially heartbreaking. The Government take the issues of access to information relating to a deceased child very seriously. We have undertaken extensive work across government and beyond to understand the problems that parents, and coroners who are required to investigate such deaths, have faced in the past in order to bring forward appropriate solutions. I am pleased to say that, as a result of that work, and thanks to the tireless campaigning of the noble Baroness, Lady Kidron, and our discussions with those who, very sadly, have first-hand experience of these problems, we will bring forward a package of measures on Report to address the issues that parents and coroners have faced. Our amendments have been devised in close consultation with the noble Baroness and bereaved families. I hope the measures will rise to the expectations they rightly have and that they will receive their support.
The package of amendments will ensure that coroners have access to the expertise and information they need to conduct their investigations, including information held by technology companies, regardless of size, and overseas services such as Wattpad, mentioned by the noble Baroness, Lady Healy of Primrose Hill, in her contribution. This includes information about how a child interacted with specific content online as well as the role of wider systems and processes, such as algorithms, in promoting it. The amendments we bring forward will also help to ensure that the process for accessing data is more straightforward and humane. The largest companies must ensure that they are transparent with parents about their options for accessing data and respond swiftly to their requests. We must ensure that companies cannot stonewall parents who have lost a child and that those parents are treated with the humanity and compassion they deserve.
I take the point that the noble Baroness, Lady Kidron, rightly makes: small does not mean safe. All platforms will be required to comply with Ofcom’s requests for information about a deceased child’s online activity. That will be backed by Ofcom’s existing enforcement powers, so that where a company refuses to provide information without a valid excuse it may be subject to enforcement action, including sanctions on senior managers. Ofcom will also be able to produce reports for coroners following a Schedule 5 request on matters relevant to an investigation or inquest. This could include information about a company’s systems and processes, including how algorithms have promoted specific content to a child. This too applies to platforms of any size and will ensure that coroners are provided with information and expertise to assist them in understanding social media.
Where this Bill cannot solve an issue, we are exploring alternative avenues for improving outcomes as well. For example, the Chief Coroner has committed to consider issuing non-legislative guidance and training for coroners about social media, with the offer of consultation with experts.
I am sorry to interrupt my noble friend. On coroners’ training and national guidelines, the Chief Coroner has no powers over all the coroners across the nation. How is he or she going to check that coroners are keeping up with their training and are absolutely on the ball? Everything happens in London, and we are talking about outside London. How can we know that no other family will have to suffer, given that we have this legislation?
My noble friend rightly pulled me up for not responding to her letter as speedily as we have been dealing with the questions raised by the noble Baroness, Lady Kidron. We have had some useful meetings with Ministers at the Ministry of Justice, which the noble Baroness has attended. I would be very happy to provide some detail on this to my noble friend—I am conscious of her experience as Victims’ Commissioner—either in writing or by organising a briefing, if she would welcome that.
The noble Lord, Lord Allan of Hallam, rightly raised data protection. Where Ofcom and companies are required to respond to coroners’ requests for information, they are already required to comply with personal data protection legislation, which protects the privacy of other users. This may include the redaction of information that would identify other users. We are also exploring whether guidance from the Information Commissioner’s Office could support technology companies to understand how data protection law applies in such cases.
The noble Lord mentioned the challenges of potential conflicts of law around the world. Where there is a conflict of laws—for example, due to data protection laws in other jurisdictions—Ofcom will need to consider the best way forward on a case-by-case basis. For example, it may request alternative information which could be disclosed, and which would provide insight into a particular issue. We will seek to engage our American counterparts to understand any potential and unintended barriers created by the US Stored Communications Act. I can reassure the noble Lord that these matters are in our mind.
We are also aware of the importance of data preservation to both coroners and bereaved parents. The Government agree with the principle of ensuring that such data are preserved. We will be working towards solving this in the Data Protection and Digital Information Bill. In addition, we will explore whether there are further options to improve outcomes for parents in that Bill as well. I want to assure noble Lords and the families watching this debate closely that we will do all we can to deliver the necessary changes to give coroners and parents the information that they seek and to ensure a more straightforward and humane process in the future.
I turn in detail to the amendments the noble Baroness, Lady Kidron, brought forward. First, Amendments 215 and 216 include new requirements on Ofcom, seeking to ensure that coroners and parents can obtain data from social media companies after the death of a child. Amendment 215 would give Ofcom the ability to impose senior management liability on an individual in cases where a coroner has issued a notice requiring evidence to be provided in an inquest into the death of a child. Amendment 216 would put Ofcom’s powers at the disposal of a coroner or close relatives of a deceased child so that Ofcom would be obliged to require information from platforms or other persons about the social media activity of a deceased child. It also requires service providers to provide a point of contact. Amendments 198 and 199 are consequential to this.
As I said, we agree with the intent of the noble Baroness’s amendments and we will deal with it in the package that we will bring forward before Report. Our changes to the Bill will seek to ensure that Ofcom has the powers it needs to support coroners and their equivalents in Scotland, so that they have access to the information they need to conduct investigations into a child’s death where social media may have played a part.
My Lords, I am grateful to the noble Lords, Lord Bethell, Lord Curry and Lord Allan for introducing their amendments, to the noble Baroness, Lady Morgan, for her direct question, and to the noble Baroness, Lady Kidron, for her equally direct question. I am sure they will be of great assistance to the Minister when he replies. I will highlight the words of the noble Lord, Lord Allan, who said “We are looking for services to succeed”. I think that is right, but what is success? It includes compliance and enforcement, and that is what this group refers to.
The amendments introduced by the noble Lord, Lord Bethell, seek to strengthen what is already in the Bill about Ofcom’s Chapter 6 powers of enforcement, otherwise known as business disruption powers, and they focus on what happens in the event of a breach; they seek to be more prescriptive than what we already have. I am sure the Minister will remember that the same issue came up in the Digital Economy Bill, around the suggestion that the Government should take specific powers. There, the Government argued they had assurances from credit card companies that, if and when action was required, they would co-operate. In light of that previous discussion, it will be interesting to hear what the Minister has to say.
In respect of the amendments introduced by the noble Lord, Lord Curry, on the need to toughen up requirements on Ofcom to act, I am sure the Minister will say that these powers are not required and that the Bill already makes provision for Ofcom blocking services which are failing in their duties. I echo the concern of the noble Lord, Lord Clement-Jones, about being overly prescriptive and not allowing Ofcom to do its job. The truth is that Ofcom may need discretion but it also needs teeth, and I will be interested to hear what the Minister has to say about whether he feels, in the light of the debate today and other conversations, that there is sufficient toughness in the Bill and that Ofcom will be able to do the job it is required to do. There is an issue of the balance of discretion versus requirement, and I know he will refer to this. I will also be interested to hear from the Minister about the view of Ofcom with respect to what is in the Bill, and whether it feels that it has sufficient powers.
I will raise a final point about the amendments in the name of the noble Lord, Lord Curry. I think they ask a valid question about the level of discretion that Ofcom will have. I ask the Minister this: if, a few years down the line, we find that Ofcom has not used the powers suitably, despite clear failures, what would the Government seek to do? With that, I look forward to hearing from the Minister.
My Lords, where necessary, the regulator will be able to apply to the courts for business disruption measures. These are court orders which will require third-party ancillary services and access facilities to withdraw their services from, or impede users’ access to, non-compliant regulated services. These are strong, flexible powers which will ensure that Ofcom can take robust action to protect users. At the same time, we have ensured that due process is followed. An application for a court order will have to specify the non-compliant provider, the grounds and evidence on which the application is based and the steps that third parties must take to withdraw services or block users’ access. Courts will consider whether business disruption measures are an appropriate way of preventing harm to users and, if an order is granted, ensure it is proportionate to the risk of harm. The court will also consider the interests of all relevant parties, which may include factors such as contractual terms, technical feasibility and the costs of the measures. These powers will ensure that services can be held to account for failure to comply with their duties under the Bill, while ensuring that Ofcom’s approach to enforcement is proportionate and upholds due process.
I am reminded by my noble friend Lord Foster of Bath, particularly relating to the gambling sector, that some of these issues may run across various regulators that are all seeking business disruption measures. He reminded me that if you type into a search engine, which would be regulated and subject to business disruption measures here, “Casinos not regulated by GAMSTOP”, you will get a bunch of people who are evading GAMSTOP’s regulation. Noble Lords can imagine similar for financial services—something that I know the noble Baroness, Lady Morgan of Cotes, is also very interested in. It may not be for answer now, but I would be interested to understand what thinking the Government have on how all the different business disruption regimes—financial, gambling, Ofcom-regulated search services, et cetera—will mesh together. They could all come before the courts under slightly different legal regimes.
When I saw the noble Lord, Lord Foster of Bath, and the noble Baroness, Lady Armstrong of Hill Top, in their places, I wondered whether they were intending to raise these points. I will certainly take on board what the noble Lord says and, if there is further information I can furnish your Lordships with, I certainly will.
The noble Baroness, Lady Kidron, asked whether the powers can be used on out-of-scope services. “No” is the direct answer to her direct question. The powers can be used only in relation to regulated services, but if sites not regulated by the Bill are publishing illegal content, existing law enforcement powers—such as those frequently deployed in cases of copyright infringement—can be used. I could set out a bit more in writing if that would be helpful.
My noble friend Lord Bethell’s amendments seek to set out in the Bill that Ofcom will be able to make a single application to the courts for an order enabling business disruption measures that apply against multiple platforms and operators. I must repeat, as he anticipated, the point made by my right honourable friend Chris Philp that the civil procedure rules allow for a multi-party claim to be made. These rules permit any number of claimants or defendants and any number of claims to be covered by one claim form. The overriding objective of the civil procedure rules is that cases are dealt with justly and proportionately. I want to reassure my noble friend that the Government are confident that the civil procedure rules will provide the necessary flexibility to ensure that services can be blocked or restricted.
The amendment in the name of the noble Lord, Lord Allan of Hallam, seeks to clarify what services might be subject to access restriction orders by removing the two examples provided in the Bill: internet access services and application stores. I would like to reassure him that these are simply indicative examples, highlighting two kinds of service on which access restriction requirements may be imposed. It is not an exhaustive list. Orders could be imposed on any services that meet the definition—that is, a person who provides a facility which they are able to withdraw, adapt or manipulate in such a way as to impede access to the regulated service in question. This provides Ofcom with the flexibility to identify where business disruption measures should be targeted, and it future-proofs the Bill by ensuring that the power remains functional and effective as technologies develop.
As the noble Lord highlighted, these are significant powers that can require that services be blocked in the UK. Clearly, limiting access to services in this way substantially affects the business interests of the service in question and the interests of the relevant third-party service, and it could affect users’ freedom of expression. It is therefore essential that appropriate safeguards are included and that due process is followed. That is why Ofcom will be required to seek a court order to be able to use these powers, ensuring that the courts have proper oversight.
To ensure that due process is upheld, an application by the regulator for a court order will have to specify the non-compliant provider, the grounds of the order and the steps that Ofcom considers should be imposed on the third parties in order to withdraw services and block users’ access. These requirements will ensure that the need to act quickly to tackle harm is appropriately balanced against upholding fundamental rights.
It might be useful to say a little about how blocking works—
Before the Minister does that, can he say whether he envisages that operating against VPNs as well?
If I may, I will take advice on that and write to the noble Lord.
Yes; he made a helpful point, and I will come back on it.
We share a common interest in understanding whether it would be used against VPNs, but we may not necessarily have the same view about whether it should be. Do not take that as an encouragement—take it as a request for information.
I thank the noble Lord.
The term “blocking” is used to describe measures that will significantly impede or restrict access to non-compliant services—for example, internet service providers blocking websites or app stores blocking certain applications. These measures will be used only in exceptional circumstances, where the service has committed serious failures in meeting its duties and where no other action would reasonably prevent online harm to users in the UK.
My noble friend Lord Bethell’s Amendments 218F and 218L seek to ensure that Ofcom can request that an interim service or access restriction order endures for a period of six months in cases where a service hosts pornographic content. I reassure him that the court will already be able to make an order which can last up to six months. Indeed, the court’s interim order can have effect until either the date on which the court makes a service or access restriction order, or an expiry date specified by the court in the order. It is important that sanctions be determined on a case-by-case basis, which is why no limitations are set for these measures in the Bill.
As my noble friend knows, in the Bill there are clear duties on providers to ensure that children are not able to access pornography, which Ofcom will have a robust set of powers to enforce. It is important, however, that Ofcom’s powers and its approach to enforcement apply equally and consistently across the range of harms in scope of the Bill, rather than singling out one form of content in particular.
I hope that that is useful to noble Lords, along with the commitment to write on the further points which were raised. With that, I urge my noble friend to withdraw his amendment.
My Lords, to be honest, this debate has been an incredible relief to me. Here we have been taking a step away from some of the high-level conversations we had about what we mean by the internet and safety, looking at the far horizon, and instead looking at the moment when the Bill has real traction to try to change behaviours and improve the environment of the internet. I am extremely grateful to the Minister for his fulsome reply on a number of the issues.
The reason why it is so important is the two big areas where enforcement and compliance are going to be really tricky. First, there is Ofcom’s new relationship with the really big behemoths of the internet. It has a long tradition of partnership with big companies such as ITV and the radio sector—with the licensed authorities. However, of course, it holds their licences, and it can pull them. I have worked for some of those companies, and it is quite a thing to go to see your regulator when you know that it can pull your licence. Obviously, that is within legal reason, but at the end of the day it owns your licence, and that is different to having a conversation where it does not.
The second class is the Wild West: the people living in open breach of regular societal norms who care not for the intentions of the regulator, the Government or even mainstream society. Bringing those people back into reasonable behaviour will be a hell of a thing. My noble friend Lord Grade spoke, reasonably but with a degree of trepidation, about the challenge faced by Ofcom there. I am extremely grateful to the Minister for addressing those points.
Ofcom will step up to having a place next to the FCA and the MHRA. The noble Lord, Lord Curry, spoke about some of the qualities needed of one of the big three regulators. Having had some ministerial oversight of the MHRA, I can tell your Lordships that it has absolutely no hesitation about tackling big pharmaceutical companies and is very quick, decisive and clear. It wields a big stick—or, to use the phrase of the noble Baroness, Lady Merron, big teeth—in order to conduct that. That is why I ask the Minister just to keep in mind some of the recommendations embedded in these amendments.
The noble Baroness, Lady Kidron, mentioned illegal content, and I appreciate the candour of the Minister’s reply. However, business disruption measures offer an opportunity to address the challenge of illegal content, which is something that I know the Secretary of State has spoken about very interestingly, in terms of perhaps commissioning some kind of review. If such a thing were to happen, I ask that business disruption measures and some way of employing them might be brought into that.
We should look again at enforcement and compliance. I appreciate the Minister saying that it is important to let the regulator make some of these decisions, but the noble Lord, Lord Allan, was right: the regulator needs to know what the Government’s intentions are. I feel that we have opened the book on this, but there is still a lot more to be said about where the Government see the impact of regulation and compliance ending up. In all the battles in other jurisdictions—France, Germany, the EU, Canada, Louisiana and Utah—it all comes down to enforcement and compliance. We need to know more of what the Government hope to achieve in that area. With that, I beg leave to withdraw my amendment.
(1 year, 5 months ago)
Lords Chamber
My Lords, the Government are supportive of improving data sharing and encouraging greater collaboration between companies and researchers, subject to the appropriate safeguards. However, the data that companies hold about users can, of course, be sensitive; as such, mandating access to data that are not publicly available would be a complex matter, as noble Lords noted in their contributions. The issue must be fully thought through to ensure that the risks have been considered appropriately. I am grateful for the consideration that the Committee has given this matter.
It is because of this complexity that we have given Ofcom the task of undertaking a report on researchers’ access to information. Ofcom will conduct an in-depth assessment of how researchers can currently access data. To the point raised by the noble Lord, Lord Knight, and my noble friend Lord Bethell, let me provide reassurance that Ofcom will assess the impact of platforms’ policies that restrict access to data in this report, including where companies charge for such access. The report will also cover the challenges that constrain access to data and how such challenges might be addressed. These insights will provide an evidence base for any guidance that Ofcom may issue to help improve data access for researchers in a safe and secure way.
Amendments 230 and 231 seek to require Ofcom to publish a report into researchers’ access to data more rapidly than within the currently proposed two years. I share noble Lords’ desire to develop the evidence base on this issue swiftly, but care must be taken to balance Ofcom’s timelines to ensure that it can deliver its key priorities in establishing the core parts of the regulatory framework that the Bill will bring in; for example, the illegal content and child safety duties. Implementing these duties must be the immediate priority for Ofcom to ensure that the Bill meets its objective of protecting people from harm. It is crucial that we do not divert attention away from these areas and that we allow Ofcom to carry out this work as soon as is practicable.
Further to this, considering the complex matter of researchers’ access to data will involve consultation with interested parties, such as the Information Commissioner’s Office, the Centre for Data Ethics and Innovation, UK Research and Innovation, representatives of regulated services and others—including some of those parties mentioned by noble Lords today—as set out in Clause 146(3). This is an extremely important issue that we need to get right. Ofcom must be given adequate time to consult as it sees necessary and undertake the appropriate research.
Before the Minister succeeds in disappointing us, can he clarify something for us? Once Ofcom has published the report, it has the power to issue guidance. What requirement is there for platforms to abide by that guidance? We want there to be some teeth at the end of all this. There is a concern that a report will be issued, followed by some guidance, but that nothing much else will happen.
It is guidance rather than direction, but it will be done openly and transparently. Users will be able to see the guidance which Ofcom has issued, to see whether companies have responded to it as they see fit and, through the rest of the framework of the Bill, be empowered to make their decisions about their experiences online. This being done openly and transparently, and informed by Ofcom’s research, will mean that everyone is better informed.
We are sympathetic to the amendment. It is complex, and this has been a useful debate—
I wonder whether the Minister has an answer to the academic community, who now see their European colleagues getting ahead through being able to access data through other legislation in other parts of the world. Also, we have a lot of faith in Ofcom, but it seems a mistake to let it be the only arbiter of what needs to be seen.
We are very aware that we are not the only jurisdiction looking at the important issues the Bill addresses. The Government and, I am sure, academic researchers will observe the implementation of the European Union’s Digital Services Act with interest, including the provisions about researchers’ access. We will carefully consider any implications of our own online safety regime. As noble Lords know, the Secretary of State will be required to undertake a review of the framework between two and five years after the Bill comes into force. We expect that to include an assessment of how the Bill’s existing transparency provisions facilitate researcher access.
I do not expect the Minister to have an answer to this today, but it will be useful to get this on the record as it is quite important. Can he let us know the Government’s thinking on the other piece of the equation? We are getting the platforms to disclose the data, and an important regulatory element is the research organisations that receive it. In the EU, that is being addressed with a code of conduct, which is a mechanism enabled by the general data protection regulation that has been approved by the European Data Protection Board and creates this legal framework. I am not aware of equivalent work having been done in the UK, but that is an essential element. We do not want to find that we have the teeth to persuade the companies to disclose the data, but not the other piece we need—probably overseen by the Information Commissioner’s Office rather than Ofcom—which is a mechanism for approving researchers to receive and then use the data.
We are watching with interest what is happening in other jurisdictions. If I can furnish the Committee with any information in the area the noble Lord mentions, I will certainly follow up in writing.
I have a question, in that case, in respect of the jurisdictions. Why should we have weaker powers for our regulator than others?
I do not think that we do. We are doing things differently. Of course, Ofcom will be looking at all these matters in its report, and I am sure that Parliament will have an ongoing interest in them. As jurisdictions around the world continue to grapple with these issues, I am sure that your Lordships’ House and Parliament more broadly will want to take note of those developments.
But surely, there is no backstop power. There is the review but there is no backstop which would come into effect on an Ofcom recommendation, is there?
We will know once Ofcom has completed its research and examination of these complex issues; we would not want to prejudge its conclusions.
With that, if there are no further questions, I invite the noble Lord to withdraw his amendment.
My Lords, this was a short but important debate with some interesting exchanges at the end. The noble Baroness, Lady Harding, mentioned the rapidly changing environment generated by generative AI. That points to the need for wider ecosystem-level research on an independent basis than we fear we might get as things stand, and certainly wider than the skilled persons we are already legislating for. The noble Lord, Lord Bethell, referred to the access that advertisers already have to insight. It seems a shame that we run the risk, as the noble Baroness, Lady Kidron, pointed out, of researchers in other jurisdictions having more privileged access than researchers in this country, and therefore becoming dependent on those researchers and whistleblowers to give us that wider view. We could proceed with a report and guidance as set out in the Bill but add in some reserved powers in order to take action if the report suggests that Ofcom might need and want that. The Minister may want to reflect on that, having listened to the debate. On that basis, I am happy to beg leave to withdraw the amendment.
My Lords, I enter the fray with some trepidation. In a briefing, Carnegie, which we all love and respect, and which has been fantastic in the background during Committee days, shared some concerns. As I interpret its concerns, when Ofcom was created in 2003 its decisions could be appealed on their merits, as the noble Lord has just suggested, to the Competition Appeal Tribunal, and I believe that this was seen as a balancing measure against an untested regime. What followed was that the broad basis on which appeal was allowed led to Ofcom defending 10 appeals per year, which really frustrated its ability as a regulator to take timely decisions. It turned out that the appeals against Ofcom made up more than 80% of the workload of the Competition Appeal Tribunal, whose work was supposed to cover a whole gamut of matters. When there was a consultation in the fringes of the Digital Economy Act, it was decided to restrict appeals to judicial review and appeal on process. I just want to make sure that we are not opening up a huge and unnecessary delaying tactic.
I thank all those who have spoken, and I very much appreciate the spirit in which the amendments were tabled. They propose changes to the standard of appeal, the standing to appeal and the appeals process itself. The Government are concerned that enabling a review of the full merits of cases, as proposed by Amendments 243 and 245, could prove burdensome for the courts and the regulator, since a full-merits approach, as we have been hearing, has been used by regulated services in other regulatory regimes to delay intervention, undermining the effectiveness of the enforcement process. With deep-pocketed services in scope, allowing for a full-merits review could incentivise speculative appeals, both undermining the integrity of the system and slowing the regulatory process.
While the Government are fully committed to making sure that the regulator is properly held to account, we feel that there is not a compelling case for replacing the decisions of an expert and well-resourced regulator with those of a tribunal. Ofcom will be better placed to undertake the complex analysis, including technical analysis, that informs regulatory decisions.
Amendment 245 would also limit standing and leave to appeal only to providers and those determined eligible entities to make super-complaints under Clause 150. This would significantly narrow the eligibility requirements for appeals. For appeals against Ofcom notices we assess that the broader, well-established standard in civil law of sufficient interest is more appropriate. Super-complaints fulfil a very different function from appeals. Unlike appeals, which will allow regulated services to challenge decisions of the regulator, super-complaints will allow organisations to advocate for users, including vulnerable groups and children, to ensure that systemic issues affecting UK users are brought to Ofcom’s attention. Given the entirely distinct purposes of these functions, it would be inappropriate to impose the eligibility requirements for super-complaints on the appeals system.
I am also concerned about the further proposal in Amendment 245 to allow the tribunal to replace Ofcom’s decision with its own. Currently, the Upper Tribunal is able to dismiss an appeal or quash Ofcom’s decision. Quashed decisions must be remitted to Ofcom for reconsideration, and the tribunal may give directions that it considers appropriate. Amendment 245 proposes instead allowing the Upper Tribunal to
“impose or revoke, or vary the amount of, a penalty … give such directions or take such other steps as OFCOM could itself have given or taken, or … make any other decision which OFCOM could itself have made”.
The concern is that this risks undermining Ofcom’s independence and discretion in applying its powers and issuing sanctions, and challenging the regulator’s credibility and authority. It may also further incentivise well-resourced providers to appeal opportunistically, with a view to securing a more favourable outcome at a tribunal.
On that basis, I fear that the amendments tabled by the noble Lord would compromise the fundamental features of the current appeals provisions, without any significant benefits, and risk introducing a range of inadvertent consequences. We are confident that the Upper Tribunal’s judicial review process, currently set out in the Bill, provides a proportionate, effective means of appeal that avoids unnecessary expense and delays, while ensuring that the regulator’s decisions can be thoroughly scrutinised. It is for these reasons that I hope the noble Baroness will withdraw the amendment.
My Lords, I am grateful to the Minister. I will take that as a no—but a very well-considered no, for which I thank him. I say to the noble Lord, Lord Clement-Jones, that we certainly would not wish to make him feel uncomfortable at any time. I am grateful to him and the noble Baroness, Lady Kidron, for their contributions. As I said at the outset, this amendment was intended to probe the issue, which I feel we have done. I certainly would not want to open a can of worms—online, judicial or otherwise. Nor would I wish, as the Minister suggested, to undermine the work, efficiency and effectiveness of Ofcom. I am glad to have had the opportunity to present these amendments. I am grateful for the consideration of the Committee and the Minister, and with that I beg leave to withdraw.
My Lords, this has been a broad and mixed group of amendments. I will be moving the amendments in my name, which are part of it. These introduce the new offence of encouraging or assisting serious self-harm and make technical changes to the communications offences. If there can be a statement covering the group and the debate we have had, which I agree has been well informed and useful, it is that this Bill will modernise criminal law for communications online and offline. The new offences will criminalise the most damaging communications while protecting freedom of expression.
Amendments 264A, 266 and 267, tabled by the noble Lord, Lord Clement-Jones, and my noble friend Lady Buscombe, would expand the scope of the false communications offence to add identity theft and financial harm to third parties. I am very grateful to them for raising these issues, and in particular to my noble friend Lady Buscombe for raising the importance of financial harm from fake reviews. This will be addressed through the Digital Markets, Competition and Consumers Bill, which was recently introduced to Parliament. That Bill proposes new powers to address fake and misleading reviews. This will provide greater legal clarity to businesses and consumers. Where fake reviews are posted, it will allow the regulator to take action quickly. The noble Baroness is right to point out the specific scenarios about which she has concern. I hope she will look at that Bill and return to this issue in that context if she feels it does not address her points to her satisfaction.
Identity theft is dealt with by the Fraud Act 2006, which captures those using false identities for their own benefit. It also covers people selling or using stolen personal information, such as banking information and national insurance numbers. Adding identity theft to the communications offences here would duplicate existing law and expand the scope of the offences too broadly. Identity theft, as the noble Lord, Lord Clement-Jones, noted, is better covered by targeted offences rather than communications offences designed to protect victims from psychological and physical harm. The Fraud Act is more targeted and therefore more appropriate for tackling these issues. If we were to add identity theft to Clause 160, we would risk creating confusion for the courts when interpreting the law in these areas—so I hope the noble Lord will be inclined to side with clarity and simplicity.
Amendment 265, tabled by my noble friend Lord Moylan, gives me a second chance to consider his concerns about Clause 160. The Government believe that the clause is necessary and that the threshold of harm strikes the right balance, robustly protecting victims of false communications while maintaining people’s freedom of expression. Removing “psychological” harm from Clause 160 would make the offence too narrow and risk excluding communications that can have a lasting and serious effect on people’s mental well-being.
But psychological harm is only one aspect of Clause 160; all elements of the offence must be met. This includes a person sending a knowingly false message with an intention to cause non-trivial harm, and without reasonable excuse. It has also been tested extensively as part of the Law Commission’s report Modernising Communications Offences, when determining what the threshold of harm should be for this offence. It thus sets a high bar for prosecution, whereby a person cannot be prosecuted solely on the basis of a message causing psychological harm.
The noble Lord, Lord Allan, rightly recalled Section 127 of the Communications Act and the importance of probing issues such as this. I am glad he mentioned the Twitter joke trial—a good friend of mine acted as junior counsel in that case, so I remember it well. I shall spare the blushes of the noble Baroness, Lady Merron, in recalling who the Director of Public Prosecutions was at the time. But it is important that we look at these issues, and I am happy to speak further with my noble friend Lord Moylan and the noble Baroness, Lady Fox, about this and their broader concerns about freedom of expression between now and Report, if they would welcome that.
My noble friend Lord Moylan said that it would be unusual, or novel, to criminalise lying. The offence of fraud by false representation already makes it an offence dishonestly to make a false representation—to breach the ninth commandment—with the intention of making a gain or causing someone else a loss. So, as my noble and learned friend Lord Garnier pointed out, there is a precedent for lies with malicious and harmful intent being criminalised.
Amendments 267AA, 267AB and 268, tabled by my noble friend Lady Buscombe and the noble Baroness, Lady Kennedy of The Shaws, take the opposite approach to those I have just discussed, as they significantly lower and expand the threshold of harm in the false and threatening communications offences. The first of these would specify that a threatening communications offence is committed even if someone encountering the message did not fear that the sender specifically would carry out the threat. I am grateful to the noble Baroness for her correspondence on this issue, informed by her work in Scotland. The test here is not whether a message makes a direct threat but whether it conveys a threat—which can certainly cover indirect or implied threats.
I reassure the noble Baroness and other noble Lords that Clause 162 already captures threats of “death or serious harm”, including rape and disfigurement, as well as messages that convey a threat of serious harm, including rape and death threats, or threats of serious injury amounting to grievous bodily harm. If a sender has the relevant intention or recklessness, the message will meet the required threshold. But I was grateful to see my right honourable friend Edward Argar watching our debates earlier, in his capacity as Justice Minister. I mentioned the matter to him and will ensure that his officials have the opportunity to speak to officials in Scotland to look at the work being done with regard to Scots law, and to follow the points that the noble Baroness, Lady Bennett, made about pictures—
I am grateful to the Minister. I was not imagining that the formulations that I played with fulfilled all of the requirements. Of course, as a practising lawyer, I am anxious that we do not diminish standards. I thank the noble Baroness, Lady Fox, for raising concerns about freedom of speech, but this is not about telling people that they are unattractive or ugly, which is hurtful enough to many women and can have very deleterious effects on their self-confidence and willingness to be public figures. Actually, I put the bar reasonably high in describing the acts that I was talking about: threats that somebody would kill, rape, bugger or disfigure you, or do whatever to you. That was the shocking thing: the evidence showed that it was often at that high level. It is happening not just to well-known public figures, who can become somewhat inured to this because they can find a way to deal with it; it is happening to schoolgirls and young women in universities, who get these pile-ons as well. We should reckon with the fact that it is happening on a much wider basis than many people understand.
Yes, we will ensure that, in looking at this in the context of Scots law, we have the opportunity to see what is being done there and that we are satisfied that all the scenarios are covered. In relation to the noble Baroness’s Amendment 268, the intentional encouragement or assistance of a criminal offence is already captured under Sections 44 to 46 of the Serious Crime Act 2007, so I hope that that satisfies her that that element is covered—but we will certainly look at all of this.
I turn to government Amendment 268AZA, which introduces the new serious self-harm offence, and Amendments 268AZB and 268AZC, tabled by the noble Lords, Lord Allan and Lord Clement-Jones. The Government recognise that there is a gap in the law in relation to the encouragement of non-fatal self-harm. The new offence will apply to anyone carrying out an act which intends to, and is capable of, encouraging or assisting another person seriously to self-harm by means of verbal or electronic communications, publications or correspondence.
I say to the noble Baroness, Lady Finlay of Llandaff, that the new clause inserted by Amendment 268AZA makes clear that sending or publishing such a communication is an offence, and that, when a person forwards on another person’s communication, that will be an offence too. The new offence will capture only the most serious behaviour and avoid criminalising vulnerable people who share their experiences of self-harm. The preparation of these clauses was informed by extensive consultation with interested groups and campaign bodies. The new offence includes two key elements that constrain the offence to the most culpable offending; namely, that a person’s act must be intended to encourage or assist the serious self-harm of another person and that serious self-harm should amount to grievous bodily harm. If a person does not intend to encourage or assist serious self-harm, as will likely be the case with recovery and supportive material, no offence will be committed. The Law Commission looked at this issue carefully, following evidence from the Samaritans and others, and the implementation will be informed by an ongoing consultation as well.
I am sorry to interrupt the Minister, but the Law Commission recommended that the DPP’s consent should be required. The case that the Minister has made on previous occasions in some of the consultations that he has had with us is that this offence that the Government have proposed is different from the Law Commission one, and that is why they have not included the DPP’s consent. I am rather baffled by that, because the Law Commission was talking about a high threshold in the first place, and the Minister is talking about a high threshold of intent. Even if he cannot do so now, it would be extremely helpful to tie that down. As the noble Baroness and my noble friend said, 130 organisations are really concerned about the impact of this.
The Law Commission recommended that the consent, but not the personal consent, of the Director of Public Prosecutions should be required. We believe, however, that, because the offence already has tight parameters due to the requirement for an intention to cause serious self-harm amounting to grievous bodily harm, as I have just outlined, an additional safeguard of obtaining the personal consent of the Director of Public Prosecutions is not necessary. We would expect the usual prosecutorial discretion and guidance to provide sufficient safeguards against inappropriate prosecutions in this area. As I say, we will continue to engage with those groups that have helped to inform the drafting of these clauses as they are implemented to make sure that that assessment is indeed borne out.
I will follow up in writing on that point.
Before I conclude, I will mention briefly the further government amendments in my name, which make technical and consequential amendments to ensure that the communications offences, including the self-harm offence, have the appropriate territorial extent. They also set out the respective penalties for the communications offences in Northern Ireland, alongside a minor adjustment to the epilepsy trolling offence, to ensure that its description is more accurate.
I hope that noble Lords will agree that the new criminal laws that we will make through this Bill are a marked improvement on the status quo. I hope that they will continue to support the government amendments. I express my gratitude to the Law Commission and to all noble Lords—
Just before the Minister sits down—I assume that he has finished his brief on the self-harm amendments; I have been waiting—I have two questions relating to what he said. First, if I heard him right, he said that the person forwarding on is also committing an offence. Does that also apply to those who set up algorithms that disseminate, as opposed to one individual forwarding on to another individual? Those are two very different scenarios. We can see how one individual forwarding to another could be quite targeted and malicious, and we can see how disseminating through an algorithm could have very widespread harms across a lot of people in a lot of different groups—all types of groups—but I am not clear from what he said that that has been caught in his wording.
Secondly—I will ask both questions while I can—I asked the Minister previously why there have been no prosecutions under the Suicide Act. I understood from officials that this amendment creating an offence was to reflect the Suicide Act and that suicide was not included in the Bill because it was already covered as an offence by the Suicide Act. Yet there have been no prosecutions and we have had deaths, so I do not quite understand why I have not had an answer to that.
I will have to write on the second point to try to set that out in further detail. On the question of algorithms, the brief answer is no, algorithms would not be covered in the way a person forwarding on a communication is covered unless the algorithm has been developed with the intention of causing serious self-harm; it is the intention that is part of the test. If somebody creates an algorithm intending people to self-harm, that could be captured, but if it is an algorithm generally passing it on without that specific intention, it may not be. I am happy to write to the noble Baroness further on this, because it is a good question but quite a technical one.
It needs to be addressed, because these very small websites already alluded to are providing some extremely nasty stuff. They are not providing support to people and helping decrease the amount of harm to those self-harming but seem to be enjoying the spectacle of it. We need to differentiate and make sure that we do not inadvertently let one group get away with disseminating very harmful material simply because it has a small website somewhere else. I hope that will be included in the Minister’s letter; I do not expect him to reply now.
Some of us are slightly disappointed that my noble friend did not respond to my point on the interaction of Clause 160 with the illegal content duty. Essentially, what appears to be creating a criminal offence could simply be a channel for hyperactive censorship on the part of the platforms to prevent the criminal offence taking place. He has not explained that interaction. He may say that there is no interaction and that we would not expect the platforms to take any action against offences under Clause 160, or that we expect a large amount of action, but nothing was said.
If my noble friend will forgive me, I had better refresh my memory of what he said—it was some time ago—and follow up in writing.
My Lords, I will be extremely brief. There is much to chew on in the Minister’s speech and this was a very useful debate. Some of us will be happier than others; the noble Baroness, Lady Buscombe, will no doubt look forward to the digital markets Bill and I will just have to keep pressing the Minister on the Data Protection and Digital Information Bill.
There is a fundamental misunderstanding about digital identity theft. It will not necessarily always be fraud that is demonstrated—the very theft of the identity is designed to be the crime, and it is not covered by the Fraud Act 2006. I am delighted that the Minister has agreed to talk further with the noble Baroness, Lady Kennedy, because that is a really important area. I am not sure that my noble friend will be that happy with the response, but he will no doubt follow up with the Minister on his amendments.
The Minister made a very clear statement on the substantive aspect of the group, the new crime of encouraging self-harm, but further clarification is still needed. We will look very carefully at what he said in relation to what the Law Commission recommended, because it is really important that we get this right. I know that the Minister will talk further with the noble Baroness, Lady Finlay, who is very well versed in this area. In the meantime, I beg leave to withdraw my amendment.
My Lords, this is a real hit-and-run operation from the noble Lord, Lord Stevenson. He has put down an amendment on my favourite subject in the last knockings of the Bill. It is totally impossible to deal with this now—I have been thinking and talking about the whole area of AI governance and ethics for the past seven years—so I am not going to try. It is important, and the advisory committee under Clause 139 should take it into account. Actually, this is much more a question of authenticity and verification than of content. Trying to work out whether something is ChatGPT or GPT-4 content is a hopeless task; you are much more likely to be able to identify whether these are automated users such as chatbots than you are to know about the content itself.
I will leave it there. I missed the future-proofing debate, which I would have loved to have been part of. I look forward to further debates with the noble Viscount, Lord Camrose, on the deficiencies in the White Paper and to the Prime Minister’s much more muscular approach to AI regulation in future.
I am sure that the noble Lord, Lord Stevenson of Balmacara, is smiling over a sherry somewhere about the debate he has facilitated. His is a useful probing amendment and we have had a useful discussion.
The Government certainly recognise the potential challenges posed by artificial intelligence and digitally manipulated content such as deepfakes. As we have heard in previous debates, the Bill ensures that machine-generated content on user-to-user services created by automated tools or machine bots will be regulated where appropriate. Clause 49(4)(b) means that machine-generated content is regulated unless the bot or automated tool producing the content is controlled by the provider of the service.
The labelling of this content via draft legislation is not something to which I can commit today. The Government’s AI regulation White Paper sets out the principles for the responsible development of artificial intelligence in the UK. These principles, such as safety, transparency and accountability, are at the heart of our approach to ensuring the responsible development and use of AI. As set out in the White Paper, we are building an agile approach that is designed to be adaptable in response to emerging developments. We do not wish to introduce a rigid, inflexible form of legislation for what is a flexible and fast-moving technology.
The public consultation on these proposals closed yesterday so I cannot pre-empt our response to it. The Government’s response will provide an update. I am joined on the Front Bench by the Minister for Artificial Intelligence and Intellectual Property, who is happy to meet with the noble Baroness, Lady Kidron, and others before the next stage of the Bill if they wish.
Beyond labelling such content, I can say a bit to make it clear how the Bill will address the risks coming from machine-generated content. The Bill already deals with many of the most serious and illegal forms of manipulated media, including deepfakes, when they fall within scope of services’ safety duties regarding illegal content or content that is potentially harmful to children. Ofcom will recommend measures in its code of practice to tackle such content, which could include labelling where appropriate. In addition, the intimate image abuse amendments that the Government will bring forward will make it a criminal offence to send deepfake images.
In addition to ensuring that companies take action to keep users safe online, we are taking steps to empower users with the skills they need to make safer choices through our work on media literacy. Ofcom, for example, has an ambitious programme of work through which it is funding several initiatives to build people’s resilience to harm online, including initiatives designed to equip people with the skills to identify disinformation. We are keen to continue our discussions with noble Lords on media literacy and will keep an open mind on how it might be a tool for raising awareness of the threats of disinformation and inauthentic content.
With gratitude to the noble Lords, Lord Stevenson and Lord Knight, and everyone else, I hope that the noble Lord, Lord Knight, will be content to withdraw his noble friend’s amendment.
My Lords, I am grateful to everyone for that interesting and quick debate. It is occasionally one’s lot that somebody else tables an amendment but is unavoidably detained in Jerez, drinking sherry, and monitoring things in Hansard while I move the amendment. I am perhaps more persuaded than my noble friend might have been by the arguments that have been made.
We will return to this in other fora in response to the need to regulate AI. However, in the meantime, I enjoyed in particular the John Booth quote from the noble Baroness, Lady Bennett. In respect of this Bill and any of the potential harms around generative AI, if we have a Minister who is mindful of the need for safety by design when we have concluded this Bill then we will have dealt with the bits that we needed to deal with as far as this Bill is concerned.
My Lords, what more can I say than that I wish to be associated with the comments made by the noble Baroness and then by the noble Lord, Lord Clement-Jones? I look forward to the Minister’s reply.
I am very grateful to the noble Baroness for her amendment, which is a useful opportunity for us to state publicly and share with the Committee the progress we have been making in our helpful discussions on these issues. I am very grateful to her and to my noble friends Lord Bethell and Lady Harding for speaking as one on this, as was well illustrated in this short debate this evening.
As the noble Baroness knows, discussions continue on the precise wording of these definitions. I share her optimism that we will be able to reach agreement on a suitable way forward, and I look forward to working with her, my noble friends and others as we do so.
The Bill already includes a definition of age assurance in Clause 207, which is
“measures designed to estimate or verify the age or age-range of users of a service”.
As we look at these issues, we want to avoid using words such as “checking”, which suggests that providers need to take a proactive approach to checking age, as that may inadvertently preclude the use of technologies which determine age through other means, such as profiling. It is also important that any definition of age assurance does not restrict the current and future use of innovative and accurate technologies. I agree that it is important that there should be robust definitions for terms which are not currently defined in the Bill, such as age verification, and recommit to the discussions we continue to have on what terms need to be defined and the best way to define them.
This has been a very helpful short debate with which to end our deliberations in Committee. I am very grateful to noble Lords for all the points that have been raised over the past 10 days, and I am very glad to be ending in this collaborative spirit. There is much for us still to do, and even more for the Office of the Parliamentary Counsel to do, before we return on Report, and I am grateful to it and to the officials working on the Bill. I urge the noble Baroness to withdraw her amendment.
(1 year, 5 months ago)
Lords Chamber
My Lords, I am pleased that we are on Report, and I thank all noble Lords who took part in Committee and those with whom I have had the pleasure of discussing issues arising since then, particularly for their constructive and collaborative nature, which we have seen throughout the passage of the Bill.
In Committee, I heard the strength of feeling and the desire for an introductory clause. It was felt that this would help make the Bill less complex to navigate and make it less easy for providers to use this complexity to try to evade their duties under it. I have listened closely to these concerns and thank the noble Lord, Lord Stevenson of Balmacara, the noble Baroness, Lady Merron, and others for their work on this proposal. I am particularly grateful for their collaborative approach to ensuring the new clause has the desired effect without causing legal uncertainty. In that spirit, I am pleased to introduce government Amendment 1. I am grateful too to the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, who have signed their names to it. That is a very good start to our amendments here on Report.
Amendment 1 inserts an introductory clause at the start of the Bill, providing an overarching statement about the main objectives of the new regulatory framework. The proposed new clause describes the main broad objectives of the duties that the Bill imposes on providers of regulated services, and notes that the Bill confers new functions and powers on Ofcom.
The clause makes clear that regulated services must identify, mitigate and manage risks that particularly affect people with a certain characteristic. This recognises that people with certain characteristics, or more than one such characteristic, are disproportionately affected by online harms and that providers must account for and protect them from this. The noble Baroness, Lady Merron, raised the example of Jewish women, as did the noble Baroness, Lady Anderson of Stoke-on-Trent. Sadly, they have first-hand experience of the extra levels of abuse and harm that some groups of people can face when they have more than one protected characteristic. It could just as easily be disabled women or queer people of colour. The noble Baroness, Lady Merron, has tabled several amendments highlighting this problem, which I will address further in response to the contribution I know she will make to this debate.
Subsection (3) of the proposed new clause outlines the main outcomes that the duties in the Bill seek to secure. It is a fundamental principle of the legislation that the design of services can contribute to the risk of users experiencing harm online. I thank the noble Lord, Lord Russell of Liverpool, for continuing to raise this issue. I am pleased to confirm that this amendment will state clearly that a main outcome of the legislation is that services must be safe by design. For example, providers must choose and design their functionalities so as to limit the risk of harm to users. I know this is an issue to which we will return later on Report, but I hope this provides reassurance about the Government’s intent and the effect of the Bill’s framework.
Services must also be designed and operated in a way which ensures that a higher standard of protection is provided for children than for adults, that users’ rights to freedom of expression and privacy are protected and that transparency and accountability are enhanced. It should be noted that we have worked to ensure that this clause provides clarity to those affected by the Bill without adversely affecting the interpretation or effect of the substantive provisions of the rest of the Bill. As we debated in Committee, this is of the utmost importance, to ensure that this clause does not create legal uncertainty or risk with the interpretation of the rest of the Bill’s provisions.
I hope that your Lordships will welcome this amendment and I beg to move.
Amendment 2 (to Amendment 1)
My Lords, needless to say, I disagree with what the noble Lord, Lord Moylan, has just been saying precisely because I believe that the new clause that the Minister has put forward, which I have signed and has support across the House, expresses the purpose of the Bill in the way that the original Joint Committee wanted. I pay tribute to the Minister, who I know has worked extremely hard, in co-operation with the noble Lord, Lord Stevenson of Balmacara, to whom I also pay tribute for getting to grips with a purpose clause. The noble Baronesses, Lady Kidron and Lady Harding, have put their finger on it: this is more about activity and design than it is about content, and that is the reason I fundamentally disagree with the noble Lord, Lord Moylan. I do not believe that will be the impact of the Bill; I believe that this is about systemic issues to do with social media, which we are tackling.
I say this slightly tongue-in-cheek, but if the Minister had followed the collective wisdom of the Joint Committee originally, perhaps we would not have worked at such breakneck speed to get everything done for Report stage. I believe that the Bill team and the Minister have worked extremely hard in a very few days to get to where we are on many amendments that we will be talking about in the coming days.
I also want to show my support for the noble Baroness, Lady Merron. I do not believe it is just a matter of the Interpretation Act; I believe this is a fundamental issue and I thank her for raising it, because it was not something that was immediately obvious. The fact is that a combination of characteristics is a particular risk in itself; it is not just about having several different characteristics. I hope the Minister reflects on this and can give a positive response. That will set us off on a very good course for the first day of Report.
My Lords, this has indeed set us on a good course, and I am grateful to noble Lords for their questions and contributions. I apologise to my noble friend Lord Moylan, with whom I had the opportunity to discuss a number of issues relating to freedom of expression on Monday. We had tabled this amendment, and I apologise if I had not flagged it and sought his views on it explicitly, though I was grateful to him and the noble Baroness, Lady Fox of Buckley, for their time in discussing the issues of freedom of expression more broadly.
I am grateful to my noble friend Lady Harding and to the noble Baroness, Lady Kidron, for their tireless work over many months on this Bill and for highlighting the importance of “content” and “activity”. Both terms have been in the Bill since its introduction, for instance in Clauses 5(2) and (3), but my noble friend Lady Harding is right to highlight it in the way that she did. The noble Baroness, Lady Kidron, asked about the provisions on safety by design. The statement in the new clause reflects the requirements throughout the Bill to address content and activity and ensure that services are safe by design.
On the amendments tabled by the noble Baroness, Lady Merron, which draw further attention to people who have multiple characteristics and suffer disproportionately because of it, let me start by saying again that the Government recognise that this is, sadly, the experience for many people online, and that people with multiple characteristics are often at increased risk of harm. The Bill already accounts for this, and the current drafting captures people with multiple characteristics because of Section 6 of the Interpretation Act 1978. As she says, this was a new one to me—other noble Lords may be more familiar with this legacy of the Callaghan Government—but it does mean that, when interpreting statute, words in the singular include the plural and words in the plural include the singular.
If we simply amended the references that the noble Baroness highlights in her amendments, we would risk some uncertainty about what those provisions cover. I sympathise with the concern which lies behind her amendments, and I am grateful for her time in discussing this matter in detail. I agree that it would be helpful to make it clearer that the Bill is designed to protect people with multiple characteristics. This clause is being inserted to give clarity, so we should seek to do that throughout.
We have therefore agreed to add a provision in Clause 211—the Bill’s interpretation clause—to make clear that all the various references throughout the Bill to people with a certain characteristic include people with a combination of characteristics. This amendment was tabled yesterday and will be moved at a later day on Report, so your Lordships’ House will have an opportunity to look at and vote on that. I hope that that provision clarifies the intention of the wording used in the Bill and puts the issue beyond doubt. I hope that the noble Baroness will be satisfied, and I am grateful to all noble Lords for their support on this first amendment.
My Lords, I am grateful to the Minister for his response. It is a very practical response and certainly one that I accept as a way forward. I am sure that the whole House is glad to hear of his acknowledgement of the true impact that having more than one protected characteristic can have, and of his commitment to wanting the Bill to do the job it is there to do. With that, I am pleased to withdraw the amendment in my name.
My Lords, this has been an interesting debate that in a curious way moves us from the debate on the first group, which was about the high level of aspiration for this Bill, for the work of those involved in it and indeed for Parliament as a whole, down to some of the nitty-gritty points that emerge from some of the Bill’s proposals. I am very much looking forward to the Minister’s response.
In a sense, where the noble Lord, Lord Clement-Jones, ends, I want to start. The noble and learned Lord, Lord Garnier, did a good job of introducing the points made previously by his colleague, the noble Baroness, Lady Buscombe, in relation to those unfortunate exercises of public comment on businesses, and indeed individuals, that have no reason to receive them. There does not seem to be a satisfactory sanction for that. In a sense he was drawn by the overarching nature of Clause 1, but I think we have established between us that Clause 1 does not have legal effect in the way that he would like, so we would probably need to move further forward. The Government probably need to pick up his points in relation to some of the issues that are raised further down, because they are in fact not dissimilar and could be dealt with.
The key issue is the one that my noble friend Lady Kennedy ended on, in the sense that the law online and the law offline, as mentioned by the noble Lord, Lord Clement-Jones, seem to be at variance about what you can and cannot do in relation to threats issued, whether or not they are general, to a group or groups in society. This is a complex area that needs further thought of the nature that has been suggested, and may well refer back to the points made by the noble Baroness, Lady Morgan. There is something here that we are not tackling correctly. I look forward to the Government’s response. We would support movement in that area should that agreement be made.
Unfortunately, the noble Lord, Lord Russell, whom I am tempted to call my noble friend because he is a friend, has just moved out of his seat—I do not need to give him a namecheck any more—but he and I went to a meeting yesterday, I think, although I have lost track of time. It was called by Luke Pollard MP and related to the incel movement or, as the meeting concluded, what we should call the alleged incel movement, because by giving it a name we somehow give it a position. I wanted to make that point because a lot of what we are talking about here is in the same territory. It was an informal research-focused meeting to hear all the latest research being done on the group of activities going under the name of the alleged incel movement.
I mention that because it plays into a lot of the discussion here. The way in which those who organise it do so—the name Andrew Tate has already been mentioned—was drawn into the debate in a much broader context by that research, particularly because representatives from the Home Office made the interesting point that the process by which the young men who are involved in this type of activity are groomed to join groups and are told that by doing so they are establishing a position that has been denied to them by society in general, and allegedly by women in particular, is very similar to the methods used by those who are cultivating terrorism activity. That may seem to be a big stretch but it was convincing, and the argument and debate around that certainly said to me that there are things operating within the world of social media, with its ability to reach out to those who often feel alone, even if they are not, and who feel ignored, and to reach them in a way that causes them to overreact in the way they deal with the issues they face.
That point was picked up by others, including my noble friend Lady Kennedy and the noble Baroness, Lady Burt, in relation to the way in which the internet itself is in some way gendered against women. I do not in any sense want to apportion blame anywhere for that; it is a much more complex issue than single words can possibly address, but it needs to be addressed. As was said in the meeting and has been said today, there are cultural, educational and holistic aspects here. We really do not tackle the symptoms or the effects of it, but we should also look at what causes people to act in the way they have because of, or through the agency of, the internet.
Having said that, I support the amendments from the noble Lord, Lord Allan, and I look forward to the Government’s response to them. Amendment 5B raises the issue that it will be detrimental to society if people stop posting and commenting on things because they fear that they will be prosecuted—or not even prosecuted but attacked. The messages that they want to share will be lost as a result, and that is a danger that we do not want to encourage. It will be interesting to hear the Minister’s response to that.
The noble Baroness, Lady Burt, made powerful points about the way in which the offence of cyberflashing is going to be dealt with, and the differences between that and the intimate image abuse that we are coming on to in the next group. It may well be that this is the right way forward, and indeed we support the Government in the way that they are going, but it is important to recognise her point that we need a test of whether it is working. The Government may well review the impact of the Bill in the normal way of things, but this aspect needs particular attention; we need to know whether there are prosecutions and convictions and whether people understand the implication of the change in practice. We need publicity, as has been said, otherwise it will not be effective in any case. These issues, mentioned by the noble Baroness, Lady Burt, and picked up by the noble Baroness, Lady Morgan, are important. We will have other opportunities to discuss them, but at this stage we should at least get a response to that.
If it is true that in Northern Ireland there is now a different standard for the way in which cyberflashing offences are to be undertaken—taking into account the points made very well by the noble Baroness, Lady Fox, and the worry about encouraging more offences for which crimes may not necessarily be appropriate at this stage, particularly the one about recklessness—do the Government not have a slight problem here? In the first case, do we really accept that we want differences between the various regions and nations of our country in these important issues? We support devolution but we also need to have a sense of what the United Kingdom as a whole stands for in its relationship with these types of criminal offence, if they are criminal. If that happens, do we need a better understanding of why one part of the country has moved in a particular way, and is that something that we are missing in picking up action that is perhaps necessary in other areas? As my noble friend Lady Kennedy has also said, some of the work she has been doing in Scotland is ahead of the work that we have been doing in this part of the United Kingdom, and we need to pick up the lessons from that as well.
As I said at the beginning, this is an interesting range of amendments. They are not as similar as the grouping might suggest, but they point in a direction that needs government attention, and I very much look forward to the Minister’s comments on them.
I am grateful to my noble friends Lady Buscombe and Lord Leicester and my noble and learned friend Lord Garnier for the amendments that they have tabled, with which we began this helpful debate, as well as for their time earlier this week to discuss them. We had a good debate on this topic in Committee and I had a good discussion with my noble friend Lady Buscombe and my noble and learned friend Lord Garnier on Monday. I will explain why the Government cannot accept the amendments that they have brought forward today.
I understand my noble friends’ concerns about the impact that fake reviews can have on businesses, but the Bill and the criminal offences it contains are not the right place to address this issue. The amendments would broaden the scope of the offences and likely result in overcriminalisation, which I know my noble friends would not want to see.
I appreciate the Minister’s response. Could he also respond to my suggestion that it would be helpful for some of the people working on the front line to meet officials to go through their concerns in more detail?
I am very happy to make that commitment. It would be useful to have their continued engagement, as we have had throughout the drafting of the Bill.
The noble Baroness, Lady Burt of Solihull, has tabled a number of amendments related to the new offence of cyberflashing. I will start with her Amendment 6. We believe that this amendment reduces the threshold of the new offence to too great an extent. It could, for example, criminalise a person sending a picture of naked performance art to a group of people, where one person might be alarmed by the image but the sender sends it anyway because he or she believes that it would be well received. That may be incorrect, unwise and insensitive, but we do not think it should carry the risk of being convicted of a serious sexual offence.
Crucially, the noble Baroness’s amendment requires that the harm against the victim be proven in court. Not only does this add an extra step for the prosecution to prove in order for the perpetrator to be convicted, but it also creates an undue burden on the victim, who would be cross-examined about his or her—usually her—experience of harm. For example, she might have to explain why she felt humiliated; this in itself could be retraumatising and humiliating for the victim. By contrast, Clause 170 as drafted means that the prosecution has only to prove and focus on the perpetrator’s intent.
I am very grateful for the Minister’s comments. This is the crux of my confusion: I am not entirely sure why it is necessary for the victim to appear in court. In intimate image abuse, is it not the case that the victim does not have to make an appearance in court? What is the difference between intimate image abuse and cyberflashing abuse? I do not get why one attracts a physical court appearance and the other does not. They seem to be different sides of the same coin to me.
If a defendant said that he—usually he—had sent an image believing that the consent of the recipient was implied, the person making the complaint would be cross-examined on whether or not she had indeed given that consent. If an offence predicated on proof of non-consent or proof of harm were made out, the victim could be called to give evidence and be cross-examined in court. The defence would be likely to lead evidence challenging the victim’s characteristics and credibility. We do not want that to be a concern for victims; we do not want that to be a barrier to victims coming forward and reporting abuse for fear of having their sexual history or intentions cross-examined.
My Lords, we are coming to this in the next group, but that is a consent-based offence, is it not?
It is—and I shall explain more in that group why we take that approach. But the offence of cyberflashing matches the existing offence of flashing, which is not a consent-based offence. If somebody flashes at someone in public, it does not matter whether the person who sees that flashing has consented to it—it is the intent of the flasher that is the focus of the court. That is why the Law Commission and we have brought the cyberflashing offence forward in the same way, whereas the sharing of intimate images without somebody’s consent relies on the consent to sharing. But I shall say a bit more when we get to that group, if the noble Lord will allow.
I am sure that the noble and learned Lord, Lord Garnier, is going to come in, and he knows a great deal more about this than I do. But we are getting into the territory where we talk about whether or not somebody needs to appear in court in order to show consent. That was all that I was trying to point out, in a way: if accepting my noble friend’s amendment would mean that the complainant had to appear in court, why is that not the case with intimate image abuse?
Perhaps I can respond to the point about intimate abuse when we come on to the next group—that might be helpful.
It might be helpful—except for the refusal to accept my noble friend’s amendment.
If the defendant said that they had sent an image because they thought that consent had been obtained, the person whose consent was under question would find themselves cross-examined on it in a way that we do not want to see. We do not want that to be a barrier to people reporting this, in the same way that it is not for people who report flashing on the streets.
My Lords, I do not want to interfere in private grief, but the courts have powers to protect witnesses, particularly in cases where they are vulnerable or will suffer acute distress, by placing screens in the way and controlling the sorts of cross-examinations that go on. I accept the concern expressed by the noble Baroness, Lady Burt, but I think that my noble friend the Minister will be advised that there are protective measures in place already for the courts to look after people of the sort that she is worried about.
There are indeed but, as my noble and learned friend’s interjection makes clear, those are still means for people to be cross-examined and give their account in court, even with those mitigations and protections. That is really the crux of the issue here.
We have already debated the risk that the approach that the noble Baroness sets out in her Amendments 5C and 7A criminalises sending messages, and people whom we would not deem to be criminal. I want to reassure her and your Lordships’ House that the intent-based offence, as drafted at Clause 170, provides the comprehensive protections for victims that we all want to see, including situations where the perpetrator claims it was “just for a joke”. The offence is committed if a perpetrator intended to cause humiliation, and that captures many supposed “joke” motives, as the perverted form of humour in this instance is often derived from the victim’s humiliation, alarm or distress.
Indeed, it was following consultation with victims’ groups and others that the Law Commission added humiliation as a form of intent to the offence to address those very concerns. Any assertions made by a defendant in this regard would not be taken at face value but would be considered and tested by the police and courts in the usual way, alongside the evidence. The Crown Prosecution Service and others are practised in prosecuting intent, and juries and magistrates may infer intention from the context of the behaviour and its foreseeable consequences.
The addition of defences, as the noble Baroness suggests in her Amendment 7A, is unfortunately still not sufficient to ensure that we are not overcriminalising here. Even with the proposed defences, sending a picture of genitalia without consent for medical reasons would still risk being considered a criminal act and potentially compel a medical professional to justify that he or she has an adequate defence.
It is about the burden on the medical professionals and the question of whether it comes to court when the police investigate it and the prosecution case is made out. We do not want to see that sort of behaviour being overly criminalised or the risk of prosecution hanging over people where it is not needed. We want to make sure that the offence is focused on the behaviour that we all want to tackle here.
The Law Commission has looked at this extensively—and I am glad the noble Baroness has had the opportunity to speak to it directly—and brought forward these proposals, which mirror the offence of flashing that already exists in criminal law. We think that is the right way of doing it and not risking the overcriminalisation of those whom noble Lords would not want to capture.
Contrary to some concerns that have been expressed, the onus is never on the victim to marshal evidence or prove the intent of the perpetrator. It is for the police and the Crown Prosecution Service when investigating the alleged offence or prosecuting the case in court. That is why we and the Law Commission consulted the police and the CPS extensively in bringing the offence forward.
By contrast, as I say, the consent-based approach is more likely to put onerous pressure on the victim by focusing the case on his or her behaviour and sexual history instead of the behaviour of the perpetrator. I know and can tell from the interjections that noble Lords still have some concerns or questions about this offence as drafted. I reassure them, as my noble friend Lady Morgan of Cotes urged, that we will be actively monitoring and reviewing the implementation of this offence, along with the Crown Prosecution Service and the police, to ensure that it is working effectively and bringing perpetrators to justice.
The noble Baroness, Lady Burt, also raised the importance of public engagement and education in this regard. As she may know, the Government have a long-term campaign to tackle violence against women and girls. The Enough campaign covers a range of online and offline forms of abuse, including cyberflashing. The campaign includes engaging with the public to deepen understanding of this offence. It focuses on educating young people about healthy relationships, on targeting perpetrators and on ensuring that victims of violence against women and girls can access support. Future phases of the Enough campaign will continue to highlight the abusive nature and unacceptability of these behaviours, and methods for people safely to challenge them.
In addition, in our tackling violence against women and girls strategy the Government have committed to invest £3 million better to understand what works to prevent violence against women and girls, to invest in high-quality, evidence-informed prevention projects, including in schools, aiming to educate and inform children and young people about violence against women and girls, healthy relationships and the consequences of abuse.
With that commitment to keep this under review—to ensure that it is working in the way that the Law Commission and the Government hope and expect it to—and with that explanation of the way we will be encouraging the public to know about the protections that are there through the law and more broadly, I hope noble Lords will be reassured and will not press their amendments.
Before the Minister sits down, I express my gratitude that he has indicated that my amendment would have some serious impact. I thank the noble Lord, Lord Clement-Jones, for saying that there should be some learning among men in the House and in wider society about what puts real fear in the hearts of women and how it affects how women conduct their lives. I thank those who said that some change is necessary.
We have to remember that this clause covers a threatening communications offence. I know that something is going to be said about the particular vulnerability of women and girls—the noble Baroness, Lady Morgan, mentioned it, and I am grateful for that—but this offence is not specific to one gender. It is a general offence that someone commits if a message they send conveys a threat of death or serious harm.
I reassure the noble Baroness, Lady Fox, that we are not talking about a slight—saying to a woman that she is ugly or something. This is not about insults but about serious threats. The business about it being reckless as to whether or not it is going to be carried out is vital. Clause 164(1)(c)(i) says an offence is committed if it is intended that an individual encountering the message would fear that the threat would be carried out. I would like to see added the words, “whether or not by the person sending the message”.
Just think of this in the Irish context of years gone by. If someone sent a message saying, “You should be kneecapped”, it is very clear that we would be talking about something that would put someone in terror and fear. It is a serious fear, so I am glad that this is supported by the Minister, and I hope we will progress it to the next stage.
My Lords, without wishing to disrupt the very good nature of this debate, I remind the House that the Companion advises against speaking more than once on Report, except for specific questions or points of elucidation.
None the less, I am grateful to the noble Baroness for her clarification and expansion of this point. I am glad that she is satisfied with the approach we have set out.
It is not specific to women; it is general.
Addressing the issue the noble Baroness has highlighted will protect all victims against people trying to evade the law, and I am grateful to her. We will bring forward an amendment at Third Reading.
My Lords, I will be incredibly brief because everything that needs to be said has been said at least twice. I am grateful to those who have taken the trouble to listen to what I had to say, and I am grateful to the Minister for his response. I beg leave to withdraw my amendment.
My Lords, I am grateful for the opportunity to continue some of the themes we touched on in the last group and the debate we have had throughout the passage of the Bill on the importance of tackling intimate image abuse. I shall introduce the government amendments in this group that will make a real difference to victims of this abhorrent behaviour.
Before starting, I take the opportunity again to thank the Law Commission for the work it has done in its review of the criminal law relating to the non-consensual taking, making and sharing of intimate images. I also thank my right honourable friend Dame Maria Miller, who has long campaigned for and championed the victims of online abuse. Her sterling efforts have contributed greatly to the Government’s approach and to the formulation of policy in this sensitive area, as well as to the reform of criminal law.
As we announced last November, we intend to bring forward a more expansive package of measures based on the Law Commission’s recommendations as soon as parliamentary time allows, but the Government agree with the need to take swift action. That is why we are bringing forward these amendments now, to deliver on the recommendations which fall within the scope of the Bill, thereby ensuring justice for victims sooner.
These amendments repeal the offence of disclosing private sexual photographs and films with intent to cause distress and replace it with four new sexual offences in the Sexual Offences Act 2003. The first is a base offence of sharing an intimate photograph or film without consent or reasonable belief in consent. This recognises that the sharing of such images, whatever the intent of the perpetrator, should be considered a criminal violation of the victim’s bodily autonomy.
The amendments create two more serious offences of sharing an intimate photograph or film without consent with intent to cause alarm, distress or humiliation, or for the purpose of obtaining sexual gratification. Offenders committing the latter offence may also be subject to notification requirements, commonly referred to as being on the sex offenders register. The amendments also create an offence of threatening to share an intimate image. These new sharing offences are based on the Law Commission’s recommended approach, which defines intimate photographs or films to include images which show or appear to show a person nude or partially nude, or which depict sexual or toileting activity. This will protect more victims than the current Section 33 offence, which protects only images of a private and sexual nature.
Finally, these clauses will, for the first time, make it a criminal offence to share a manufactured or so-called deepfake image of another person without his or her consent. This form of intimate image abuse is becoming more prevalent, and we want to send a clear message that it will not be tolerated.
By virtue of placing these offences in the Sexual Offences Act 2003, we are also extending the current special measures to these offences, so that victims can benefit from them in court, and from anonymity provisions, which are so important when something so intimate has been shared without consent. This is only the first stage in our reform of the law in this area. We are committed to introducing additional changes, giving effect to further recommendations of the Law Commission’s report which are beyond the scope of the Bill, when parliamentary time allows.
I hope that noble Lords from across your Lordships’ House will agree that these amendments represent an important step forward in tackling intimate image abuse and protecting victims. I commend them to the House, and I beg to move.
My Lords, I welcome these new offences. From my professional experience, I know that what came to be known as “sextortion” created some of the most distressing cases you could experience, where an individual would obtain intimate images, often by deception, and then use them to make threats. This is where a social network is particularly challenging; it enables people to access a network of all the family and friends of an individual whose photo they now hold and to threaten to distribute it to their nearest and dearest. This affects men and women; many of the victims were men who were honey-potted into sharing intimate images and in the worst cases it led to suicide. It was not uncommon that people would feel that there was no way out; the threat was so severe that they would take their own lives. It is extremely welcome that we are doing something about it, and making it more obvious to anyone who is thinking about committing this kind of offence that they run the risk of criminal prosecution.
I have a few specific questions. The first is on the definitions in proposed new Section 66D, inserted by government Amendment 8, where the Government are trying to define what “intimate” or “nudity” represents. This takes me back again to my professional experience of going through slide decks and trying to decide what was on the right or wrong side of a nudity policy line. I will not go into the detail of everything it said, not least because I keep noticing younger people in the audience here, but I will leave you with the thought that you ended up looking at images that involved typically fishnets, in the case of women, and socks, in the case of men—I will leave the rest to your Lordships’ imaginations to determine at what point someone has gone from being clothed to nude. I can see in this amendment that the courts are going to have to deal with the same issues.
The serious point is that, where there is alignment between platform policies, definitions and what we do not want to be distributed, that is extremely helpful, because it then means that if someone does try to put an intimate image out across one of the major platforms, the platform does not have to ask whether there was consent. They can just say that it is in breach of their policy and take it down. It actually has quite a beneficial effect on slowing transmission.
The other point that comes out of that is that some of these questions of intimacy are quite culturally subjective. In some cultures, even a swimsuit photo could be used to cause humiliation and distress. I know this is extremely difficult; we do not want to be overly censorious but, at the same time, we do not want to leave people exposed to threats, and if you come from a culture where a swimsuit photo would be a threat, the definitions may not work for you. So I hope that, as we go through this, there will be a continued dialogue between experts in the platforms who have to deal with these questions and people working on the criminal offence side. To the extent that we can achieve it, there should be alignment and the message should go out that if you are thinking of distributing an image like this, you run the risk of being censored by the platforms but also of running into a criminal prosecution. That is on the mechanics of making it work.
My Lords, I am grateful to the Minister for introducing this suite of government amendments. From these Benches we welcome them. From the nature of the debate, this seems to be very much a work in progress. I wish the Minister well as he and the Justice Minister continue to pick their way through a route to get us to where we need to be. I too thank the Law Commission, Dame Maria Miller MP and so many other campaigners who, as noble Lords have said, have got us to this important point.
However, as I am sure is recognised, with the best of intentions, the government amendments still leave some areas that are as yet unresolved, particularly on sharing images with others: matters such as revenge porn and sending unwanted pictures on dating apps. There are areas still to be explored. The Minister and the Justice Minister said in a letter that, when parliamentary time allows, a broader package of offences will be brought forward. I realise that the Minister cannot be precise, but I would appreciate some sense of urgency or otherwise in terms of parliamentary time and when that might be.
We are only just starting to understand the impact of, for example, artificial intelligence, which we are about to come on to. That will be relevant in this regard too. We all understand that this is a bit of a moveable feast. The test will be whether this works. Can the Minister say a bit more about how this suite of measures will be kept under review and, in so doing, will the Government be looking at keeping an eye on the number of charges that are brought? How will this be reported to the House?
In line with this, will there be some consideration of the points that were raised in the previous group? I refer particularly to the issues raised in the amendments tabled by the noble Baroness, Lady Burt, especially where there may not be the intent, or the means, to obtain sexual gratification. They might be about “having a bit of a laugh”, as the noble Baroness said—which might be funny to some but really not funny to others.
In welcoming this, I hope that the Minister will indicate that this is just one step along the way and when we will see further steps.
I am happy to respond clearly to that. As my right honourable friend Edward Argar MP and I said in our letter, this is just the first step towards implementing the changes which the Law Commission has recommended and which we agree are needed. We will implement a broader package of offences, covering, for instance, the taking of intimate images without consent, which were also part of the Law Commission’s report. The parameters of this Bill limit what we can do now. As I said in my opening remarks, we want to bring those forward now so that we can provide protections for victims in all the ways that the Bill gives us scope to do. We will bring forward further provisions when parliamentary time allows. The noble Baroness will understand that I cannot pre-empt when that is, although if we make good progress on the Bill, parliamentary time may allow for it sooner.
The noble Baroness also asked about our review. We will certainly take into account the number of prosecutions and charges that are brought. That is always part of our consideration of criminal law, but I am happy to reassure her that this will be the case here. These are new offences, and we want to make sure that they are leading to prosecutions to deter people from doing it.
The noble Lord, Lord Allan of Hallam, asked whether images will include those shared on virtual reality platforms and in other novel ways. As he knows, the Bill is written in a technologically neutral way to try to be future-proof and capture those technologies which have not yet been invented. I mentioned deepfakes in my opening remarks, which we can envisage. An image will be included on whatever platform it is shared, if it appears to be a photograph or film—that is to say, if it is photo-real. I hope that reassures him.
If the Minister has time, can he actually direct us to that, because it is important that we are clear that it really is captured?
In the amendments, if I can, I will. In the meantime, I reassure my noble friend Lady Morgan of Cotes that, as I said in opening, placing these offences in the Sexual Offences Act means that we are also extending the current special measures provisions to these offences, as we heard in our debate on the last group, so that victims can benefit from those in court. The same applies to anonymity provisions, which are so important when something so intimate has been shared without someone’s consent.
I promised in the previous group to outline the difference in the consent basis between this offence and the cyberflashing offence. Both are abhorrent behaviours which need to be addressed in criminal law. Although the levels of harm and distress may be the same in each case, the Law Commission recommended different approaches to take into account the different actions of the perpetrator in each offence. Sharing an intimate image of somebody without their consent is, in and of itself, wrongful, and a violation of their bodily privacy and sexual autonomy. Sending a genital image without the consent of the recipient is not, in and of itself, wrongful; take, for instance, the example I gave in the previous debate of an artistic performance, or a photograph which depicts a naked protester. If such an image were sent without the consent of the recipient, it would not always or necessarily be harmful. This is an issue which the Law Commission looked at in some detail.
The criminal law must take the culpability of the perpetrator into account. I reassure noble Lords that both we and the Law Commission have looked at these offences considerably, working with the police and prosecutors in doing so. We are confident that the Bill provides the comprehensive protection for victims that we all want to see, including in situations where a perpetrator may claim that it was just a joke.
The terms “photograph” and “film” are defined in proposed new Section 66D(5). That refers to the definition in new Section 66A, which refers to an image which is made or altered in any way
“which appears to be a photograph or film”.
That is where the point I make about photo-reality is captured.
The noble Baroness, Lady Kidron, is right to highlight that this is a matter not just for the criminal law. As we discussed on the previous group, it is also a matter for public education, so that young people and users of any age are aware of the legal boundaries and legal issues at stake here. That is why we have the public education campaigns to which I alluded in the previous group.
I believe I misspoke when I asked my question. I referred to under-18s. Of course, if they are under 18 then it is child sexual abuse. I meant someone under the age of 18 with an adult image. I say that for the record.
If the noble Baroness misspoke, I understood what she intended. I knew what she was getting at.
With that, I hope noble Lords will be content not to press their amendments and that they will support the government amendments.
(1 year, 5 months ago)
Lords Chamber
My Lords, the Government are committed to protecting children against accessing pornography online. As technology evolves, it is important that the regulatory framework introduced by the Bill keeps pace with emerging risks to children and exposure to pornography in new forms, such as generative artificial intelligence.
Part 5 of the Bill has been designed to be future-proof, and we assess that it would already capture AI-generated pornography. Our Amendments 206 and 209 will put beyond doubt that content is “provider pornographic content” where it is published or displayed on a Part 5 service by means of an automated tool or algorithm, such as a generative AI bot, made available on the service by a provider. Amendments 285 and 293 make clear that the definition of an automated tool includes a bot. Amendment 276 clarifies the definition of a provider of a Part 5 service, to make clear that a person who controls an AI bot that generates pornography can be regarded as the provider of a service.
Overall, our amendments provide important certainty for users, providers and Ofcom on the services and content in scope of the Part 5 duties. This will ensure that the new, robust duties for Part 5 providers to use age verification or age estimation to prevent children accessing provider pornographic content will also extend to AI-generated pornography. I beg to move.
My Lords, the noble Baroness, Lady Kidron, has unfortunately been briefly detained. If you are surprised to see me standing up, it is because I am picking up for her. I start by welcoming these amendments. I am grateful for the reaction to the thought-provoking debate that we had in Committee. I would like to ask a couple of questions just to probe the impact around the edges.
Amendment 27 looks as if it implies that purely content-generating machine-learning or AI bots could be excluded from the scope of the Bill, rather than included, which is the opposite of what we were hoping to achieve. That may be us failing to understand the detail of this large body of different amendments, but I would welcome my noble friend the Minister’s response to make sure that in Amendment 27 we are not excluding harm that could be generated by some form of AI or machine-learning instrument.
Maybe I can give my noble friend the Minister an example of what we are worried about. This is a recent scenario that noble Lords may have seen in the news, of a 15 year-old who asked, “How do I have sex with a 30 year-old?”. The answer was given in forensic detail, with no reference to the fact that it would in fact be statutory rape. Would the regulated service, or the owner of the regulated service that generated that answer, be included or excluded as a result of Amendment 27? That may be my misunderstanding.
This group is on AI-generated pornography. My friend, the noble Baroness, Lady Kidron, and I are both very concerned that it is not just about pornography, and that we should make sure that AI is included in the Bill. Specifically, many of us with teenage children will now be learning how to navigate the Snap AI bot. Would harm generated by that bot be captured in these amendments, or is it only content that is entirely pornographic? I hope that my noble friend the Minister can clarify both those points, so that we can support all these amendments.
My Lords, this has been a short but important debate and I am grateful to noble Lords for their broad support for the amendments here and for their questions. These amendments will ensure that services on which providers control a generative tool, such as a generative AI bot, are in scope of Part 5 of the Bill. This will ensure that children are protected from any AI-generated pornographic content published or displayed by provider-controlled generative bots. These changes will not affect the status of any non-pornographic AI-generated content, or AI-generated content shared by users.
We are making a minor change to definitions in Part 3 to ensure that comments or reviews on content generated by a provider-controlled artificial intelligence source are not regulated as user-generated content. This is consistent with how the Bill treats comments and reviews on other provider content. These amendments do not have any broader impact on the treatment of bots by Part 3 of the Bill’s regime beyond the issue of comments and reviews. The basis on which a bot will be treated as a user, for example, remains unchanged.
I am grateful to the noble Lord, Lord Clement-Jones, for degrouping his Amendment 152A so that I can come back more fully on it in a later group and I am grateful for the way he spoke about it in advance. I am grateful too for my noble friend Lady Harding’s question. These amendments will ensure that providers which control a generative tool on a service, such as a generative AI bot, are in scope of Part 5 of the Bill. A text-only generative AI bot would not be in scope of Part 5. It is important that we focus on areas which pose the greatest risk of harm to children. There is an exemption in Part 5 for text-based provider pornographic content because of the limited risks posed by published pornographic content. This is consistent with the approach of Part 3 of the Digital Economy Act 2017 and its provisions to protect children from commercial online pornography, which did not include text-based content in scope.
The right reverend Prelate the Bishop of Oxford is right to ask whether we think this is enough. These changes certainly help. The fact that the Bill is written in a technology-neutral way will help us to future-proof it but, as we have heard throughout the passage of the Bill, we all know that this area of work will need constant examination and scrutiny. That is why the Bill is subject to the post-Royal Assent review and scrutiny that it is, and why we are grateful for the attention that noble Lords and Members of Parliament in the other place have already given to ensuring that it delivers on what we want to see. I believe these amendments, which put beyond doubt important provisions relating to generative AI, are a helpful addition, and I beg to move.
My Lords, much like the noble Lord, Lord Clement-Jones, I started off being quite certain I knew what to say about these amendments. I even had some notes—unusual for me, I know—but I had to throw them away, which I always do with my notes, because the arguments have been persuasive. That is exactly why we are here in Parliament discussing things: to try to reach common solutions to difficult problems.
We started with a challenge to the Minister to answer questions about scope, exemptions and discretion in relation to a named service—Wikipedia. However, as the debate went on, we came across the uncomfortable feeling that, having got so far into the Bill and agreed a lot of amendments today improving it, we are still coming up against quite stubborn issues that do not fit neatly into the categorisation and structures that we have. We do not seem to have the right tools to answer the difficult questions before us today, let alone the myriad questions that will come up as the technology advances and new services come in. Why have we not already got solutions to the problems raised by Amendments 281, 281A and 281B?
There is also the rather difficult idea of dark patterns, raised by the noble Lord, Lord Russell, which we need to filter into our thinking. Why does that not fit into what we already have? And why are we still worried about Wikipedia, a service for the public good? It clearly has risks in it and is sometimes capable of making terrible mistakes, but it is definitely a good thing, and it should not be threatened by having to conform to a structure and a system designed to deal with some of the biggest and most egregious companies that push stuff at us in the way we have talked about.
I have a series of questions which I do not have the answers to. I am looking forward to the Minister riding to my aid on a white charger of enormous proportions and great skill which will take us out without having to fall over any fences.
If I may, I suggest to the Minister a couple of things. First, we are stuck on the word “content”. We will come back to that in the future, as we still have an outstanding problem about exactly where the Bill sets it. Time and again in discussions with the Bill team and with Ministers we have been led back to the question of where the content problem lies and where the harms relate to that, but this little debate has shown beyond doubt that harm can occur independent of and separate from content. We must have a solution to that, and I hope it will be quick.
Secondly, when approaching anybody or anything or any business or any charity that is being considered in scope for this Bill, we will not get there if we are looking only at the question of its size and its reach. We have to look at the risks it causes, and we have to drill down hard into what risks we are trying to deal with using our armoury as we approach these companies, because that is what matters to the children, vulnerable people and adults who would suffer otherwise, and not the question of whether or not these companies are big or small. I think there are solutions to that and we will get there, but, when he comes to respond, the Minister needs to demonstrate to us that he is still willing to listen and think again about one or two issues. I look forward to further discussions with him.
I am grateful to noble Lords for their contributions during this debate. I am sympathetic to arguments that we must avoid imposing disproportionate burdens on regulated services, and particularly that the Bill should not inhibit services from providing valuable information which is of benefit to the public. However, I want to be clear that that is why the Bill has been designed in the way that it has. It has a broad scope in order to capture a range of services, but it has exemptions and categorisations built into it. The alternative would be a narrow scope, which would be more likely inadvertently to exempt risky sites or to displace harm on to services which we would find are out of scope of the Bill. I will disappoint noble Lords by saying that I cannot accept their amendments in this group but will seek to address the concerns that they have raised through them.
The noble Lord, Lord Allan, helpfully asked me three questions at the outset, to which the answers are yes, no and maybe. Yes, Wikipedia and OpenStreetMap will be in scope of the Bill because they allow users to interact online; no, we do not believe that they would fall under any of the current exemptions in the Bill; and the maybe is that, while Ofcom does not have the discretion to exempt services, the Secretary of State can create additional exemptions for further categories of service if she sees fit.
I must also say maybe to my noble friend Lord Moylan on his point about Wikipedia—and with good reason. Wikipedia, as I have just explained, is in scope of the Bill and is not subject to any of its exemptions. I cannot say how it will be categorised, because that is based on an assessment made by the independent regulator, but I reassure my noble friend that it is not the regulator but the Secretary of State who will set the categorisation thresholds through secondary legislation; that is to say, a member of the democratically elected Government, accountable to Parliament, through legislation laid before that Parliament. It will then be for Ofcom to designate services based on whether or not they meet those thresholds.
It would be wrong—indeed, nigh on impossible—for me to second-guess that designation process from the Dispatch Box. In many cases it is inherently a complex and nuanced matter since, as my noble friend Lady Harding said, many services change over time. We want to keep the Bill’s provisions flexible as services change what they do and new services are invented.
I would just like to finish my thought on Wikipedia. Noble Lords are right to mention it and to highlight the great work that it does. My honourable friend the Minister for Technology and the Digital Economy, Paul Scully, met Wikipedia yesterday to discuss its concerns about the Bill. He explained that the requirements for platforms in this legislation will be proportionate to the risk of harm, and that as such we do not expect the requirements for Wikipedia to be unduly burdensome.
I am computing the various pieces of information that have just been given, and I hope the Minister can clarify whether I have understood them correctly. These services will be in scope as user-to-user services and do not have an exemption, as he said. The Secretary of State will write a piece of secondary legislation that will say, “This will make you a category 1 service”—or a category 2A or 2B service—but, within that, there could be text that has the effect that Wikipedia is in none of those categories. So it and services like it could be entirely exempt from the framework by virtue of that secondary legislation. Is that a correct interpretation of what he said?
The Secretary of State could create further exemptions but would have to bring those before Parliament for it to scrutinise. That is why there is a “maybe” in answer to his third question in relation to any service. It is important for the legislation to be future-proofed that the Secretary of State has the power to bring further categorisations before Parliament for it to discuss and scrutinise.
My Lords, I will keep pressing this point because it is quite important, particularly in the context of the point made by the noble Baroness, Lady Kidron, about categorisation, which we will debate later. There is a big difference when it comes to Schedule 11, which defines the categorisation scheme: whether in the normal run of business we might create an exemption in the categorisation secondary legislation, or whether it would be the Secretary of State coming back with one of those exceptional powers that the Minister knows we do not like. He could almost be making a case for why the Secretary of State has to have these exceptional powers. We would be much less comfortable with that than if the Schedule 11 categorisation piece effectively allowed another class to be created, rather than it being an exceptional Secretary of State power.
I do not think that it is, but it will be helpful to have a debate on categorisation later on Report, when we reach Amendment 245, to probe this further. It is not possible for me to say that a particular service will certainly be categorised one way or another, because that would give it carte blanche and we do not know how it may change in the future—estimable though I may think it is at present. That is the difficulty of setting the precise parameters that the noble Baroness, Lady Fox, sought in her contribution. We are setting broad parameters, with exemptions and categorisations, so that the burdens are not unduly heavy on services which do not cause us concern, and with the proviso for the Secretary of State to bring further exemptions before Parliament, as she sees fit, so that Parliament can continue the debate we are having now.
The noble Baroness, Lady Kidron, in her earlier speech, asked about the functionalities of user-to-user services. The definitions of user-to-user services are broad and flexible, to capture new and changing services. If a service has both user-to-user functionality and a search engine, it will be considered a combined service, with respective duties for the user-to-user services which form part of its service and search duties in relation to the search engine.
I reassure my noble friend Lady Harding of Winscombe that the Bill will not impose a disproportionate burden on services, nor will it impede the public’s access to valuable content. All duties on services are proportionate to the risk of harm and, crucially, to the capacity of companies. The Bill’s proportionate design means that low-risk services will have to put in place only measures which reflect the risk of harm to their users. Ofcom’s guidance and codes of practice will clearly set out how these services can comply with their duties. We expect that it will set out a range of measures and steps for different types of services.
Moreover, the Bill already provides for wholesale exemptions for low-risk services and for Ofcom to exempt in-scope services from requirements such as record-keeping. That will ensure that there are no undue burdens on such services. I am grateful for my noble friend’s recognition, echoed by my noble friend Lady Stowell of Beeston, that “non-profit” does not mean “not harmful” and that there can be non-commercial services which may pose harms to users. That is why it is important that there is discretion for proper assessment.
Amendment 30 seeks to allow Ofcom to withdraw the exemptions listed in Schedule 1 from the Bill. I am very grateful to my noble friend Lord Moylan for his time earlier this week to discuss his amendment and others. We have looked at it, as I promised we would, but I am afraid that we do not think that it would be appropriate for Ofcom to have this considerable power—my noble friend is already concerned that the regulator has too much.
The Bill recognises that it may be necessary to remove certain exemptions if there is an increased risk of harm from particular types of services. That is why the Bill gives the Secretary of State the power to remove particular exemptions, such as those related to services which have limited user-to-user functionality and those which offer one-to-one live aural communications. These types of services have been carefully selected as areas where future changes in user behaviour could necessitate the repeal or amendment of an exemption in Schedule 1. This power is intentionally limited to only these types of services, meaning that the Secretary of State will not be able to remove exemptions for comments on recognised news publishers’ sites. That is in recognition of the Government’s commitment to media freedom and public debate. It would not be right for Ofcom to have the power to repeal those exemptions.
Amendments 281 and 281B, in the name of the noble Lord, Lord Russell of Liverpool, are designed to ensure that the lists of features under the definition of “functionality” in the Bill apply to all regulated services. Amendment 281A aims to add additional examples of potentially addictive functionalities to the Bill’s existing list of features which constitute a “functionality”. I reassure him and other noble Lords that the list of functionalities in the Bill is non-exhaustive. There may be other functionalities which could cause harm to users and which services will need to consider as part of their risk assessment duties. For example, if a provider’s risk assessment identifies that there are functionalities which risk causing significant harm to an appreciable number of children on its service, the Bill will require the provider to put in place measures to mitigate and manage that risk.
He and other noble Lords spoke about the need for safety by design. I can reassure them that this is already built into the framework of the Bill, which recognises how functionalities, including many of the things mentioned today, can increase the risk of harm to users, and will encourage the safe design of platforms.
Amendments 281 and 281B have the effect that regulated services would need to consider the risk of harm of functionalities that are not relevant for their kind of service. For example, sharing content with other users is a functionality of user-to-user services, which is not as relevant for search services. The Bill already outlines specific features that both user-to-user and search services should consider, which are the most relevant functionalities for those types of service. Considering these functionalities would create an unnecessary burden for regulated services which would detract from where their efforts can best be focused. That is why I am afraid I cannot accept the amendments that have been tabled.
My Lords, surely it is the role of the regulators to look at functionalities of this kind. The Minister seemed to be saying that it would be an undue burden on the regulator. Is not that exactly what we are meant to be legislating about at this point?
Perhaps I was not as clear as I could or should have been. The regulator will set out in guidance the duties that fall on the businesses. We do not want the burden on the business to be unduly heavy, but there is an important role for Ofcom here. I will perhaps check—
But these functionalities are a part of their business model, are they not?
Hence Ofcom will make the assessments about categorisation based on that. Maybe I am missing the noble Lord’s point.
I think we may need further discussions on the amendment from the noble Lord, Lord Russell.
I will check what I said but I hope that I have set out why we have taken the approach that we have with the broad scope and the exemptions and categorisations that are contained in it. With that, I urge the noble Lord to withdraw his amendment.
My Lords, that was a very useful debate. I appreciate the Minister’s response and his “yes, no, maybe” succinctness, but I think he has left us all more worried than when the debate started. My noble friend Lord Clement-Jones tied it together nicely. What we want is for the regulator to be focused on the greatest areas of citizen risk. If there are risks that are missing, or things that we will be asking the regulator to do that are a complete waste of time because they are low risk, then we have a problem. We highlighted both those areas. The noble Lord, Lord Russell, rightly highlighted that we are not content with just “content” as the primary focus of the legislation; it is about a lot more than content. In my amendment and those by the noble Lord, Lord Moylan, we are extremely worried—and remain so—that the Bill creates a framework that will trap Wikipedia and services like it, without that being our primary intention. We certainly will come back to this in later groups; I will not seek to press the amendment now, because there is a lot we all need to digest. However, at the end of this process, we want to get to the point where the regulator is focused on things that are high risk to the citizen and not wasting time on services that are very low risk. With that, I beg leave to withdraw my amendment.
My Lords, the government amendments in this group relate to the categories of primary priority and priority content that is harmful to children.
Children must be protected from the most harmful online content and activity. As I set out in Committee, the Government have listened to concerns about designating primary priority and priority categories of content in secondary legislation and the need to protect children from harm as swiftly as possible. We have therefore tabled amendments to set out these categories in the Bill. I am grateful for the input from across your Lordships’ House in finalising the scope of these categories.
While it is important to be clear about the kinds of content that pose a risk of harm to children, I acknowledge what many noble Lords raised during our debates in Committee, which is that protecting children from online harm is not just about content. That is why the legislation takes a systems and processes approach to tackling the risk of harm. User-to-user and search service providers will have to undertake comprehensive, mandatory risk assessments of their services and consider how factors such as the design and operation of a service and its features and functionalities may increase the risk of harm to children. Providers must then put in place measures to manage and mitigate these risks, as well as systems and processes to prevent and protect children from encountering the categories of harmful content.
We have also listened to concerns about cumulative harm. In response to this, the Government have tabled amendments to Clause 209 to make it explicit that cumulative harm is addressed. This includes cumulative harm that results from algorithms bombarding a user with content, or where combinations of functionality cumulatively drive up the risk of harm. These amendments will be considered in more detail under a later group of amendments, but they are important context for this discussion.
I turn to the government amendments, starting with Amendment 171, which designates four categories of primary priority content. First, pornographic content has been defined in the same way as in Part 5—to give consistent and comprehensive protection for children, regardless of the type of service on which the pornographic content appears. The other three categories capture content which encourages, promotes or provides instructions for suicide, self-harm or eating disorders. This will cover, for example, glamorising or detailing methods for carrying out these dangerous activities. Designating these as primary priority content will ensure that the most stringent child safety duties apply.
Government Amendment 172 designates six categories of priority content. Providers will be required to protect children from encountering a wide range of harmful violent content, which includes depictions of serious acts of violence or graphic injury against a person or animal, and the encouragement and promotion of serious violence, such as content glamorising violent acts. Providers will also be required to protect children from encountering abusive and hateful content, such as legal forms of racism and homophobia, and bullying content, which sadly many children experience online.
The Government have heard concerns from the noble Baronesses, Lady Kidron and Lady Finlay of Llandaff, about extremely dangerous activities being pushed to children as stunts, and content that can be harmful to the health of children, including inaccurate health advice and false narratives. As such, we are designating content that encourages dangerous stunts and challenges as a category of priority content, and content which encourages the ingestion or inhalation of, or exposure to, harmful substances, such as harmful abortion methods designed to be taken by a person without medical supervision.
Amendment 174, from the noble Baroness, Lady Kidron, seeks to add “mis- and disinformation” and “sexualised content” to the list of priority content. On the first of these, I reiterate what I said in Committee, which is that the Bill will protect children from harmful misinformation and disinformation where it intersects with named categories of primary priority or priority harmful content—for example, an online challenge which is promoted to children on the basis of misinformation or disinformation, or abusive content with a foundation in misinformation or disinformation. However, I did not commit to misinformation and disinformation forming its own stand-alone category of priority harmful content, which could be largely duplicative of the categories that we have already included in the Bill and risks capturing a broad range of legitimate content.
We have already addressed key concerns related to misinformation and disinformation content which presents the greatest risk to children by including content which encourages the ingestion or inhalation of, or exposure to, harmful substances to the list of priority categories. However, the term “mis- and disinformation”, as proposed by Amendment 174, in its breadth and subjectivity risks inadvertently capturing a wide range of content resulting in disproportionate, excessive censorship of the content children see online, including in areas of legitimate debate. The harm arising from misinformation or disinformation usually arises from the context or purpose of the content, rather than the mere fact that it is untrue. Our balanced approach ensures that children are protected from the most prevalent and concerning harms associated with misinformation and disinformation.
My Lords, we spent a lot of time in Committee raising concerns about how pornography and age verification were going to operate across all parts of the Bill. I have heard what the Minister has said in relation to this group, priority harms to children, which I believe is one of the most important groups under discussion in the Bill. I agree that children must be protected from the most harmful content online and offline.
I am grateful to the Government for having listened carefully to the arguments put forward by the House in this regard and commend the Minister for all the work he and his team have done since then. I also commend the noble Lord, Lord Bethell. He and I have been in some discussion between Committee and now in relation to these amendments.
In Committee, I argued for several changes to the Bill which span three groups of amendments. One of my concerns was that pornography should be named as a harm in the Bill. I welcome the Government’s Amendment 171, which names pornography as primary priority content. I also support Amendment 174 in the name of the noble Baroness, Lady Kidron. She is absolutely right that sexualised content can be harmful to children if not age appropriate and, in that regard, before she even speaks, I ask the Minister to reconsider his views on this amendment and to accept it.
Within this group are the amendments which move the definition of “pornographic content” from Part 5 to Clause 211. In that context, I welcome the Government’s announcement on Monday about a review of the regulation, legislation and enforcement of pornography offences.
In Committee, your Lordships were very clear that there needed to be a consistent approach across the Bill to the regulation of pornography. I am in agreement with the amendments tabled in Committee to ensure that consistency applies across all media. In this regard, I thank the noble Baroness, Lady Benjamin, for her persistence in raising this issue. I also thank my colleagues on the Opposition Front Bench, the noble Lord, Lord Stevenson, and the noble Baroness, Lady Merron.
I appreciate that the Government made this announcement only three days ago, but I hope the Minister will set out a timetable for publishing the terms of reference and details of how this review will take place. The review is too important to disappear into the long grass over the Summer Recess, never to be heard of again, so if he is unable to answer my question today, will he commit to writing to your Lordships with the timeframe before the House rises for the summer? Will he consider the active involvement of external groups in this review, as much expertise lies outside government in this area? In that regard, I commend CARE, CEASE and Barnardo’s for all their input into the debates on the Bill.
My Lords, I think the noble Baroness’s comments relate to the next group of amendments, on pornography. She might have skipped ahead, but I am grateful for the additional thinking time to respond to her questions. Perhaps she will save the rest of her remarks for that group.
I thank the Minister for that. In conclusion, I hope he will reflect on those issues and come back, maybe at the end of the next group. I remind the House that in February the APPG on Commercial Sexual Exploitation, in its inquiry on pornography, recommended that the regulation of pornography should be consistent across all online platforms and between the online and offline spheres. I hope we can incorporate the voices I have already mentioned in the NGO sphere in order to assist the Government and both Houses in ensuring that we regulate the online platforms and that children are protected from any harms that may arise.
My Lords, like the noble Baroness, Lady Harding, I want to make it very clear that I think the House as a whole welcomes the change of heart by the Government to ensure that we have in the Bill the two sides of the question of content that will be harmful to children. We should not walk away from that. We made a big thing of this in Committee. The Government listened and we have now got it. The fact that we do not like it—or do not like bits of it—is the price we pay for having achieved something which is, probably on balance, good.
The shock comes from trying to work out why it is written the way it is, and how difficult it is to see what it will mean in practice when companies working to Ofcom’s instructions will take this and make this happen in practice. That lies behind, I think I am right in saying, the need for the addition to Amendment 172 from the noble Baroness, Lady Kidron, which I have signed, along with the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford. Both of them have spoken well in support of it and I do not need to repeat those points.
Somehow, in getting the good of Amendments 171 and 172, we have lost some of the flexibility that we also want. The flexibility does exist, because the Government have retained powers to amend and change both primary priority content and priority content that is harmful to children. Therefore, subject to approval through the secondary legislation process, this House will continue to have a concern about that—indeed, both Houses will.
Somehow, however, that does not quite get to where the concern comes from. The concern starts with the good points made by the noble Lord, Lord Russell—I should have caught him in the gap and said that I had already mentioned that we had been together at the meeting. He found some additional points to make which I hope will also be useful to future discussion; I am glad he has done that. He makes a very good point about cultural context and the work that needs to go on—which we have talked about in earlier debates—to make this live: in other words, to make the people responsible for delivering this through Ofcom, and those delivering it through companies, understand the wider context. In that sense, we clearly need the misinformation and disinformation side of it; that is part and parcel of the problems we have. But more important even than that is the need to address the functionality issues. We have come back to that. This Bill is about risk. The process we will go through is about risk assessment: making sure that the risks are understood by those who deliver services, and that the penalties that follow a failure of the risk assessment process deliver the change we want to see in society.
However, it is not just about content. We keep saying that, but we do not see the changes around it. The best thing that could happen today would be if the Minister, in responding, accepted that these clauses are good—“Tick, we like them”—but agreed that we should not finalise them until we have seen the other half of the picture: what are the other risks to which the users of the services we have referred to and discussed are exposed, through the systemic design processes that are designed to take them in different directions? It is only when we see the two together that we will have a proper answer to the concern.
I may have got this wrong, but the only person who can tell us is the Minister because he is the only one who really understands what is going on in the Bill. Am I not right in saying—I am going to say I am right; he will say no, I am not, but I am, aren’t I?—that we will get to Clauses 208 and 209, or the clauses that used to be 208 and 209, one of which deals with harms from content and the other deals with functionality? We may need to look at the way in which those are framed in order to come back and understand better how these lie and how they interact with that. I may have got the numbers wrong—the Minister is looking a bit puzzled, so I probably have—but the sense is that this will probably not come up until day 4. While I do not want to hold back the Bill, we may need to look at some of the issues that are hidden in the interstices of this set of amendments in order to make sure that the totality is better for those who have to use it.
My Lords, this has been a useful debate. As the noble Baroness, Lady Kidron, says, because I spoke first to move the government amendments, in effect I got my response in first to her Amendment 174, the only non-government amendment in the group. That is useful because it allows us to have a deeper debate on it.
The noble Baroness asked about the way that organisations such as the British Board of Film Classification already make assessments of sexualised content. However, the Bill’s requirement on service providers and the process that the BBFC takes to classify content are not really comparable. Services will have far less time and much more content to consider than the BBFC does, so they will not be able to take the same approach. The BBFC is able to take an extended time to consider maybe just one scene, one image or one conversation, and therefore can apply nuance to its assessments. That is not possible to do at the scale at which services will have to apply the child safety duties in the Bill. We therefore think there is a real risk that they would excessively apply those duties and adversely affect children’s rights online.
I know the noble Baroness and other noble Lords are rightly concerned with protecting rights to free expression and access to information online for children and for adults. It is important that we strike the right balance, which is what we have tried to do with the government amendments in this group.
To be clear, the point that I made about the BBFC was not to suggest a similar arrangement but to challenge the idea that we cannot categorise material of a sexualised nature. Building on the point made by the noble Lord, Lord Allan, perhaps we could think about it in terms of the amber light rather than the red light—in other words, something to think about.
I certainly will think about it, but the difficulty is the scale of the material and the speed with which we want these assessments to be made and that light to be lit, in order to make sure that people are properly protected.
My noble friend Lord Moylan asked about differing international terminology. In order for companies to operate in the United Kingdom they must have an understanding of the United Kingdom, including the English-language terms used in our legislation. He made a point about the Equality Act 2010. While the Bill uses the same language as that Act, this does not extend the Equality Act to this part of the Bill. In particular, it does not create a new offence.
The noble Baroness, Lady Fox, also mentioned the Equality Act when she asked about the phraseology relating to gender reassignment. We included this wording to ensure that the language used in the Bill matches Section 7(1) of the Equality Act 2010 and that gender reassignment has the same meaning in the Bill as it does in that legislation. As has been said by other noble Lords—
I clarify that what I said was aimed at protecting children. Somebody corrected me and asked, “Do you know that this says ‘abusive’?”—of course I do. What I suggested was that this is an area that is very contentious when we talk about introducing it to children. I am thinking about safeguarding children in this instance, not just copying and pasting a bit of an Act.
I take this opportunity to ask my noble friend the Minister a question; I want some clarity about this. Would an abusive comment about a particular religion—let us say a religion that practised cannibalism or a historical religion that sacrificed babies, as we know was the norm in Carthage—count as “priority harmful content”? I appreciate that we are mapping the language of the Equality Act, but are we creating a new offence of blasphemy in this Bill?
As was pointed out by others in the debate, the key provision in Amendment 172 is subsection (2) of the proposed new clause, which relates to:
“Content which is abusive and which targets any of the following characteristics”.
It must both be abusive and target the listed characteristics. It does not preclude legitimate debate about those things, but if it were abusive on the basis of those characteristics—rather akin to the debate we had in the previous group and the points raised by the noble Baroness, Lady Kennedy of The Shaws, about people making oblique threats, rather than targeting a particular person, by saying, “People of your characteristic should be abused in the following way”—it would be captured.
I will keep this short, because I know that everyone wants to get on. It would be said that it is abusive to misgender someone; in the context of what is going on in sixth forms and schools, I suggest that this is a problem. It has been suggested that showing pictures of the Prophet Muhammad in an RE lesson—these are real-life events that happen offline—is abusive. I am suggesting that it is not as simple as saying the word “abusive” a lot. In this area, there is a highly contentious and politicised arena that I want to end, but I think that this will exacerbate, not help, it.
My noble friend seemed to confirm what I said. If I wish to be abusive—in fact, I do wish to be abusive—about the Carthaginian religious practice of sacrificing babies to Moloch, and I were to do that in a way that came to the attention of children, would I be caught as having created “priority harmful content”? My noble friend appears to be saying yes.
Does my noble friend wish to do that and direct it at children?
With respect, it does not say “directed at children”. Of course, I am safe in expressing that abuse in this forum, but if I were to do it, it came to the attention of children and it were abusive—because I do wish to be abusive about that practice—would I have created “priority harmful content”, about which action would have to be taken?
May I attempt to assist the Minister? This is the “amber” point described by the noble Lord, Lord Allan: “priority content” is not the same as “primary priority content”. Priority content is our amber light. Even the most erudite and scholarly description of baby eating is not appropriate for five year-olds. We do not let it go into “Bod” or any of the other programmes we all grew up on. This is about an amber warning: that user-to-user services must have processes that enable them to assess the risk of priority content and primary priority content. It is not black and white, as my noble friend is suggesting; it is genuinely amber.
My Lords, we may be slipping back into a Committee-style conversation. My noble friend Lord Moylan rightly says that this is the first chance we have had to examine this provision, which is a concession wrung out of the Government in Committee. As the noble Lord, Lord Stevenson, says, sometimes that is the price your Lordships’ House pays for winning these concessions, but it is an important point to scrutinise in the way that my noble friend Lord Moylan and the noble Baroness, Lady Fox, have done.
I will try to reassure my noble friend and the noble Baroness. This relates to the definition of a characteristic with which we began our debates today. To be a characteristic it has to be possessed by a person; therefore, the content that is abusive and targets any of the characteristics has to be harmful to an individual to meet the definition of harm. Further, it has to be material that would come to the attention of children in the way that the noble Baronesses who kindly leapt to my defence and added some clarity have set out. So my noble friend would be able to continue to criticise the polytheistic religions of the past and their tendencies to his heart’s content, but there would be protections in place if what he was saying was causing harm to an individual—targeting them on the basis of their race, religion or any of those other characteristics—if that person was a child. That is what noble Lords wanted in Committee, and that is what the Government have brought forward.
My noble friend and others asked why mis- and disinformation were not named as their own category of priority harmful content to children. Countering mis- and disinformation where it intersects with the named categories of primary priority or priority harmful content, rather than as its own issue, will ensure that children are protected from the mis- and disinformation narratives that present the greatest risk of harm to them. We recognise that mis- and disinformation is a broad and cross-cutting issue, and we therefore think the most appropriate response is to address directly the most prevalent and concerning harms associated with it; for example, dangerous challenges and hoax health advice for children to self-administer harmful substances. I assure noble Lords that any further harmful mis- and disinformation content will be captured as non-designated content where it presents a material risk of significant harm to an appreciable number of children.
In addition, the expert advisory committee on mis- and disinformation, established by Ofcom under the Bill, will have a wide remit in advising on the challenges of mis- and disinformation and how best to tackle them, including how they relate to children. Noble Lords may also have seen that the Government have recently tabled amendments to update Ofcom’s statutory media literacy duty. Ofcom will now be required to prioritise users’ awareness of and resilience to misinformation and disinformation online. This will include children and their awareness of and resilience to mis- and disinformation.
My noble friend Lady Harding of Winscombe talked about commercial harms. Harms exacerbated by the design and operation of a platform—that is, their commercial models—are covered in the Bill already through the risk assessment and safety duties. Financial harm, as used in government Amendment 237, is dealt with by a separate legal framework, including the Consumer Protection from Unfair Trading Regulations. This exemption ensures that there is no regulatory overlap.
The noble Lord, Lord Russell of Liverpool, elaborated on remarks made earlier by the noble Lord, Lord Stevenson of Balmacara, about their meeting looking at the incel movement, if it can be called that. I assure the noble Lord and others that Ofcom has a review and report duty and will be required to stay on top of changes in the online harms landscape and report to government on whether it recommends changes to the designated categories of content because of the emerging risks that it sees.
The noble Baroness, Lady Kidron, anticipated the debate we will have on Monday about functionalities and content. I am grateful to her for putting her name to so many of the amendments that we have brought forward. We will continue the discussions that we have been having on this point ahead of the debate on Monday. I do not want to anticipate that now, but I undertake to carry on those discussions.
In closing, I reiterate what I know is the shared objective across your Lordships’ House—to protect children from harmful content and activity. That runs through all the government amendments in this group, which cover the main categories of harmful content and activity that, sadly, too many children encounter online every day. Putting them in primary legislation enables children to be swiftly protected from encountering them. I therefore hope that noble Lords will be heartened by the amendments that we have brought forward in response to the discussion we had in Committee.
(1 year, 5 months ago)
Lords Chamber
My Lords, as noble Lords will be aware, the Government removed the legal but harmful provisions from the Bill in another place, given concerns about freedom of expression. I know that many noble Lords would not have taken that approach, but I am grateful for their recognition of the will of the elected House in this regard as well as for their constructive contributions about ways of strengthening the Bill while continuing to respect that.
I am therefore glad to bring forward a package of amendments tabled in my name relating to adult safety. Among other things, these strengthen our existing approach to user empowerment and terms of service by rebalancing the power over the content adults see and interact with online, moving the choice away from unaccountable technology companies and towards individual users.
First, we are bringing forward a number of amendments, which I am pleased to say have the support of the Opposition Front Bench, to introduce a comprehensive duty on category 1 providers to carry out a full assessment of the incidence of user empowerment content on their services. The amendments will mean that platforms can be held to account by Ofcom and their users when they fail to assess the incidence of this kind of content on their services or when they fail to offer their users an appropriate ability to control whether or not they view it.
Amendments 19 to 21 and 26—I am grateful to noble Lords opposite for putting their names to them—will strengthen the user empowerment content duty. Category 1 providers will now need proactively to ask their registered adult users how they would like the control features to be applied. We believe that these amendments achieve two important aims that your Lordships have been seeking from these duties: first, they ensure that they are more visible for registered adult users; and, secondly, they offer better protection for young adult users.
Amendments 55 and 56, tabled by the noble Lord, Lord Clement-Jones, my noble friend Lord Moylan and the noble Baroness, Lady Fox of Buckley, seek to provide users with a choice over how the tools are applied for each category of content set out in Clause 12(10), (11) and (12). The legislation gives platforms the flexibility to decide what tools they offer in compliance with Clause 12(2). A blanket approach is unlikely to be consistent with the duty on category 1 services to have particular regard to the importance of protecting users’ freedom of expression when putting these features in place. Additionally, the measures that Ofcom will recommend in its code of practice must consider the impact on freedom of expression so are unlikely to be a blanket approach.
Amendments 58 and 63 would require providers to set and enforce consistent terms of service on how they identify the categories of content to which Clause 12(2) applies; and to apply the features to content only when they have reasonable grounds to infer that it is user empowerment content. I assure noble Lords that the Bill’s freedom of expression duties will prevent providers overapplying the features or adopting an inconsistent or capricious approach. If they do, Ofcom can take enforcement action.
Amendments 59, 64 and 181, tabled by the noble Lord, Lord Clement-Jones, seek to require that the user empowerment and user verification features are provided at no cost. I reassure the noble Lord that the effect of these amendments is already achieved by the drafting of Clause 12. Category 1 providers will be compliant with their duties only if they proactively ask all registered users whether or not they want to use the user empowerment content features, which would not be possible with a paywall. Amendment 181 is similar and applies to user verification. While the Bill does not specify that verification must be free of charge, category 1 providers can meet the duties in the Bill only by offering all adult users the option to verify themselves.
Turning to Amendment 204, tabled by the noble Baroness, Lady Finlay of Llandaff, I share her concern about the impact that self-harm and suicide content can have. However, as I said in Committee, the Bill goes a long way to provide protections for both children and adults from this content. First, it includes the new criminal offence of encouraging or assisting self-harm. This then feeds through into the Bill’s illegal content duties. Companies will be required to take down such content when it is reported to them by users.
Beyond the illegal content duties, there are specific protections in place for children. The Government have tabled amendments designating content that encourages, promotes or provides instructions for suicide or self-harm as a category of primary priority content, meaning that services will have to prevent children of all ages from encountering it. For adults, the Government listened to concerns and, as mentioned, have strengthened the user empowerment duties to make it easier for adult users to opt in to using the features by offering a forced choice. We have made a careful decision, however, to balance these protections with users’ right to freedom of expression and therefore cannot require platforms to treat legal content accessed by adults in a prescribed way. That is why, although I share the noble Baroness’s concerns about the type of content that she mentions, I cannot accept her amendment, and I hope that she will agree not to press it.
The Bill’s existing duties require category 1 platforms to offer users the ability to verify their identity. Clause 12 requires category 1 platforms to offer users the ability to filter out users who have not verified their identity. Amendment 183 from my noble friend Lord Moylan seeks to give Ofcom the discretion to decide when it is and is not proportionate for category 1 services to offer users the ability to verify their identity. We do not believe that these duties will be excessively burdensome, given that they will apply only to category 1 companies, which have the resources and capacity to offer such tools.
Amendment 182 would require platforms to offer users the option to make their verification status visible. The existing duty in Clause 57, in combination with the duty in Clause 12, will already provide significant protections for adults from anonymous abuse. Adult users will now be able to verify their own status and decide to interact only with other verified users, whether or not their status is visible. We do not believe that this amendment would provide additional protections.
The Government carefully considered mandating that all users display their verification status. While this might heighten some users’ safety, it would be detrimental to vulnerable users, who may need to remain anonymous for perfectly justifiable reasons. Further government amendments in my name will expand the types of information that Ofcom can require category 1, 2A and 2B providers to publish in their transparency reports in relation to user empowerment content.
Separately, but also related to transparency, government Amendments 189 and 202 make changes to Clause 67 and Schedule 8. These relate to category 1 providers’ duties to create clear and accessible terms of service and apply them consistently and transparently. Our amendments tighten these parts of the Bill so that all the terms through which providers might indicate that a certain type of content is not allowed on their service are captured by these duties.
I hope that noble Lords will therefore accept the Government amendments in this group and that my anticipatory remarks about their amendments will give them some food for thought as they make their contributions. I beg to move.
My Lords, I am happy to acknowledge and recognise what the Government did when they created the user empowerment duties to replace “legal but harmful”. I think they were trying to counter the dangers of over-paternalism and illiberalism in provisions that obliged providers to protect adult users from content that would allegedly cause them harm.
At least the new provisions brought into the Bill have a completely different philosophy. They enhance users’ freedom as individuals, allowing them to apply voluntary content filters and exercise freedom of choice, on the principle that adults can make decisions for themselves.
In case anyone panics, I am not making a philosophical speech. I am reminding the Government that that is what they said to us—to everybody—“We are getting rid of legal but harmful because we believe in this principle”. I am worried that some of the amendments seem to be trying to backtrack from that different basis of the Bill—and that more liberal philosophy—to go back to the old legal but harmful. I say to the noble Lord, Lord Allan of Hallam, that the cat is distinctly not dead.
The purpose of Amendment 56 is to try to ensure that providers also cannot thwart the purpose of Clause 12 and make it more censorious and paternalistic. I am not convinced that the Government needed to compromise on this as I think Amendment 60 just muddies the waters and fudges the important principle that the Government themselves originally established.
Amendment 56 says that the default must be no filtering at all. Then users have to make an active decision to switch on the filtering. The default is that you should be exposed to a full flow of ideas and, if you do not want that, you have to actively decide not to and say that you want a bowdlerised or sanitised version.
Amendment 56 takes it a bit further, in paragraph (b), and applies different levels of filtering to content of democratic importance and journalistic content. In the Bill itself, the Government accept the exceptional nature of those categories of content, and this just allows users to do the same and say, “No; I might want to filter some things out but bear in mind the exceptional importance of democratic and journalistic content”. I worry that the government amendments signal to users that certain ideas are dangerous and must be hidden. That is my big concern. In other words, the message is that they might be legal but they are harmful: that is what I think these amendments try to counter.
One of the things that worries me about the Bill is the danger of echo chambers. I know we are concentrating on harms, but I think echo chambers are harmful. I started today quite early at Blue Orchid at 55 Broadway with a big crowd of sixth formers involved in debating matters. I complimented Keir Starmer on his speech on the importance of oracy and encouraging young people to speak. I stressed to all the year 12 and year 13 young people that the important thing was that they spoke out but also that they listened to contrary opinions and got out of their safe spaces and echo chambers. They were debating very difficult topics such as commercial surrogacy, cancel culture and the risks of contact sports. I am saying all that to them and then I am thinking, “We have now got a piece of legislation that says you can filter out all the stuff you do not want to hear and create your own safe space”. So I just get anxious that we do not inadvertently encourage in the young—I know this is for all adults—that antidemocratic tendency to not want to hear what you do not want to hear, even when it would be good to hear as many opinions as possible.
I also want to press the Minister on the problem of filtering material that targets race, religion, sex, sexual orientation, disability and gender reassignment. I keep trying to raise the problem that it could lead to diverse philosophical views around those subjects also being removed by overzealous filtering. You might think that you know what you are asking to be filtered out. If you say you want to filter out material that is anti-religion, you might not mean that you do not want any debates on religious tolerance. For example, there was that major controversy over the film “The Lady of Heaven”. I know the Minister was interested, as I was, in the dangers of censorship in relation to that. You would not want to find that, because you had said, “Don’t target me for my religion”, you were unable to access that debate.
I think there is a danger that we are handing a lot of power to filterers to make filtering decisions based on their values when we are not clear what those values are. Look at what has happened with the banks in the last few days: they have closed down people’s bank accounts because they disagree with those people’s values. Again, we say “Don’t target on race”, but I have been having lots of arguments with people recently who have accused the Government, through their Illegal Migration Bill, of being racist. I think we just need to know that we are not accepting an ideological filtering of what we see.
Amendment 63 is key because it requires providers’ terms of service to include provisions about how content to which Clause 12(2) applies is identified, precisely to try to counter these problems. It imposes a duty on providers to apply those provisions consistently, as the noble Lord, Lord Moylan, explained. The point that providers have to set out how they identify content that is allegedly hostile, for example, to religion, or racially abusive, is important because this is about empowering users. Users need to know whether this will be done by machine learning or by a human. Do they look for red flags and, if so, what are the red flags? How are these things decided? That means that providers have to state clearly, and be accountable for, any criteria that could justify them filtering out and disturbing the flow of democratic information. It is all about transparency and accountability in that sense.
Finally, in relation to Amendment 183, I am worried about the notion of filtering out content from unverified users, for a range of reasons. It suggests that there is a direct link between being unverified or anonymous and being harmful or dodgy, which I think is illegitimate. It has already been explained that there will be a detrimental impact on certain organisations—we have talked about Reddit, but I like to remember Mumsnet. There are quite a lot of organisations with community-centred models, where the structure is that influencers broadcast to their followers and where there are pseudonymous users. Is the requirement to filter out those contributors likely to lead to those models collapsing? I need to be reassured on this because I am not convinced at all. As has been pointed out, there will be a two-tier internet because those who are unable or unwilling to disclose their identity online or to be verified would be, or could be, shut out from public discussions. That is a very dangerous place to have ended up, even though I am sure it is not what the Government intend.
My Lords, I am grateful for the broad, if not universal, support for the amendments that we have brought forward following the points raised in Committee. I apologise for anticipating noble Lords’ arguments, but I am happy to expand on my remarks in light of what they have said.
My noble friend Lord Moylan raised the question of the non-verified user duties and crowdsourced platforms. The Government recognise concerns about how the non-verified user duties will work with different functionalities and platforms, and we have engaged extensively on this issue. These duties are applicable only to category 1 platforms, those with the largest reach and influence over public discourse. It is therefore right that such platforms have additional duties to empower their adult users. We anticipate that these features will be used in circumstances where vulnerable adults wish to shield themselves from anonymous abuse. If users find that these features are restricting their experience on a particular platform, they can simply choose not to use them. In addition, before these duties come into force, Ofcom will be required to consult affected providers regarding the codes of practice, at which point it will consider how these duties might interact with various functionalities.
My noble friend and the noble Lord, Lord Allan of Hallam, raised the potential for being bombarded with pop-ups because of the forced-choice approach that we have taken. These amendments have been carefully drafted to minimise unnecessary prompts or pop-ups. That is why we have specified that the requirement to proactively ask users how they want these tools to be applied is applicable only to registered users. This approach ensures that users will be prompted to make a decision only once, unless they choose to ignore it. After a decision has been made, the provider should save this preference and the user should not be prompted to make the choice again.
The noble Lord, Lord Clement-Jones, talked further about his amendments on the cost of user empowerment tools as a core safety duty in the Bill. Category 1 providers will not be able to put the user empowerment tools in Clause 12 behind a paywall and still be compliant with their duties. That is because they will need to offer them to users at the first possible opportunity, which they will be unable to do if they are behind a paywall. The wording of Clause 12(2) makes it clear that providers have a duty to include user empowerment features that an adult user may use or apply.
The Minister may not have the information today, but I would be happy to get it in writing. Can he clarify exactly what will be expected of a service that already prohibits all the Clause 12 bad stuff in its terms of service?
I will happily write to the noble Lord on that.
Clause 12(4) further sets out that all such user empowerment content tools must be made available to all adult users and be easy to access.
The noble Lord, Lord Clement-Jones, on behalf of the noble Baroness, Lady Finlay, talked about people who will seek out suicide, self-harm or eating-disorder content. While the Bill will not prevent adults from seeking out legal content, it will introduce significant protections for adults from some of the most harmful content. The duties relating to category 1 services’ terms of service are expected hugely to improve companies’ own policing of their sites. Where this content is legal and in breach of the company’s terms of service, the Bill will force the company to take it down.
We are going even further by introducing a new user empowerment content-assessment duty. This will mean that where content relates to eating disorders, for instance, but is not illegal, category 1 providers will need fully to assess the incidence of this content on their service. They will need clearly to publish this information in accessible terms of service, so users will be able to find out what they can expect on a particular service. Alternatively, if they choose to allow suicide, self-harm or eating disorder content which falls within the definition set out in Clause 12, they will need proactively to ask users how they would like the user empowerment content features to be applied.
My noble friend Lady Morgan was right to raise the impact on vulnerable people or people with disabilities. While we anticipate that the changes we have made will benefit all adult users, we expect them particularly to benefit those who may otherwise have found it difficult to find and use the user empowerment content features independently—for instance, some users with certain types of disability. That is because the onus will now be on category 1 providers proactively to ask their registered adult users whether they would like these tools to be applied at the first possible opportunity. The requirement also remains to ensure that the tools are easy to access and to set out clearly what tools are on offer and how users can take advantage of them.
My Lords, does the Minister have any more to say on identity verification?
I am being encouraged to be brief so, if I may, I will write to the noble Lord on that point.
My Lords, I will speak to the government amendments now but not anticipate the non-government amendments in this group.
As noble Lords know, protecting children is a key priority for this Bill. We have listened to concerns raised across your Lordships’ House about ensuring that it includes the most robust protections for children, particularly from harmful content such as pornography. We also recognise the strength of feeling about ensuring the effective use of age-assurance measures, by which we mean age verification and age estimation, given the important role they will have in keeping children safe online.
I thank the noble Baroness, Lady Kidron, and my noble friends Lady Harding of Winscombe and Lord Bethell in particular for their continued collaboration over the past few months on these issues. I am very glad to have tabled a significant package of amendments on age assurance. These are designed to ensure that children are prevented from accessing pornography, whether it is published by providers in scope of the Part 5 duties or allowed by user-to-user services that are subject to Part 3 duties. The Bill will be explicit that services will need to use highly effective age verification or age estimation to meet these new duties.
These amendments will also ensure that there is a clear, privacy-preserving and future-proof framework governing the use of age assurance, which will be overseen by Ofcom. Our amendments will, for the first time, explicitly require relevant providers to use age verification or age estimation to protect children from pornography. Publishers of pornographic content, which are regulated in Part 5, will need to use age verification or age estimation to ensure that children are not normally able to encounter content which is regulated provider pornographic content on their service.
Further amendments will ensure that, where such tools are proactive technology, Ofcom may also require their use for Part 5 providers to ensure compliance. Amendments 279 and 280 make further definitional changes to proactive technology to ensure that it can be recommended or required for this purpose. To ensure parity across all regulated pornographic content in the Bill, user-to-user providers which allow pornography under their terms of service will also need to use age verification or age estimation to prevent children encountering pornography where they identify such content on their service. Providers covered by the new duties will also need to ensure that their use of these measures meets a clear, objective and high bar for effectiveness. They will need to be highly effective at correctly determining whether a particular user is a child. This new bar will achieve the intended outcome behind the amendments which we looked at in Committee, seeking to introduce a standard of “beyond reasonable doubt” for age assurance for pornography, while avoiding the risk of legal challenge or inadvertent loopholes.
To ensure that providers are using measures which meet this new bar, the amendments will also require Ofcom to set out, in its guidance for Part 5 providers, examples of age-verification and age-estimation measures which are highly effective in determining whether a particular user is a child. Similarly, in codes of practice for Part 3 providers, Ofcom will need to recommend age-verification or age-estimation measures which can be used to meet the new duty to use highly effective age assurance. This will meet the intent of amendments tabled in Committee seeking to require providers to use measures in a manner approved by Ofcom.
I confirm that the new requirement for Part 3 providers will apply to all categories of primary priority content that is harmful to children, not just pornography. This will mean that providers which allow content promoting or glorifying suicide, self-harm and eating disorders will also be required to use age verification or age estimation to protect children where they identify such content on their service.
Further amendments clarify that a provider can conclude that children cannot access a service—and therefore that the service is not subject to the relevant children’s safety duty—only if it uses age verification or age estimation to ensure that children are not normally able to access the service. This will ensure consistency with the new duties on Part 3 providers to use these measures to prevent children’s access to primary priority content. Amendment 34 inserts a reference to the new user empowerment duties imposed on category 1 providers in the child safety duties.
Amendment 214 will require Part 5 providers to publish a publicly available summary of the age-verification or age-estimation measures that they are using to ensure that children are not normally able to encounter content that is regulated provider pornographic content on their service. This will increase transparency for users on the measures that providers are using to protect children. It also aligns the duties on Part 5 providers with the existing duties on Part 3 providers to include clear information in terms of service on child protection measures or, for search engines, a publicly available statement on such measures.
I thank the noble Baroness, Lady Kidron, for her tireless work relating to Amendment 124, which sets out a list of age-assurance principles. This amendment clearly sets out the important considerations around the use of age-assurance technologies, which Ofcom must have regard to when producing its codes of practice. Amendment 216 sets out the subset of principles which apply to Part 5 guidance. Together, these amendments ensure that providers are deploying age-assurance technologies in an appropriate manner. These principles appear as a full list in Schedule 4. This ensures that the principles can be found together in one place in the Bill. The wider duties set out in the Bill ensure that the same high standards apply to both Part 3 and Part 5 providers. These principles have been carefully drafted to avoid restating existing duties in the Bill. In accordance with good legislative drafting practice, the principles also do not include reference to other legislation which already directly applies to providers. In its relevant guidance and codes, however, Ofcom may include such references as it deems appropriate.
Finally, I highlight the critical importance of ensuring that users’ privacy is protected throughout the age-assurance processes. I make it clear that privacy has been represented in these principles to the furthest degree possible, by referring to the strong safeguards for user privacy already set out in the Bill.
In recognition of these new principles and to avoid duplication, Amendment 127 requires Ofcom to refer to the age-assurance principles, rather than to the proactive technology principles, when recommending age-assurance technologies that are also proactive technology.
We have listened to the points raised by noble Lords about the importance of having clear and robust definitions in the Bill for age assurance, age verification and age estimation. Amendment 277 brings forward those definitions. We have also made it clear that self-declared age, without additional, more robust measures, is not to be regarded as age verification or age estimation for compliance with duties set out in the Bill. Amendment 278 aligns the definition of proactive technology with these new definitions.
The Government are clear that the Bill’s protections must be implemented as quickly as is feasible. This entails a complex programme of work for the Government and Ofcom, as well as robust parliamentary scrutiny of many parts of the regime. All of this will take time to deliver. It is right, however, that we set clear expectations for when the most pressing parts of the regulation—those targeting illegal content and protecting children—should be in place. These amendments create an 18-month statutory deadline from the day the Bill is passed for Ofcom’s implementation of those areas. By this point, Ofcom must submit draft codes of practice to the Secretary of State to be laid in Parliament and publish its final guidance relating to illegal content duties, duties about content harmful to children and duties about pornography content in Part 5. This also includes relevant cross-cutting duties, such as content reporting procedures, which are relevant to illegal content and content harmful to children.
In line with convention, most of the Bill’s substantive provisions will be commenced two months after Royal Assent. These amendments ensure that a set of specific clauses will commence earlier—on the day of Royal Assent—allowing Ofcom to begin vital implementation work sooner than it otherwise would have done. Commencing these clauses early will enable Ofcom to launch its consultation on draft codes of practice for illegal content duties shortly after Royal Assent.
Amendment 271 introduces a new duty on Ofcom to produce and publish a report on in-scope providers’ use of age-assurance technologies, and for this to be done within 18 months of the first date on which both Clauses 11 and 72(2), on pornography duties, are in force. I thank the noble Lord, Lord Allan of Hallam, for the amendment he proposed in Committee, to which this amendment responds. We believe that this amendment will improve transparency in how age-assurance solutions are being deployed by providers, and the effectiveness of those solutions.
Finally, we are also making a number of consequential and technical amendments to the Bill to split Clauses 11 and 25 into two parts. This is to ensure these do not become unwieldy and that the duties are clear for providers and for Ofcom. I beg to move.
(1 year, 4 months ago)
Lords Chamber
We began this group on the previous day on Report, and I concluded my remarks, so it is now for other noble Lords to contribute on the amendments that I spoke to on Thursday.
My Lords, I rise emphatically to welcome the government amendments in this group. They are a thoughtful and fulsome answer to the serious concerns expressed from the four corners of the Chamber by a great many noble Lords at Second Reading and in Committee about the treatment of age verification for pornography and online harms. For this, I express my profound thanks to my noble friend the Minister, the Secretary of State, the Bill team, the Ofcom officials and all those who have worked so hard to refine this important Bill. This is a moment when the legislative team has clearly listened and done everything it possibly can to close the gap. It is very much the House of Lords at its best.
It is worth mentioning the exceptionally broad alliance of noble Lords who have worked so hard on this issue, particularly my compadres, my noble friend Lady Harding, the noble Baroness, Lady Kidron, and the right reverend Prelate the Bishop of Oxford, who all signed many of the draft amendments. There are the Front-Benchers, including the noble Lords, Lord Stevenson, Lord Knight, Lord Clement-Jones and Lord Allan of Hallam, and the noble Baroness, Lady Merron. There are the Back-Benchers behind me, including my noble friends Lady Jenkin and Lord Farmer, the noble Lords, Lord Morrow, Lord Browne and Lord Dodds, and the noble Baroness, Lady Foster. Of those in front of me, there are the noble Baronesses, Lady Benjamin and Lady Ritchie, and there is also a number too large for me to mention, from all across the House.
I very much welcome the sense of pragmatism and proportionality at the heart of the Online Safety Bill. I welcome the central use of risk assessment as a vital tool for policy implementation and the recognition that some harms are worse than others, that some children need more protection than others, that we are legislating for future technologies that we do not know much about and that we must engage industry to achieve effective implementation. As a veteran of the Communications Act 2003, I strongly support the need for enabling legislation that has agility and a broad amount of support to stand the test of time.
My Lords, this has been a good debate, perhaps unfairly curtailed in terms of the range of voices we have heard, but I am sure the points we wanted to have on the table are there and we can use them in summarising the debate we have had so far.
I welcome the Government’s amendments in this group. They have gone a long way to resolving a number of the difficulties that were left after the Digital Economy Act. As the noble Lord, Lord Clement-Jones, has said, we now have Part 3 and Part 5 hooked together in a consistent and effective way and definitions of “age verification” and “age estimation”. The noble Lord, Lord Grade, is sadly not in his place today—I normally judge the quality of the debate by the angle at which he resides in that top corner there. He is not here to judge it, but I am sure he would be upright and very excited by what we have been hearing so far. His point about the need for companies to be clearly responsible for what they serve up through their services is really important in what we are saying here today.
However, despite the welcome links across to the ICO’s age-appropriate design code, given the concerns we have been expressing on privacy there are still a number of questions which I think the Minister will want to deal with, either today or in writing. Several noble Lords have raised the question of what “proportionate” means in this area; I have mentioned it in other speeches on other groups. We all want the overall system to be proportionate in the way in which it allocates powers, duties and responsibilities to the companies providing us with the services they do. But the question of whether children should have access to material which they should not get because of legal constraints is an exception, and I hope that “proportionate” is not being used in any sense to evade that.
I say that particularly because the concern has been raised in other debates—and I would be grateful if the Minister could make sure when he comes to respond that this issue is addressed—that smaller companies with less robust track records in terms of their income and expenditures might be able to plead that some of the responsibilities outlined in this section of the Bill do not apply to them because otherwise it would bear on their ability to continue. That would be a complete travesty of where we are trying to get to here, which is an absolute bar on children having access to material that is illegal or in the lists now in the Bill in terms of priority content.
The second worry that people have raised is: will the system that is set up here actually work in practice, particularly if it does not apply to all companies? That relates perhaps to the other half of the coin that I have just mentioned.
The third point, raised by a number of Peers, is: where does all this sit in relation to the review of pornography which was announced recently? A number of questions have been asked about issues which the Minister may be unable to respond to, but I suspect he may also want to write to us on the wider issue of timing and the terms of reference once they are settled.
I think we need to know this as we reach the end of the progress on this Bill, because you cannot expect a system being set up with the powers that are being given to Ofcom to work happily and well if Ofcom knows it is being reviewed at the same time. I hope that some consideration will be given to how we get the system up and running, even though the timescale is now tighter than it was, if at the same time a review rightly positioned to look at the wider range of pornography is going to impact on its work.
I want to end on the question raised by a large number of noble Lords: how does all this work sit with privacy? Where information and data are being shared on the basis of assuring access to services, there will be a worry if privacy is not ensured. The amendments tabled by the noble Baroness, Lady Kidron, are very salient to this. I look forward to the Minister’s response to them.
My Lords, I am sorry that the noble Baroness, Lady Benjamin, was unable to be here for the start of the debate on Thursday and therefore that we have not had the benefit of hearing from her today. I am very glad that she was here to hear the richly deserved plaudits from across the House for her years of campaigning on this issue.
I am very glad to have had the opportunity to discuss matters directly with her including, when it was first announced, the review that we have launched. I am pleased that she gave it a conditional thumbs up. Many of her points have been picked up by other noble Lords today. I did not expect anything more than a conditional thumbs up from her, given her commitment to getting this absolutely right. I am glad that she is here to hear some of the answers that I am able to set out, but I know that our discussions would have continued even if she had been able to speak today and that her campaigns on this important issue will not cease; she has been tireless in them. I am very grateful to her, my noble friends Lord Bethell and Lady Harding, the noble Baroness, Lady Kidron, and many others who have been working hard on this.
Let me pick up on their questions and those of the noble Baroness, Lady Ritchie of Downpatrick, and others on the review we announced last week. It will focus on the current regulatory landscape and how to achieve better alignment of online and offline regulation of commercial pornography. It will also look at the effectiveness of the criminal law and the response of the criminal justice system relating to pornography. This will focus primarily on the approach taken by law enforcement agencies and the Crown Prosecution Service, including considering whether changes to the criminal law would address the challenges identified.
The review will be informed by significant expert input from government departments across Whitehall, the Crown Prosecution Service and law enforcement agencies, as well as through consultation with the industry and with civil society organisations and regulators including, as the noble Baroness, Lady Ritchie, rightly says, some of the many NGOs that do important work in this area. It will be a cross-government effort. It will include but not be limited to input from the Ministry of Justice, the Home Office, the Department for Science, Innovation and Technology and my own Department for Culture, Media and Sport. I assure my noble friend Lord Farmer that other government departments will of course be invited to give their thoughts. It is not an exhaustive list.
I detected the enthusiasm for further details from noble Lords across the House. I am very happy to write as soon as I have more details on the review, to keep noble Lords fully informed. I can be clear that we expect the review to be complete within 12 months. The Government are committed to undertaking it in a timely fashion so that any additional safeguards for protecting UK users of online services can be put in place as swiftly as possible.
My noble friend Lord Bethell asked about international alignment and protecting Britain for investment. We continue to lead global discussions and engagement with our international partners to develop common approaches to online safety while delivering on our ambition to make the UK the safest place in the world to be online.
The noble Baroness, Lady Kidron, asked about the new requirements. They apply only to Part 3 providers, which allow pornography or other types of primary priority content on their service. Providers that prohibit this content under their terms of service for all users will not be required to use age verification or age estimation. In practice, we expect services that prohibit this content to use other measures to meet their duties, such as effective content moderation and user reporting. This would protect children from this content instead of requiring measures that would restrict children from seeing content that is not allowed on the service in the first place.
These providers can still use age verification and age estimation to comply with the existing duty to prevent children encountering primary priority content. Ofcom can still recommend age-verification and age-estimation measures in codes of practice for these providers where proportionate. On the noble Baroness’s second amendment, relating to Schedule 4, Ofcom may refer to the age-assurance principles set out in Schedule 4 in its children’s codes of practice.
On the 18-month timetable, I can confirm that 18 months is a backstop and not a target. Our aim is to have the regime in force as quickly as possible while making sure that services understand their new duties. Ofcom has set out in its implementation road map that it intends to publish draft guidance under Part 5 this autumn and draft children’s codes next spring.
The noble Baroness, Lady Ritchie, also asked about implementation timetables. I can confirm that Part 3 and Part 5 duties will be implemented at the same time. Ofcom will publish draft guidance shortly after Royal Assent for Part 5 duties and codes for the illegal content duties in Part 3. Draft codes for Part 3 children’s duties will follow in spring next year. Some Part 3 duties relating to category 1 services will be implemented later, after the categorisation thresholds have been set in secondary legislation.
The noble Lord, Lord Allan of Hallam, asked about interoperability. We have been careful to ensure that the Bill is technology neutral and to allow for innovation across the age-assurance market. We have also included a principle on interoperability in the new list of age-assurance principles in Schedule 4 and the Part 5 guidance.
At the beginning of the debate, on the previous day on Report, I outlined the government amendments in this group. There are some others, which noble Lords have spoken to. Amendments 125 and 217, from the noble Baroness, Lady Kidron, seek to add additional principles on user privacy to the new lists of age-assurance principles for both Part 3 and 5, which are brought in by Amendments 124 and 216. There are already strong safeguards for user privacy in the Bill. Part 3 and 5 providers will need to have regard to the importance of protecting users’ privacy when putting in place measures such as age verification or estimation. Ofcom will be required to set out, in codes of practice for Part 3 providers and in guidance for Part 5 providers, how they can meet these duties relating to privacy. Furthermore, companies that use age-verification or age-estimation solutions will need to comply with the UK’s robust data protection laws or face enforcement action.
Adding the proposed new principles would, we fear, introduce confusion about the nature of the privacy duties set out in the Bill. Courts are likely to assume that the additions are intended to mean something different from the provisions already in the Bill relating to privacy. The new amendments before your Lordships imply that privacy rights are unqualified and that data can never be used for more than one purpose, which is not the case. That would introduce confusion about the nature of—
My Lords, I apologise to the Minister. Can he write giving chapter and verse for that particular passage by reference to the contents of the Bill?
I am very happy to do that. That would probably be better than me trying to do so at length from the Dispatch Box.
Government Amendment 124 also reinforces the importance of protecting children’s privacy, including data protection, by ensuring that Ofcom will need to have regard to standards set out under Section 123 of the Data Protection Act 2018 in the age-appropriate design code. I hope that explains why we cannot accept Amendments 125 or 217.
The noble Baroness, Lady Fox, has Amendment 184 in this group and was unable to speak to it, but I am very happy to respond to it and the way she set it out on the Marshalled List. It seeks to place a new duty on Ofcom to evaluate whether internet service providers, internet-connected devices or individual websites should undertake user-identification and age-assurance checks. This duty would mean that such an evaluation would be needed before Ofcom produces guidance for regulated services to meet their duties under Clauses 16 and 72.
Following this evaluation, Ofcom would need to produce guidance on age-verification and age-assurance systems, which consider cybersecurity and a range of privacy considerations, to be laid before and approved by Parliament. The obligation for Ofcom to evaluate age assurance, included in the noble Baroness’s amendment, is already dealt with by Amendment 271, which the Government have tabled to place a new duty on Ofcom to publish a report on the effectiveness of age-assurance solutions. That will specifically include consideration of cost to business, and privacy, including the processing of personal data.
I just realised I forgot to thank the Government for Amendment 271, which reflected something I raised in Committee. I will reflect back to the Minister that, as is reinforced by his response now, it goes precisely where I wanted it to. That is to make sure—I have raised this many times—that we are not implementing another cookie banner, but are implementing something and then going back to ask, “Did it work as we intended? Were the costs proportionate to what we achieved?” I want to put on the record that I appreciate Amendment 271.
I appreciate the noble Lord’s interjection and, indeed, his engagement on this issue, which has informed the amendments that we have tabled.
In relation to the amendment of the noble Baroness, Lady Fox, as I set out, there are already robust safeguards for user privacy in the Bill. I have already mentioned Amendment 124, which puts age-assurance principles in the Bill. These require Ofcom to have regard, when producing its codes of practice on the use of age assurance, to the principle of protecting the privacy of users, including data protection. We think that the noble Baroness’s amendment is also unnecessary. I hope that she and the noble Baroness, Lady Kidron, will be willing to not move their amendments and to support the government amendments in the group.
There is always a simple question. We are in a bit of a mess—again. When I said at Second Reading that I thought we should try to work together, as was picked up by the noble Baroness in her powerful speech, to get the best Bill possible out of what we had before us, I really did not know what I was saying. Emotion caught me and I ripped up a brilliant speech which will never see the light of day and decided to wing it. I ended up saying that I thought we should do the unthinkable in this House—the unthinkable in politics, possibly—and try to work together to get the Bill to come right. As the noble Lord, Lord Clement-Jones, pointed out, I do not think I have ever seen, in my time in this House, so many government amendments setting out a huge number of what we used to call concessions. I am not going to call them concessions—they are improvements to the Bill. We should pay tribute to the Minister, who has guided his extensive team, who are listening anxiously as we speak, in the good work they have been doing for some time, while being questioned quite seriously about where it is taking us.
The noble Lord, Lord Clement-Jones, is quite right to pick up what the pre-legislative scrutiny committee said about this aspect of the work we are doing today and what is in the Bill. We have not really nailed the two big issues that social media companies raise: first, this amplification effect, where a single tweet—or thread, let us call it now—can go spinning around the world and gather support, comment, criticism, complaint, anger and all sorts of things that we probably do not really understand in the short period of time it takes to be read and reacted to. That amplification is not something we see in the real world; we do not really understand it and I am not quite sure we have got to the bottom of where we should be going at this stage.
The second most important point—the point we are stuck on at the moment; this rock, as it were, in the ocean—is the commercial pressure which, of course, drives the way in which companies operate. They are in it for the money, not the social purpose. They did not create public spaces for people to discuss the world because they think it is a good thing. There is no public service in this—this is a commercial decision to get as much money as possible from as many people as possible and, boy, are they successful.
But commercial pressures can cause harms; they create harms in ways that we have discussed, and the Bill reflects many of those. The narrow way in which the Bill describes content—which is meant to include many of the things we have been talking about today, such as the four Cs that have helpfully been brought into the debate in recent months—does not really deal with the commercial pressures under which people are placed because of the way in which they engage with social media. We do not think the Bill is as clear as it could be; nor does it achieve as much as it should in trying to deal with that issue.
That is in part to do with the structure. It is almost beyond doubt that the sensibility of what we are trying to achieve here is in the Bill, but it is there at such a level of opacity that it does not have the clarity of the messages we have heard today from those who have spoken about individuals—Milly and that sort of story—and the impact on people. Even the noble Lord, Lord Bethell, whose swimming exploits we must admire, is an unwitting victim of the drive of commercial pressures that sees him in his underwear at inappropriate moments in order that they should seek the profits from that. I think it is great, but I wonder why.
I want to set the Minister a task: to convince us, now that we are at the bar, that when he says that this matter is still in play, he realises what that must imply, and that he will give us a guarantee that we will gain from the additional time he seeks to get this to settle. There is a case, which I hope he will agree to, for having in the Bill an overarching statement about the need to separate out the harms that arise from content and the harms that arise from the system—the subject of the discussions and debates we have been having today—where content is absent. I suggest that, in going back to Clause 1, the overarching objectives clause, it might well be worth seeing whether that could be strengthened to cover this impact, so that the first thing one reads in the Bill is a sense that we embrace, understand and will act on this question of harm arising absent content. There is a case for putting into Clauses 10, 11, 25 and 82 the wording in Amendments 35, 36, 37A and 240, in the name of the noble Baroness, Lady Kidron, and for using those as a way of making sure that every aspect of the journey through which social media companies must go to fulfil the duties set out in the Bill by Ofcom reflects both the material content harms and the harms that arise from those companies’ design choices. Clauses 208 and 209 also have to provide a better consideration of how one describes harms, so that they are not always apparently linked to content.
That is a very high hurdle, particularly because my favourite topic of how this House works will be engaged. We have, technically, already passed Clause 1; an amendment was debated and approved, and now appears in versions of the Bill. We are about to finish with Clauses 10 and 11 today, so we are effectively saying to the Minister that he must accept that there are deficiencies in the amendments that have already been passed or would be, if we were to pass Amendments 35, 36, 37A, 85 and 240 in the name of the noble Baroness, Lady Kidron, and others. It is not impossible, and I understand that it would be perfectly reasonable, for the Government to bring back a series of amendments on Third Reading reflecting on the way in which the previous provisions do not fulfil the aspirations expressed all around the House, and therefore there is a need to change them. Given the series of conversations throughout this debate—my phone is red hot with the exchanges taking place, and we do not have a clear signal as to where that will end up—it is entirely up to the Minister to convince the House whether these discussions are worth it.
To vote on this when we are so close seems ridiculous, because I am sure that if there is time, we can make this work. But time is not always available, and it will be up to the Minister to convince us that we should not vote and up to the noble Baroness to decide whether she wishes to test the opinion of the House. We have a three-line Whip on, and we will support her. I do not think that it is necessary to vote, however—we can make this work. I appeal to the Minister to get over the bar and tell us how we are to do it.
My Lords, I am very grateful for the discussion we have had today and the parallel discussions that have accompanied it, as well as the many conversations we have had, not just over the months we have been debating the Bill but over the past few days.
I will turn in a moment to the amendments which have been the focus of the debate, but let me first say a bit about the amendments in this group that stand in my name. As noble Lords have kindly noted, we have brought forward a number of changes, informed by the discussions we have had in Committee and directly with noble Lords who have taken an interest in the Bill for a long time.
Government Amendments 281C, 281D, 281E and 281G relate to the Bill’s interpretation of “harm”, which is set out in Clause 209. We touched on that briefly in our debate on Thursday. The amendments respond to concerns which I have discussed with many across your Lordships’ House that the Bill does not clearly acknowledge that harm and risk can be cumulative. The amendments change the Bill to make that point explicit. Government Amendment 281D makes it clear that harm may be compounded in instances where content is repeatedly encountered by an individual user. That includes, but is not limited to, instances where content is repeatedly encountered as a result of algorithms or functionalities on a service. Government Amendment 281E addresses instances in which the combination of multiple functionalities on a service cumulatively drives up the risk of harm.
Those amendments go hand in hand with other changes that the Government have made on Report to strengthen protections for children. Government Amendment 1, for instance, which we discussed at the beginning of Report, makes it clear that services must be safe by design and that providers must tackle harms which arise from the design and operation of their service. Government Amendments 171 and 172 set out on the face of the Bill the categories of “primary priority” and “priority” content which is harmful to children to allow the protections for children to be implemented as swiftly as possible following Royal Assent. As these amendments demonstrate, the Government have indeed listened to concerns which have been raised from all corners of your Lordships’ House and made significant changes to strengthen the Bill’s protections for children. I agree that it has been a model of the way in which your Lordships’ House operates, and the Bill has benefited from it.
Let me turn to the amendments in the name of the noble Baroness, Lady Kidron. I am very grateful for her many hours of discussion on these specific points, as well as her years of campaigning which led to them. We have come a long way and made a lot of progress on this issue since the discussion at the start of Committee. The nature of online risk versus harm is one which we have gone over extensively. I certainly accept the points that the noble Baroness makes; I know how heartfelt they are and how they are informed by her experience sitting in courtrooms and in coroners’ inquests and talking to people who have had to be there because of the harms they or their families have encountered online. The Government are firmly of the view that it is indisputable that a platform’s functionalities, features or wider design are often the single biggest factor in determining whether a child will suffer harm. The Bill makes it clear that functions, features and design play a key role in the risk of harm occurring to a child online; I draw noble Lords’ attention to Clause 11(5), which makes it clear that the child safety duties apply across all areas of a service, including the way it is designed, operated and used, as well as content present on the service. That makes a distinction between the design, operation and use, and the content.
In addition, the Bill’s online safety objectives include that regulated services should be designed and operated so as to protect from harm people in the United Kingdom who are users of the service, including with regard to algorithms used by the service, functionalities of the services and other features relating to the operation of the service. There is no reference to content in this section, again underlining that the Bill draws a distinction.
This ensures that the role of functionalities is properly accounted for in the obligations on providers and the regulator, but I accept that noble Lords want this to be set out more clearly. Our primary aim must be to ensure that the regulatory framework can operate as intended, so that it can protect children in the way that they deserve and which we all want to see. Therefore, we cannot accept solutions that, however well meaning, may inadvertently weaken the Bill’s framework or allow providers to exploit legal uncertainty to evade their duties. We have come back to that point repeatedly in our discussions.
I beg to move.
Amendment 39 (to Amendment 38)
My Lords, as we have heard, this is a small group of amendments concerned with preventing size and lack of capacity being used as a reasonable excuse for allowing children to be unsafe. Part of the problem is the complexity of the Bill and the way it has been put together.
For example, Clause 11, around user-to-user services, is the pertinent clause and it is headed “Safety duties protecting children”. Clause 11(2) is preceded in italics with the wording “All services” so anyone reading it would think that what follows applies to all user-to-user services regardless of size. Clause 11(3) imposes a duty on providers
“to operate a service using proportionate systems and processes”
to protect children from harm. That implies that there will be judgment around what different providers can be expected to do to protect children; for example, by not having to use a particular unaffordable technical solution on age assurance if they can show the right outcome by doing things differently. That starts to fudge things a little.
The noble Lord, Lord Bethell, who introduced this debate so well with Amendment 39, supported by my noble friend Lady Ritchie, wants to be really sure that the size of the provider can never be used to argue that preventing all children from accessing porn is disproportionate and that a few children slipping through the net might just be okay.
The clarity of Clause 11 unravels even further at the end of the clause, where in subsection (12)(b) it reads that
“the size and capacity of the provider of a service”
is relevant
“in determining what is proportionate”.
The clause starts to fall apart at that point quite thoroughly in terms of anyone reading it being clear about what is supposed to happen.
Amendment 43 seeks to take that paragraph out, as we have heard from the noble Lord, Lord Russell, and would do the same for search in Amendment 87. I have added my name to these amendments because I fear that the ambiguity in the wording of this clause will give small and niche platforms an easy get out from ensuring that children are safe by design.
I use the phrase “by design” deliberately. We need to make a choice with this Bill even at this late stage. Is the starting point in the Bill children’s safety by design? Or is the starting point one where we do not want to overly disrupt the way providers operate their business first—which is to an extent how the speech from the noble Lord, Lord Allan, may have been heard—and then overlay children’s safety on top of that?
Yesterday, I was reading about how children access inappropriate and pornographic content, not just on Twitter, Instagram, Snapchat, TikTok and Pinterest but on Spotify and “Grand Theft Auto”—the latter being a game with an age advisory of “over 17” but which is routinely played by teenage children. Wherever we tolerate children being online, there are dangers which must be tackled. Listening to the noble Baroness, Lady Harding, took me to where a big chunk of my day job in education goes—children’s safeguarding. I regularly have to take training in safeguarding because of the governance responsibilities that I have. Individual childminders looking after one or two children have an assessment and an inspection around their safeguarding. In the real world we do not tolerate a lack of safety for children in this context. We should not tolerate it in the online world either.
The speech from the noble Lord, Lord Russell, reminded me of the breadcrumbing from big platforms into niche platforms that is part of the incel insight that he referenced. Content that is harmful to children can also be what some children are looking for, which keeps them engaged. Small, emergent services aggressively seeking growth could set their algorithms accordingly. They must not be allowed to believe that engaging but harmful content is okay until they reach the size at which they can afford the age-assurance technology which we might envisage in the Bill. I hope that the Minister shares our concerns and can help us with this problem.
My Lords, short debates can be helpful and useful. I am grateful to noble Lords who have spoken on this group.
I will start with Amendment 39, tabled by my noble friend Lord Bethell. Under the new duty at Clause 11(3)(a), providers which allow pornography or other forms of primary priority content under their terms of service will need to use highly effective age verification or age estimation to prevent children encountering it where they identify such content on their service, regardless of their size or capacity. While the size and capacity of providers is included as part of a consideration of proportionality, this does not mean that smaller providers or those with less capacity can evade the strengthened new duty to protect children from online pornography. In response to the questions raised by the noble Baronesses, Lady Ritchie of Downpatrick and Lady Kidron, and others, no matter how much pornographic content is on a service, where providers do not prohibit this content they would still need to meet the strengthened duty to use age verification or age estimation.
Proportionality remains relevant for the purposes of providers in scope of the new duty at Clause 11(3)(a) only in terms of the age-verification or age-estimation measures that they choose to use. A smaller provider with less capacity may choose to go for a less costly but still highly effective measure. For instance, a smaller provider with less capacity might seek a third-party solution, whereas a larger provider with greater capacity might develop their own solution. Any measures that providers use will need to meet the new high bar of being “highly effective”. If a provider does not comply with the new duties and fails to use measures which are highly effective at correctly determining whether or not a particular user is a child, Ofcom can take tough enforcement action.
The other amendments in this group seek to remove references to the size and capacity of providers in provisions relating to proportionality. The principle of proportionate, risk-based regulation is fundamental to the Bill’s regulatory framework, and we consider that the Bill as drafted already strikes the correct balance. The Bill ultimately will regulate a large number of services, ranging from some of the biggest companies in the world to smaller, voluntary organisations, as we discussed in our earlier debate on exemptions for public interest services.
The provisions regarding size and capacity recognise that what it is proportionate to require of companies of various sizes and business models will be different. Removing this provision would risk setting a lowest common denominator standard which does not create incentives for larger technology companies to do more to protect their users than smaller organisations. For example, it would not be proportionate for a large multinational company which employs thousands of content moderators and which invests in significant safety technologies to argue that it is required to take only the same steps to comply as a smaller provider which might have only a handful of employees and a few thousand UK users.
While the size and capacity of providers are included as part of a consideration of proportionality, let me be clear that this does not mean that smaller providers or those with less capacity do not need to meet the child safety duties and other duties in the Bill, such as the illegal content safety duties. These duties set out clear requirements for providers. If providers do not meet these duties, they will face enforcement action.
I hope that is reassuring to my noble friend Lord Bethell and to the other noble Lords with amendments in this group. I urge my noble friend to withdraw his amendment.
My Lords, I thank my noble friend the Minister for that reassurance. He put the points extremely well. I very much welcome his words from the Dispatch Box, which go a long way towards clarifying and reassuring.
This was a short and perfectly formed debate. I will not go on a tour d’horizon of everyone who has spoken, but I will mention the noble Lord, Lord Allan of Hallam. He is entirely right that no one wants gratuitously to hound out of the UK businesses that contribute to the economy and to our life here. There are good regulatory principles that should be applied by all regulators. The five regulatory principles of accountability, transparency, targeting, consistency and proportionality are all in the Legislative and Regulatory Reform Act 2006. Ofcom will embrace them and abide by them. That kind of reassurance is important to businesses as they approach the new regulatory regime.
I take on board what my noble friend the Minister said in terms of the application of regulations regardless of size or capacity, and the application of these strengthened duties, such as “highly effective”, regardless of any economic or financial capacity. I feel enormously reassured by what he has said. I beg leave to withdraw my amendment.
It is always nice to be nice to the Minister.
I will reference, briefly, the introduction of the amendments in the name of the noble Baroness, Lady Fraser of Craigmaddie, which I signed. They were introduced extremely competently, as you would expect, by my noble and learned kinsman Lord Hope. It is important to get the right words in the right place in Bills such as this. He is absolutely right to point out the need to be sure that we are talking about the right thing when we say “freedom of expression”—that we do mean that and not “freedom of speech”; we should not get them mixed up—and, also, to have a consistent definition that can be referred to, because so much depends on it. Indeed, this group might have run better and more fluently if we had started with this amendment, which would have then led into the speeches from those who had the other amendments in the group.
The noble Baroness is not present today, but for good news rather than bad: her daughter is graduating and she wanted to be present at that; it is only right that she should be. She will be back to pick up other aspects of the devolution issues she has been following very closely, and I will support her at that time.
The debate on freedom of expression was extremely interesting. It raised issues that, perhaps, could have featured more fully had this been timetabled differently, as both noble Lords who introduced amendments on this subject said. I will get my retaliation in first: a lot of what has been asked for will have been done. I am sure that the Minister will say that, if you look at the amendment to Clause 1, the requirement there is that freedom of expression is given priority in the overall approach to the Bill, and therefore, to a large extent, the requirement to restate it at various points in the Bill may not be necessary. But I will leave him to expand on that; I am sure that he will.
Other than that, the tension I referred to in an earlier discussion, in relation to what we are made to believe about the internet and the social media companies, is that we are seeing a true public square, in which expressions and opinions can be exchanged as freely and openly as they would be in a public space in the real world. But, of course, neither of those places really exists, and no one can take the analogy further than has been done already.
The change, which was picked up by the noble Baroness, Lady Stowell, in relation to losing “legal but harmful”, has precipitated an issue which will be left to social media companies to organise and police—I should have put “policing” in quotation marks. As the noble Baroness, Lady Kidron, said, the remedy for much of this will be an appeals mechanism that works both at the company level and for the issues that need rebalancing in relation to complexity or because they are not being dealt with properly. We will not know that for a couple of years, but at least that has been provided for and we can look forward to it. I look forward to the Minister’s response.
My Lords, I hope that the noble Baroness, Lady Fox, and my noble friend Lord Moylan do feel that they have been listened to. It was striking, in this debate, that they had support from all corners of your Lordships’ House. I know that, at various points in Committee, they may have felt that they were in a minority, but they have been a very useful and welcome one. This debate shows that many of the arguments that they have made throughout the passage of the Bill have resonated with noble Lords from across the House.
Although I have not signed amendments in the names of the noble Baroness and my noble friend Lord Moylan, in many cases it is not because I disagree with them but because I think that what they do is already covered in the Bill. I hope to reassure them of that in what I say now.
Amendments 77 to 81 from the noble Baroness, Lady Fox, would require services to have particular regard to freedom of expression and privacy when deciding on their terms of service. Services will already need to have particular regard to users’ rights when deciding on safety systems to fulfil their duties. These requirements will be reflected in providers’ terms of service, as a result of providers’ duties to set out their safety measures in their terms of service. The framework will also include a range of measures to allow scrutiny of the formulation, clarity and implementation of category 1 providers’ own terms of service.
However, there are some points on which we disagree. For instance, we do not think that it would be appropriate for all providers to have a general duty to have a particular regard to freedom of expression when deciding on their own terms of service about content. We believe that the Bill achieves the right balance. It requires providers to have regard to freedom of expression when carrying out their safety duties, and it enables public scrutiny of terms of service, while recognising providers’ own freedom of expression rights as private entities to set the terms of service that they want. It is of course up to adults to decide which services to use based on the way those services are drawn up and the way the terms of service set out what is permissible in them.
Nothing in the Bill restricts service providers’ ability to set their own terms and conditions for legal content accessed by adults—that is worth stressing. Ofcom will not set platforms’ terms and conditions, nor will it take decisions on whether individual pieces of content should, or should not, be on a platform. Rather, it will ensure that platforms set clear terms and conditions, so that adults know what to expect online, and ensure that platforms have systems and processes in place to enforce those terms and conditions themselves.
Amendment 226 from the noble Baroness, Lady Fox, would require providers to use all relevant information that is reasonably available to them whenever they make judgments about content under their terms of service; that is, where they have included or drafted those terms of service in compliance with duties in the Bill. Her amendment would modify an existing requirement in Clause 173, which already requires providers to take this approach whenever they implement a system or process to comply and that system makes judgments about certain content. For example, Clause 173 already covers content judgments made via systems and processes that a category 1 provider implements to fulfil its Clause 65 duties to enforce its own terms of service consistently. So we feel that Clause 173 is already broad enough to achieve the objectives that the noble Baroness, Lady Fox, seeks.
My noble friend Lord Moylan’s amendments seek to require Ofcom to have special regard to the importance of protecting freedom of expression when exercising its enforcement duties and when drafting codes or guidance. As we discussed in Committee, Ofcom has existing obligations to protect freedom of expression, and the Bill will include additional measures in this regard. We are also making additional amendments to underline the importance of freedom of expression. I am grateful to the noble and learned Lord, Lord Hope of Craighead, and my noble friend Lady Fraser of Craigmaddie for their work to define “freedom of expression” in the Bill. The Bill’s new overarching statement at Clause 1, as the noble Lord, Lord Stevenson, rightly pointed out, lists “freedom of expression”, signalling that it is a fundamental part of the Bill. That is a helpful addition.
Amendment 188 in the name of the noble Baroness, Lady Fox, seeks to disapply platforms’ Clause 65 duties when platforms’ terms of service restrict lawful expression, or expression otherwise protected by Article 10 of the European Convention on Human Rights. Her amendment would mean that category 1 providers’ Clause 65 duties to enforce clear, accessible terms of service in a consistent manner would not apply to any of their terms of service, where they are making their own decisions restricting legal content. That would greatly undermine the application of these provisions in the Bill.
Article 10 of the European Convention on Human Rights concerns individuals’ and entities’ rights to receive and impart ideas without undue interference by public authorities, not private entities. As such, it is not clear how a service provider deciding not to allow a certain type of content on its platform would engage the Article 10 rights of a user.
Beyond the legal obligations regarding the treatment of certain kinds of user-generated content imposed by this Bill and by other legislation, platforms are free to decide what content they wish, or do not wish, to have on their services. Provisions in the Bill will set out important duties to ensure that providers’ contractual terms on such matters are clear, accessible and consistently enforced.
My Lords, before my noble friend sits down, perhaps I could seek a point of clarification. I think I heard him say, at the beginning of his response to this short debate, that providers will be required to have terms of service which respect users’ rights. May I ask him a very straightforward question: do those rights include the rights conferred by Article 10 of the European Convention on Human Rights? Put another way, is it possible for a provider operating in the United Kingdom to have terms and conditions that abridge the rights conferred by Article 10? If it is possible, what is the Government’s defence of that? If it is not possible, what is the mechanism by which the Bill achieves that?
As I set out, I think my noble friend and the noble Baroness, Lady Fox, are not right to point to the European Convention on Human Rights here. That concerns individuals’ and entities’ rights
“to receive and impart ideas without undue interference”
by public authorities, not private entities. We do not see how a service provider deciding not to allow certain types of content on its platform would engage the Article 10 rights of the user, but I would be very happy to discuss this further with my noble friend and the noble Baroness in case we are talking at cross-purposes.
On that point specifically, having worked inside one of the companies, I can say that they fear legal action under all sorts of laws, but not under the European Convention on Human Rights. As the Minister explained, it is for public bodies; if people are going to take a case on Article 10 grounds, they will be taking it against a public body. There are lots of other grounds on which to go after a private company, but not ECHR compliance.
(1 year, 4 months ago)
Lords Chamber
My Lords, as I set out in Committee, the Government are bringing forward a package of amendments to address the challenges that bereaved parents and coroners have faced when seeking to access data after the death of a child.
These amendments have been developed after consultation with those who, so sadly, have first-hand experience of these challenges. I thank in particular the families of Breck Bednar, Sophie Parkinson, Molly Russell, Olly Stephens and Frankie Thomas for raising awareness of the challenges they have faced when seeking access to information following the heartbreaking cases involving their children. I am also grateful to the noble Baroness, Lady Kidron, for championing this issue in Parliament and more widely. I am very happy to say that she is supporting the government amendments in this group.
The loss of any life is heartbreaking, but especially so when it involves a child. These amendments will create a more straightforward and humane process for accessing data and will help to ensure that parents and coroners receive the answers they need in cases where a child’s death may be related to online harms. We know that coroners have faced challenges in accessing relevant data from online service providers, including information about a specific child’s online activity, where that might be relevant to an investigation or inquest. It is important that coroners can access such information.
As such, I turn first to Amendments 246, 247, 249, 250, 282, 283 and 287, which give Ofcom an express power to require information from regulated services about a deceased child’s online activity following a request from a coroner. This includes the content the child had viewed or with which he or she had engaged, how the content came to be encountered by the child, the role that algorithms and other functionalities played, and the method of interaction. It also covers any content that the child generated, uploaded or shared on the service.
Crucially, this power is backed up by Ofcom’s existing enforcement powers, so that, where a company refuses to provide information requested by Ofcom, it may be subject to enforcement action, including senior management liability. To ensure that there are no barriers to Ofcom sharing information with coroners, first, Amendment 254 enables Ofcom to share information with a coroner without the prior consent of a business to disclose such information. This will ensure that Ofcom is free to provide information it collects under its existing online safety functions to coroners, as well as information requested specifically on behalf of a coroner, where that might be useful in determining whether social media played a part in a child’s death.
Secondly, coroners must have access to online safety expertise, given the technical and fast-moving nature of the industry. As such, Amendment 273 gives Ofcom a power to produce a report dealing with matters relevant to an investigation or inquest, following a request from a coroner. This may include, for example, information about a company’s systems and processes, including how algorithms have promoted specific content to a child. To this end, the Chief Coroner’s office will consider issuing non-statutory guidance and training for coroners about social media as appropriate, subject to the prioritisation of resources. We are confident that this well-established framework provides an effective means to provide coroners with training on online safety issues.
It is also important that we address the lack of transparency from large social media services about their approach to data disclosure. Currently, there is no common approach to this issue, with some services offering memorialisation or contact-nomination processes, while others seemingly lack any formal policy. To tackle this, a number of amendments in this group will require the largest services—category 1, 2A and 2B services—to set out policies relating to the disclosure of data regarding the online activities of a deceased child in a clear, accessible and sufficiently detailed format in their terms of service. These companies will also be required to provide a written response to data requests in a timely manner and must provide a dedicated helpline, or similar means, for parents to communicate with the company, in order to streamline the process. This will address the painful radio silence experienced by many bereaved parents. The companies must also offer options so that parents can complain when they consider that a platform is not meeting its obligations. These must be easy to access, easy to use and transparent.
The package of amendments will apply not only to coroners in England and Wales but also to coroners in Northern Ireland and to equivalent investigations in Scotland, where similar sad events have occurred.
The Government will also address other barriers which are beyond the scope of this Bill. For example, we will explore measures to introduce data rights for bereaved parents who wish to request information about their deceased children through the Data Protection and Digital Information Bill. We are also working, as I said in Committee, with our American counterparts to clarify and, where necessary, address unintended barriers to information sharing created by the United States Stored Communications Act. I beg to move.
My Lords, I thank the Minister and indeed the Secretary of State for bringing forward these amendments in the fulsome manner that they have. I appreciate it, but I know that Bereaved Families for Online Safety also appreciates it. The Government committed to bringing forward these amendments on the last day in Committee, so they have been pre-emptively welcomed and discussed at some length. One need only read through Hansard of 22 June to understand the strength of feeling about the pain that has been caused to families and the urgent need to prevent others experiencing the horror faced by families already dealing with the loss of their child.
I will speak briefly on three matters only. First, I must once again thank bereaved families and colleagues in this House and in the other place for their tireless work in pressing this issue. This is one of those issues that does not allow for celebration. As I walked from the Chamber on 22 June, I asked one of the parents how they felt. They said: “It is too late for me”. It was not said in bitterness but in acknowledgement of their profound hurt and the failure of companies voluntarily to do what is obvious, moral and humane. I ask the Government to see the sense in the other amendments that noble Lords brought forward on Report to make children safer, and to bring to those the same pragmatic, thoughtful approach as they have to this group of amendments. It would make a huge difference.
Secondly, I need to highlight just one gap; I have written to the Secretary of State and the Minister on this. I find it disappointing that the Government did not find a way to require senior management to attend an inquest to give evidence. Given that the Government have agreed that senior managers should be subject to criminal liability under some circumstances, I do not understand their objections to summoning them to co-operate with legal proceedings. If a company submits information in response to Ofcom and at the coroner’s request the company’s senior management is invited to attend the inquest, it makes sense that someone should be required to appear to answer and follow up those questions. Again, on behalf of the bereaved families and specifically their legal representatives, who are very clear on the importance of this part of the regime, I ask the Government to reconsider this point and ask the Minister to undertake to speak to the department and the MoJ, if necessary, to make sure that, if senior managers are asked to attend court, they are mandated to do so.
Thirdly, I will touch on the additional commitments the Minister made beyond the Bill, the first of which is the upcoming Data Protection and Digital Information Bill. I am glad to report that some of the officials working on the Bill have already reached out, so I am grateful to the Minister that this is in train, but I expect it to include guidance for companies that will, at a minimum, cover data preservation orders and guidance about the privacy of other users in cases where a child has died. I think that privacy for other users is central to this being a good outcome for everybody, and I hope we are able to include that.
I am pleased to hear about the undertaking with the US regarding potential barriers, and I believe—and I would love to hear from the Minister—that the objective is to make a bilateral agreement that would allow data to be shared between the two countries in the case of a child’s death. It is a very specific requirement, not a wide-ranging one. I believe that, if we can do it on a bilateral basis, it would be easier than a broad attempt to change the Stored Communications Act.
I turn finally to training for coroners. I was delighted that the Chief Coroner made a commitment to consider issuing non-legislative guidance and training on social media for coroners, and by the offer of consultation with experts, including Ofcom, the ICO and bereaved families and their representatives, but this commitment was made subject to funding. I ask the Minister to agree to discuss routes to funding from the levy via Ofcom’s digital literacy duty. I have proposed an amendment to the government amendment that would make that happen, but I would welcome the opportunity to discuss it with the Minister. Coroners must feel confident in their understanding of the digital world, and I am concerned that giving coroners this new route to regulated companies via Ofcom without giving them training on how to use it may create a spectre of failure or further frustration and distress for bereaved families. I know there is not a person in the House who would want that to be the outcome of these welcome government amendments.
My Lords, I am grateful for the recognition of the work that has been done here, led by the noble Baroness, Lady Kidron, but involving many others, including officials who have worked to bring this package forward.
Noble Lords took the opportunity to ask a number of questions. The noble Baroness, Lady Kidron, asked about senior management liability. Ofcom will have extensive enforcement powers at its disposal if service providers do not comply with its information requests issued on behalf of a coroner. The powers will include the ability to hold senior managers criminally liable for non-compliance. Those powers are in line with Ofcom’s existing information-gathering powers in the Bill. Where Ofcom has issued an information request to a company, that company may be required to name a senior manager who is responsible for ensuring compliance with the requirements of the notice. If the named senior manager is found to have failed to comply with that information notice, or has failed to take all reasonable steps to prevent a failure to comply with the notice, that individual will be held personally liable and could be subject to imprisonment.
On the point about them not appearing in court, coroners have well-established powers to require senior managers to attend court. The enforcement powers available to Ofcom are in line with Ofcom’s existing information-gathering powers in the Bill. They do not extend to Ofcom requiring senior managers to appear in court as part of a coronial investigation. We do not think that would be appropriate for Ofcom, given that the coroner’s existing remit already covers this. The noble Baroness raised many specific instances that had come to her attention, and if she has specific examples of people not attending court that she would like to share with us and the Ministry of Justice, of course we would gladly follow those up.
The noble Lord, Lord Knight, rightly mentioned my noble friend Lady Newlove. I can reassure him that I have discussed this package of amendments with her, and had the benefit of her experience as a former Victims’ Commissioner.
On the training for coroners, which is an issue she raised, as did the noble Baroness, Lady Kidron, in her remarks just now, the Chief Coroner for England and Wales has statutory responsibility for maintaining appropriate arrangements for the training of coroners. That is of course independent of government, and exercised through the Judicial College, but the training is mandatory and the Chief Coroner is aware of the issues we are debating now.
The noble Lords, Lord Allan of Hallam and Lord Knight of Weymouth, raised the helpline for parents. Yes, we expect our approach of requiring a dedicated helpline or similar means will involve a human. As we say, we want a more humane process for those who need to use it; we think it would be more effective than requiring a company to provide a named individual contact. We touched on this briefly in Committee, where the point was raised, understandably, that staff turnover or people being absent on leave could mean that a requirement for a named individual would hinder the contact which families need.
The noble Lord, Lord Allan, also asked some questions about deaths of people other than a child. First, Ofcom’s report in connection with investigations into a death covers any coronial inquest, not just children. More broadly, of course, social media companies may have their own terms and conditions or policies in place setting out when they will share information after somebody has passed away. Companies based outside the UK may have to follow the laws of the jurisdiction in which they are based, which may limit the sharing of data without a court order. While we recognise the difficulty that refusing to disclose data may cause for bereaved relatives in other circumstances, the right to access must, of course, be balanced with the right to privacy. Some adult social media users may be concerned, for instance, about the thought of family members having access to information about their private life after their deaths, so there is a complexity here, as I know the noble Lord understands.
The noble Baroness, Lady Kidron, asked about data preservation orders. I am very glad that officials from another Bill team are already in touch with her, as they should be. As we set out in Committee, we are aware of the importance of data preservation to coroners and bereaved parents, and the Government agree with the principle of ensuring that those data are preserved. We will work towards a solution through the Data Protection and Digital Information Bill. My noble friend Lord Camrose—who is unable to be with us today, also for graduation reasons—and I will be happy to keep the House and all interested parties updated about our progress in resolving the issue of data preservation as we work through this complex problem.
The noble Lord, Lord Clement-Jones, asked about the Information Commissioner’s Office. We expect Ofcom to consult the ICO on all the guidance where its expertise will be relevant, including on providers’ new duties under these amendments. I am grateful, as I say, for the support that they have had and the recognition that this has been a long process since these issues were first raised in the pre-legislative committee. We believe that it is of the utmost importance that coroners and families can access information about a child’s internet use following a bereavement, and that companies’ responses are made in a humane and transparent way.
This group of amendments should be seen alongside the wider protections for children in the Bill, and I hope they will help bereaved parents to get the closure that they deserve. The noble Lord, Lord Allan, was right to pay tribute to how these parents, who have campaigned so bravely, have turned their grief and frustration into a determination to make sure that no other parents go through the sorts of ordeals that they have. That is both humbling and inspiring, and I am glad that the Bill can help to be a part of the change that they are seeking. I share my noble friend Lady Harding’s wish that it may bring them a modicum of calm. I beg to move.
My Lords, very briefly, I commend these two amendments. Again, the provenance is very clear; the Joint Committee said:
“This regulatory alignment would simplify compliance for businesses, whilst giving greater clarity to people who use the service, and greater protection to children.”
It suggested that the Information Commissioner’s Office and Ofcom should issue a joint statement on how these two regulatory systems will interact once the Online Safety Bill has been enacted. That still sounds eminently sensible, a year and a half later.
My Lords, Amendments 100 and 101 seek further to define the meaning of “significant” in the children’s access assessment, with the intention of aligning this with the meaning of “significant” in the Information Commissioner’s draft guidance on the age-appropriate design code.
I am grateful to the noble Baroness, Lady Kidron, for the way in which she has set out the amendments and the swiftness with which we have considered them. The test in the access assessment in the Bill is already aligned with the test in the code, which determines whether a service is likely to be accessed by children, in order to ensure consistency for all providers. The Information Commissioner’s Office has liaised with Ofcom on its new guidance on the likely to access test for the code, with the intention of aligning the two regulatory regimes while reflecting that they seek to do different things. In turn, the Bill will require Ofcom to consult the ICO on its guidance to providers, which will further support alignment between the tests. So, while we agree about the importance of alignment, we think that it is already catered for.
With regard to Amendment 100, Clause 30(4)(a) already states that
“the reference to a ‘significant’ number includes a reference to a number which is significant in proportion to the total number of United Kingdom users of a service”.
There is, therefore, already provision in the Bill for this being a significant number in and of itself.
On Amendment 101, the meaning of “significant” must already be more than insignificant by its very definition. The amendment also seeks to define “significant” with reference to the number of children using a service rather than seeking to define what is a significant number.
I hope that that provides some reassurance to the noble Baroness, Lady Kidron, and that she will be content to withdraw the amendment.
I am not sure that, at this late hour, I completely understood what the Minister said. On the basis that we are seeking to align, I will withdraw my amendment, but can we check that we are aligned, as my speech came directly from a note from officials that showed a difference? On that basis, I am happy to withdraw.
(1 year, 4 months ago)
Lords Chamber
My Lords, the amendments in this group consider regulatory accountability and the roles of Ofcom, the Government and Parliament in overseeing the new framework. The proposals include altering the powers of the Secretary of State to direct Ofcom, issue guidance to Ofcom and set strategic priorities. Ofcom’s operational independence is key to the success of this framework, but the regime must ensure that there is an appropriate level of accountability to government. Parliament will also have important functions, in particular scrutinising and approving the codes of practice which set out how platforms can comply with their duties and providing oversight of the Government’s powers.
I heard the strength of feeling expressed in Committee that the Bill’s existing provisions did not get this balance quite right and have tabled amendments to address this. Amendments 129, 134 to 138, 142, 143, 146 and 147 make three important changes to the power for the Secretary of State to direct Ofcom to modify a draft code of practice. First, these amendments replace the public policy wording in Clause 39(1)(a) with a more defined list of reasons for which the Secretary of State can make a direction. This list comprises: national security, public safety, public health and the UK’s international obligations. This is similar to the list set out in a Written Ministerial Statement made last July but omits “economic policy” and “burden to business”.
This closely aligns the reasons in the Bill with the existing power in Section 5 of the Communications Act 2003. The power is limited to those areas genuinely beyond Ofcom’s remit as a regulator and where the Secretary of State might have access to information or expertise that the regulator does not. Secondly, the amendments clarify that the power will be used only for exceptional reasons. As noble Lords know, this has always been our intent, and the changes we are tabling today put this beyond doubt. Thirdly, the amendments increase transparency regarding the use of the power by requiring the Secretary of State to publish details of a direction at the time the power is used. This will ensure that Parliament has advance sight of modifications to a code and, I hope, will address concerns that several directions could be made on a single code before Parliament became aware.
This group also considers Amendments 131 to 133, which create an 18-month statutory deadline for Ofcom to submit draft codes of practice to the Secretary of State to be laid in Parliament relating to illegal content, safety duties protecting children and other cross-cutting duties. These amendments sit alongside Amendment 230, which we debated on Monday and which introduced the same deadline for Ofcom’s guidance on Part 5 of the regime.
I am particularly grateful to my noble friend Lady Stowell of Beeston, with whom I have had the opportunity to discuss these amendments in some detail as they follow up points that she and the members of her committee gave particular attention to. I beg to move.
My Lords, I will speak to the amendments in this group in my name: Amendments 139, 140, 144 and 145. I thank the noble Lords, Lord Stevenson and Lord Clement-Jones, and the noble Viscount, Lord Colville, for signing those amendments and for their continued support on this group. I am also grateful to my noble friend the Minister and his team for engaging with me on the issue of Secretary of State powers. He has devoted a lot of time and energy to this, which is reflected in the wide-ranging group of amendments tabled by him.
Before I go any further, it is worth emphasising that the underlying concern here is making sure that we have confidence, through this new regulatory regime, that the Bill strikes the right balance of power between government, Parliament, the regulator and big tech firms. That is what the committee that I chair—the Communications and Digital Select Committee of your Lordships’ House—has focused on most in our consideration of the Bill. I should also say that the amendments in my name very much have the support of the committee.
These amendments relate to Clause 39, which is where the main issue lies in the context of Secretary of State powers, and we have three broad concerns. First, as it stood, the Bill handed the Secretary of State unprecedented powers to direct the regulator on pretty much anything. Secondly, these powers allowed the Government to conduct an infinite form of ping-pong with the regulator, enabling the Government to prevail in a dispute. Thirdly, this ping-pong could take place in private with no possibility of parliamentary oversight or being able to intervene, as would be appropriate in the event of a breakdown in the relationship between executive and regulator.
This matters because the Online Safety Bill creates a novel form of regulation for the internet and for what we can or cannot see online, in particular political speech, and it will apply long into the future. It is one thing for the current Government, whom I support, to say that they would never use the powers in this way. That is great but, as we know, current Governments cannot speak for whoever is in power in the generations to come, so it is important that we get this right.
As my noble friend said, he has brought forward amendments to Clause 39 that help to address this. I support him in and commend him for that. The original laundry list of powers to direct Ofcom has been shortened and now follows the precedent set out in the Communications Act 2003. The government amendments also say that the Secretary of State must now publish their directions to Ofcom, which will improve transparency, and once the code is agreed Ofcom will publish changes so that Parliament can see what changes have been made and why. These are all very welcome and, as I say, they go a long way to addressing some of our concerns, but two critical issues remain.
First, the Government retain an opt-out, which means that they do not have to publish their directions if the Secretary of State believes that doing so would risk
“national security or public safety”,
or international relations. However, those points are now the precise grounds on which the Secretary of State may issue a direction and, if history is any guide, there is a real risk that we will never hear about the directions because the Government have decided that they are a security issue.
My Amendments 139 and 140 would require the Secretary of State to at least notify Parliament of the fact that a direction has been issued and what broad topic it relates to. That would not require any details to be published, so it does not compromise security, but it does give assurance that infinite, secretive ping-pong is not happening behind the scenes. My noble friend spoke so quickly at the beginning that I was not quite sure whether he signalled anything, but I hope that he may be able to respond enthusiastically to Amendments 139 and 140.
Secondly, the Government still have powers for infinite ping-pong. I appreciate that the Government have reservations about capping the number of exchanges between the Secretary of State and Ofcom, but they must also recognise the concern that they appear to be preparing the ground for any future Government to reject infinitely the regulator’s proposals and therefore prevail in a dispute about a politically contentious topic. My Amendments 144 and 145 would clarify that the Government will have a legally binding expectation that they will use no more than the bare minimum number of directions to achieve the intent set out in their first direction.
The Government might think that adding this to the Bill is superfluous, but it is necessary in order to give Parliament and the public confidence about the balance of power in this regime. If Parliament felt that the Secretary of State was acting inappropriately, we would have sufficient grounds to intervene. As I said, the Government acknowledged in our discussions the policy substance of these concerns, and as we heard from my noble friend the Minister in introducing this group, there is an understanding on this. For his part, there is perhaps a belief that what they have done goes far enough. I urge him to reconsider Amendments 144 and 145, and I hope that, when he responds to the debate on this group, he can say something about not only Amendments 139 and 140 but the other two amendments that will give me some grounds for comfort.
My Lords, first, I have to say that, having read Hansard from last Thursday, I feel I should have drawn attention to my interests in the register that relate to the Jewish community. I apologise for not doing so at the time and am pleased to now put this on the record.
I will be brief, as noble Lords have already raised a number of very pertinent points, to which I know the Minister will want to respond. In this group of amendments, there is a very welcome focus on transparency, accountability and the role of Parliament, all of which are absolutely crucial to the success of the Bill. I am grateful to the Minister for his introduction and explanation of the impact of the proposed changes to the role of the Secretary of State and Ofcom, whose codes of practice will be, as the noble Viscount, Lord Colville, said, vitally important to the Bill. We very much welcome the amendments in the name of the noble Baroness, Lady Stowell, which identify the requirements of the Secretary of State. We also welcome the government amendments, which, along with the amendments by the noble Baroness, have been signed by my noble friend Lord Stevenson.
The amendments tabled in the name of the noble Lord, Lord Moylan, raise interesting points about the requirement to use the affirmative procedure, among other points. I look forward to the Minister’s response to that and other amendments. It would be helpful to hear from the Minister his thoughts on arrangements for post-legislative scrutiny. It would also be helpful to deliberations to understand whether there have been discussions on this between the usual channels.
My Lords, this is indeed an apposite day to be discussing ongoing ping-pong. I am very happy to speak enthusiastically and more slowly about my noble friend Lady Stowell of Beeston’s Amendments 139 and 140. We are happy to support those, subject to some tidying up at Third Reading. We agree with the points that she has made and are keen to bring something forward which would mean broadly that a statement would be laid before Parliament when the power to direct had been used. My noble friend Lady Harding characterised them as the infinite ping-pong question and the secretive ping-pong question; I hope that deals with the secretive ping-pong point.
My noble friend Lady Stowell’s other amendments focus on the infinite ping-pong question, and the power to direct Ofcom to modify a code. Her Amendments 139, 140, 144 and 145 seek to address those concerns: that the Secretary of State could enter into a private form of ping-pong with Ofcom, making an unlimited number of directions on a code to prevent it from ever coming before Parliament. Let me first be clear that we do not foresee that happening. As the amendments I have spoken to today show, the power can be used only when specific exceptional reasons apply. In that sense, we agree with the intent of the amendments tabled by my noble friend Lady Stowell. However, we cannot accept them as drafted because they rely on concepts—such as the “objective” of a direction—which are not consistent with the procedure for making a direction set out in the Bill.
The amendments I have brought forward mean that private ping-pong between the Secretary of State and Ofcom on a code is very unlikely to happen. Let me set out for my noble friend and other noble Lords why that is. The Secretary of State would need exceptional reasons for making any direction, and the Bill then requires that the code be laid before Parliament as soon as is reasonably practicable once the Secretary of State is satisfied that no further modifications to the draft are required. That does not leave room for the power to be used inappropriately. A code could be delayed in this way and in the way that noble Lords have set out only if the Secretary of State could show that there remained exceptional reasons once a code had been modified. This test, which is a very high bar, would need to be met each time. Under the amendments in my name, Parliament would also be made aware straightaway each time a direction was made, and when the modified code came before Parliament, it would now come under greater scrutiny using the affirmative procedure.
I certainly agree with the points that the noble Lord, Lord Allan, and others made that any directions should be made in as transparent a way as possible, which is why we have tabled these amendments. There may be some circumstances where the Secretary of State has access to information—for example, from the security services—the disclosure of which would have an adverse effect on national security. In our amendments, we have sought to retain the existing provisions in the Bill to make sure that we strike the right balance between transparency and protecting national security.
As the noble Lord mentioned, the Freedom of Information Act provides an additional route to transparency while also containing existing safeguards in relation to national security and other important areas. He asked me to think of an example of something that would be exceptional but not require that level of secrecy. By dropping economic policy and burden to business, I would point him to an example in those areas, but a concrete example evades me this afternoon. Those are the areas to which I would turn his attention.
Can the Minister confirm that the fact that a direction has been made will always be known to the public, even if the substance of it is not because it is withheld under the secrecy provision? In other words, will the public always have a before and after knowledge of the fact of the direction, even if its substance is absent?
Yes; that is right.
I hope noble Lords will agree that the changes we have made and that I have outlined today as a package mean that we have reached the right balance in this area. I am very grateful to my noble friend Lady Stowell—who I see wants to come in—for the time that she too has given this issue, along with members of her committee.
I am grateful to my noble friend for his constructive response to my Amendments 139 and 140. I am sure he will do me the honour of allowing me to see the Government’s reversioning of my amendments before they are laid so that we can be confident at Third Reading that they are absolutely in line with expectations.
Could I press my noble friend a little further on Amendments 144 and 145? As I understood what he said, the objection from within government is to the language in the amendments I have tabled—although, as my noble friend Lady Harding said, they are incredibly modest in nature.
I was not sure whether my noble friend was saying in his defence against accepting them that issuing a direction would have to be exceptional, and that that led to a need to clarify that this would be ongoing. Would each time there is a ping or a pong be exceptional? Forgive me, because it starts to sound a bit ridiculous when we get into this amount of detail, but it seems to me that the “exceptional” issue kicks in at the point where you issue the direction. Once you engage in a dialogue, “exceptional” is no longer really the issue. It is an odd defence against trying to limit the number of times you allow that dialogue to continue. Bearing in mind that he is willing to look again at Amendments 139 and 140, I wonder whether, between now and Third Reading, he would at least ask parliamentary counsel to look again at the language in my original amendment.
I am certainly happy to commit to showing my noble friend the tidying up we think necessary of the two amendments I said we are happy to accept ahead of Third Reading. On the others, as I said, the code could be delayed repeatedly only if the Secretary of State showed that there remained exceptional reasons once it had been modified, and that high bar would need to be met each time. So we do not agree with her Amendments 144 and 145 because of concerns about the drafting of my noble friend’s current amendment and because the government amendments we have brought forward cater for the scenario about which she is concerned. Her amendments would place a constraint on the Secretary of State not to give more directions than are necessary to achieve the objectives set out in the original direction, but they would not achieve the intent I think my noble friend has. The Bill does not require the direction to have a particular objective. Directions are made because the Secretary of State believes that modifications are necessary for exceptional reasons, and the direction must set out the reasons why the Secretary of State believes that a draft should be modified.
Through the amendments the Government have laid today, the direction would have to be for exceptional reasons relating to a narrower list and Parliament would be made aware each time a direction was made. Parliament would also have increased scrutiny in cases where a direction had been made under Clause 39(1)(a), because of the affirmative procedure. However, I am very happy to keep talking to my noble friend, as we will be on the other amendments, so we can carry on our conversation then if she wishes.
Let me say a bit about the amendments tabled by my noble friend Lord Moylan. His Amendment 218 would require the draft statement of strategic priorities laid before Parliament to be approved by resolution of each House. As we discussed in Committee, the statement of strategic priorities is necessary because future technological changes are likely to shape harms online, and the Government must have an avenue through which to state their strategic priorities in relation to these emerging technologies.
The Bill already requires the Secretary of State to consult Ofcom and other appropriate persons when preparing a statement. This provides an opportunity for consideration and scrutiny of a draft statement, including, for example, by committees of Parliament. This process, combined with the negative procedure, provides an appropriate level of scrutiny and is in line with comparable existing arrangements in the Communications Act in relation to telecommunications, the management of radio spectrum and postal services.
My noble friend’s other amendments would place additional requirements on the Secretary of State’s power to issue non-binding guidance to Ofcom about the exercise of its online safety functions. The guidance document itself does not create any statutory requirements—Ofcom is required only to have regard to the guidance—and on that basis, we do not agree that it is necessary to subject it to parliamentary approval as a piece of secondary legislation. As my noble friend Lady Harding of Winscombe pointed out, we do not require that in numerous other areas of the economy, and we do not think it necessary here.
Let me reassure my noble friend Lord Moylan on the many ways in which Parliament will be able to scrutinise the work of Ofcom. Like most other regulators, it is accountable to Parliament in how it exercises its functions. The Secretary of State is required to present its annual report and accounts before both Houses. Ministers from the devolved Administrations must also lay a copy of the report before their respective Parliament or Assembly. Ofcom’s officers can be required to appear before Select Committees to answer questions about its work; indeed, its chairman and chief executive appeared before your Lordships’ Communications and Digital Committee just yesterday. Parliament will also have a role in approving a number of aspects of the regulatory framework through its scrutiny of both primary and secondary legislation.
My Lords, the key question is this: why have these powers over social media when the Secretary of State does not have them over broadcast?
If I may, I will write to the noble Lord having reflected on that question further. We are talking here about the provisions set up in the Bill to deal with online harms; clearly, that is the focus here, which is why this Bill deals with that. I will speak to colleagues who look at other areas and respond further to the noble Lord’s question.
Let me reassure the noble Baroness, Lady Fox, that, through this Bill, both Ofcom and providers are being asked to have regard to freedom of expression. Ofcom already has obligations under the Human Rights Act to be bound by the European Convention on Human Rights, including Article 10 rights relating to freedom of expression. Through this Bill, user-to-user and search services will have to consider and implement safeguards for freedom of expression when fulfilling their duties. Those points are uppermost in our minds.
I am grateful for the support expressed by noble Lords for the government amendments in this group. Given the mixed messages of support and the continued work with my noble friend Lady Stowell of Beeston, I urge her not to move her amendments.
My Lords, as we discussed in Committee, the Bill contains strong protection for women and girls and places duties on services to tackle and limit the kinds of offences and online abuse that we know disproportionately affect them. His Majesty’s Government are committed to ensuring that women and girls are protected online as well as offline. I am particularly grateful to my noble friend Lady Morgan of Cotes for the thoughtful and constructive way in which she has approached ensuring that the provisions in the Bill are as robust as possible.
It is with my noble friend’s support that I am therefore pleased to move government Amendment 152. This will create a new clause requiring Ofcom to produce guidance that summarises, in one clear place, measures that can be taken to tackle the abuse that women and girls disproportionately face online. This guidance will relate to regulated user-to-user and search services and will cover content regulated under the Bill’s framework. Crucially, it will summarise the measures in the Clause 36 codes for Part 3 duties, namely the illegal and child safety duties. It will also include a summary of platforms’ relevant Part 4 duties—for example, relevant terms of service and reporting provisions. This will provide a one-stop shop for providers.
Providers that adhere to the codes of practice will continue to be compliant with the duties. However, this guidance will ensure that it is easy and clear for platforms to implement holistic and effective protections for women and girls across their various duties. Any company that says it is serious about protecting women and girls online will, I am sure, refer to this guidance when implementing protections for its users.
Ofcom will have the flexibility to shape the guidance in a way it deems most effective in protecting women and girls online. However, as outlined in this amendment, we expect that it will include examples of best practice for assessing risks of harm to women and girls from content and activity, and how providers can reduce these risks and emphasise provisions in the codes of practice that are particularly relevant to the protection of women and girls.
To ensure that this guidance is effective and makes a difference, the amendment creates a requirement on Ofcom to consult the Domestic Abuse Commissioner and the Victims’ Commissioner, among other people or organisations it considers appropriate, when it creates this guidance. Much like the codes of practice, this will ensure that the views and voices of experts on the issue, and of women, girls and victims, are reflected. This amendment will also require Ofcom to publish this guidance.
I am grateful to all the organisations that have worked with us and with my noble friend Lady Morgan to get to this point. I hope your Lordships will accept the amendment. I beg to move.
My Lords, I will speak very briefly to this amendment; I know that the House is keen to get on to other business today. I very much welcome the amendment that the Government have tabled. My noble friend the Minister has always said that they want to keep women and girls safe online. As has been referred to elsewhere, the importance of making our digital streets safer cannot be overestimated.
As my noble friend said, women and girls experience a disproportionate level of abuse online. That is now recognised in this amendment, although this is only the start, not the end, of the matter. I thank my noble friend and the Secretary of State for their engagement on this issue. I thank the chief executive and the chair of Ofcom. I also thank the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Gloucester, who I know cannot be here today, and the noble Lord, Lord Knight, who signed the original amendment that we discussed in Committee.
My noble friend has already talked about the campaigners outside the Chamber who wanted there to be specific mention of women and girls in the Bill. I thank Refuge, the 100,000 people who signed the End Violence Against Women coalition’s petition, BT, Glitch, Carnegie UK, Professor Lorna Woods, the NSPCC and many others who made the case for this amendment.
As my noble friend said, this is Ofcom guidance. It is not necessarily a code of practice, but it is still very welcome because it is broader than just the specific offences that the Government have legislated on, which I also welcome. As he said, this puts all the things that companies, platforms and search engines should be doing to protect women and girls online in one specific place. My noble friend mentioned holistic protection, which is very important.
There is no offline/online distinction these days. Women and girls should feel safe everywhere. I also want to say, because I know that my noble friend has had a letter, that this is not about saying that men and boys should not be safe online; it is about recognising the disproportionate levels of abuse that women and girls suffer.
I welcome the fact that, in producing this guidance, Ofcom will have to consult with the Domestic Abuse Commissioner and the Victims’ Commissioner and more widely. I look forward, as I am sure do all the organisations I just mentioned, to working with Ofcom on the first set of guidance that it will produce. It gives me great pleasure to have signed the amendment and to support its introduction.
My Lords, this very positive government amendment acknowledges that there is not equality when it comes to online abuse. We know that women are 27 times more likely than men to be harassed online, that two-thirds of women who report abuse to internet companies do not feel heard, and that three out of four women change their behaviour after receiving online abuse.
Like others, I am very glad to have added my name to support this amendment. I thank the Minister for bringing it before your Lordships’ House and for his introduction. It will place a requirement on Ofcom to produce and publish guidance for providers of Part 3 services in order to make online spaces safer for women and girls. As the noble Baroness, Lady Morgan, has said, while this is not a code of practice—and I will be interested in the distinction between the code of practice that was being called for and what we are expecting now—it would be helpful perhaps to know when we might expect to see it. As the noble Baroness, Lady Burt, just asked, what kind of timescale is applicable?
This is very much a significant step for women and girls, who deserve and seek specific protections because of the disproportionate amount of abuse received. It is crucial that the guidance take a holistic approach which focuses on prevention and tech accountability, and that it is as robust as possible. Can the Minister say whether he will be looking to the model of the Violence against Women and Girls Code of Practice, which has been jointly developed by a number of groups and individuals including Glitch, the NSPCC, 5Rights and Refuge? It is important that this be got right, that we see it as soon as possible and that all the benefits can be felt and seen.
I am very grateful to everyone for the support they have expressed for this amendment both in the debate now and by adding their names to it. As I said, I am particularly grateful to my noble friend Lady Morgan, with whom we have worked closely on it. I am also grateful for her recognition that men and boys also face harm online, as she rightly points out. As we discussed in Committee, this Bill seeks to address harms for all users but we recognise that women and girls disproportionately face harm online. As we have discussed with the noble Baroness, Lady Merron, women and girls with other characteristics such as women of colour, disabled women, Jewish women and many others face further disproportionate harm and abuse. I hope that Amendment 152 demonstrates our commitment to giving them the protection they need, making it easy and clear for platforms to implement protections for them across all the wide-ranging duties they have.
The noble Baroness, Lady Burt of Solihull, asked why it was guidance and not a code of practice. Ofcom’s codes of practice will set out how companies can comply with the duties and will cover how companies should tackle the systemic risks facing women and girls online. Stipulating that Ofcom must produce specific codes for multiple different issues could, as we discussed in Committee, create duplication between the codes, causing confusion for companies and for Ofcom.
As Ofcom said in its letter to your Lordships ahead of Report, it has already started the preparatory work on the draft illegal content and child sexual abuse and exploitation codes. If it were required to create a separate code relating to violence against women and girls, this preparatory work would need to be revised, so there would be the unintended—and, I think, across the House, undesired—consequence of slowing down the implementation of these vital protections. I am grateful for the recognition that we and Ofcom have had on that point.
Instead, government Amendment 152 will consolidate all the relevant measures across codes of practice, such as on illegal content, child safety and user empowerment, in one place, assisting platforms to reduce the risk of harm that women and girls disproportionately face.
On timing, at present Ofcom expects that this guidance will be published in phase 3 of the implementation of the Bill, which was set out in Ofcom’s implementation plan of 15 June. This is when the duties in Part 4 of the Bill, relating to terms of service and so on, will be implemented. The guidance covers the duties in Part 4, so for guidance to be comprehensive and have the most impact in protecting women and girls, it is appropriate for it to be published during phase 3 of the Bill’s implementation.
The noble Baroness, Lady Fox, mentioned the rights of trans people and the rights of people to express their views. As she knows, gender reassignment and religious or philosophical belief are both protected characteristics under the Equality Act 2010. Sometimes those are in tension, but they are both protected in the law.
With gratitude to all the noble Lords who have expressed their support for it, I commend the amendment to the House.
The Minister did not quite grasp what I said but I will not keep the House. Would he be prepared to accept recommendations for a broader consultation—or who do I address them to? It is important that groups such as the Women’s Rights Network and others, which suffer abuse because they say “I know what a woman is”, are talked to in a discussion on women and abuse, because that would be appropriate.
I am sorry—yes, the noble Baroness made a further point on consultation. I want to reassure her and other noble Lords that Ofcom has the discretion to consult whatever body it considers appropriate, alongside the Victims’ Commissioner, the Domestic Abuse Commissioner and the others I mentioned. Those consultees may not all agree. It is important that Ofcom takes a range of views but is able to consult whomever it wishes. As I mentioned previously, Ofcom and its officers can be scrutinised in Parliament through Select Committees and in other ways. The noble Baroness could take the matter up directly with Ofcom, or could avail herself of those routes for parliamentary scrutiny if she felt that her pleas were falling on deaf ears.
My Lords, I am grateful to the noble Lord, Lord Clement-Jones, for raising this; it is important. Clause 49(3)(a)(i) mentions content
“generated directly on the service by a user”,
which, to me, implies that it would include the actions of another user in the metaverse. Sub-paragraph (ii) mentions content
“uploaded to or shared on the service by a user”,
which covers bots or other quasi-autonomous virtual characters in the metaverse. As we heard, a question remains about whether any characters or objects provided by the service itself are covered.
A scenario—in my imagination anyway—would be walking into an empty virtual bar at the start of a metaverse service. This would be unlikely to be engaging: the attractions of indulging in a lonely, morose drink at that virtual bar are limited. The provider may therefore reasonably configure the algorithm to generate characters and objects that are engaging until enough users populate the service to make it interesting.
Of course, there is the much more straightforward question of gaming platforms. On Monday, I mentioned “Grand Theft Auto”, a game with an advisory age of 17—they are still children at that age—which is routinely accessed by younger children. Shockingly, an article that I read claimed that it can evolve into a pornographic experience, where the player becomes the character from a first-person angle and receives services from virtual sex workers, as part of the game design. So my question to the Minister is: does the Bill protect the user from these virtual characters interacting with users in virtual worlds?
I will begin with that. The metaverse is in scope of the Bill, which, as noble Lords know, has been designed to be technology neutral and future-proofed to ensure that it keeps pace with emerging technologies—we have indeed come a long way since the noble Lord, Lord Clement-Jones, the noble Lords opposite and many others sat on the pre-legislative scrutiny committee for the Bill. Even as we debate, we envisage future technologies that may come. But the metaverse is in scope.
The Bill will apply to companies that enable users to share content online or to interact with each other, as well as search services. That includes a broad range of services, such as websites, applications, social media services, video games and virtual reality spaces, including the metaverse.
Any service that enables users to interact, as the metaverse does, will need to conduct a child access test and, if it is likely to be accessed by children, will need to comply with the child safety duties. Content is broadly defined in the Bill as,
“anything communicated by means of an internet service”.
Where this is uploaded, shared or directly generated on a service by a user and able to be encountered by other users, it will be classed as user-generated content. In the metaverse, this could therefore include things like objects or avatars created by users. It would also include interactions between users in the metaverse such as chat—both text and audio—as well as images, uploaded or created by a user.
My Lords, I hope I am not interrupting the Minister in full flow. He has talked about users entirely. He has not yet got to talking about what happens where the provider is providing that environment—in exactly the way in which the noble Lord, Lord Knight, illustrated.
We talked about bots controlled by service providers before the noble Lord, Lord Knight, asked questions on this. The Bill is designed to make online service providers responsible for the safety of their users in light of harmful activities that their platforms might facilitate. Providers of a user-to-user service will need to adhere to their duties of care, which apply to all user-generated content present on their service. The Bill does not, however, regulate content published by user-to-user providers themselves. That is because the providers are liable for the content they publish on the service themselves. The one exception to this—as the noble Baroness, Lady Kidron, alluded to in her contribution—is pornography, which poses a particular risk to children and is regulated by Part 5 of the Bill.
I am pleased to reassure the noble Lord, Lord Clement- Jones, that the Bill—
I thank the noble Lord for giving way. The Minister just said that private providers will be responsible for their content. I would love to understand what mechanism makes a provider responsible for their content?
I will write to noble Lords with further information and will make sure that I have picked up correctly the questions that they have asked.
On Amendment 152A, which the noble Lord, Lord Clement-Jones, has tabled, I am pleased to assure him that the Bill already achieves the intention of the amendment, which seeks to add characters and objects that might interact with users in the virtual world to the Bill’s definition of user-generated content. Let me be clear again: the Bill already captures any service that facilitates online user-to-user interaction, including in the metaverse or other augmented reality or immersive online worlds.
The Bill broadly defines “content” as
“anything communicated by means of an internet service”,
so it already captures the various ways in which users may encounter content. Clause 211 makes clear that “encounter” in relation to content for the purposes of the Bill means to,
“read, view, hear or otherwise experience”
content. That definition extends to the virtual worlds which noble Lords have envisaged in their contributions. It is broad enough to encompass any way of encountering content, whether that be audio-visually or through online avatars or objects.
In addition, under the Bill’s definition of “functionality”,
“any feature that enables interactions of any description between users of the service”
will be captured. That could include interaction between avatars or interaction by means of an object in a virtual world. All in-scope services must therefore consider a range of functionalities as part of their risk assessment and must put in place any necessary measures to mitigate and manage any risks that they identify.
I hope that that provides some assurance to the noble Lord that the concerns that he has raised are covered, but I shall happily write on his further questions before we reach the amendment that the noble Baroness, Lady Finlay, rightly flagged in her contribution.
I thank the Minister. I feel that we have been slightly unfair because we have been asking questions about an amendment that we have not been able to table. The Minister has perfectly well answered the actual amendment itself and has given a very positive reply—and in a sense I expected him to say what he said about the actual amendment. But, of course, the real question is about an amendment that I was unable to table.
My Lords, as noble Lords know, His Majesty’s Government are committed to defending the invaluable role of a free media, and our online safety legislation must protect the vital role of the press in providing people with reliable and accurate information online. That is why we have included strong protections for recognised news publishers in the Bill.
Clause 49(9) and (10) set out what is considered “news publisher content” in relation to a regulated user-to-user service, while Clause 52 sets out that news publishers’ content is exempt from search services’ duties. The government amendments clarify minor elements of these exemptions and definitions. Given the evolving consumption habits for news, recognised news publishers might clip or edit content from their published or broadcast versions to cater to different audiences and platforms. We want to ensure that recognised news publisher content is protected in all its forms, as long as that content is created or generated by the news publishers themselves.
First, our amendments clarify that any video or audio content published or broadcast by recognised news publishers will be exempt from the Bill’s safety duties and will benefit from the news publisher appeals process, when shared on platforms in scope of the Bill. These amendments ensure that old terminology works effectively in the internet age. The amendments now also make it clear that any news publisher content that is clipped or edited by the publisher itself will qualify for the Bill’s protections when shared by third parties on social media. However, these protections will not apply when a third-party user modifies that content itself. This will ensure that the protections do not apply to news publisher content that has been edited by a user in a potentially harmful way.
The amendments make it clear that the Bill’s protections apply to links to any article, video or audio content generated by recognised news publishers, clipped or edited, and regardless of the form in which that content was first published or broadcast. Taken together, these amendments ensure that our online safety legislation protects recognised news publishers’ content as intended. I hope noble Lords will support them. I beg to move.
I reassure the noble Lord, Lord Stevenson, that he was right to sign the amendments; I am grateful that he did. I do not know whether it is possible to have a sense of déjà vu about debates that took place before one entered your Lordships’ House, but if so, I feel I have had it over the past hour. I am, however, glad to see the noble Lords, Lord Lipsey and Lord McNally, back in their places and that they have had the chance to express their views, which they were unable to do fully in Committee. I am grateful to noble Lords who have joined in that debate again.
At present, Amendment 159 would enable news publishers that are members of Impress, the sole UK regulator which has sought approval by the Press Recognition Panel, to benefit from the Bill’s protections for news publishers, without meeting the criteria set out in Clause 50(2). This would introduce a legislative advantage for Impress members over other news publishers. The amendment would, in effect, create strong incentives for publishers to join a specific press regulator. We do not consider that to be compatible with our commitment to a free press. To that end, as noble Lords know, we will repeal existing legislation that could have that effect, specifically Section 40 of the Crime and Courts Act 2013, through the media Bill, which was published recently.
Not only is creating an incentive for a publisher to join a specific regulator incompatible with protecting press freedom in the United Kingdom but it would undermine the aforementioned criteria. These have been drafted to be as robust as possible, with requirements including that organisations have publication of news as their principal purpose, that they are subject to a standards code and that their content is created by different persons. Membership of Impress, or indeed any other press regulator, does not and should not automatically ensure that these criteria are met.
Amendment 160 goes further by amending one of these criteria—specifically, the requirement for entities to be subject to a standards code. It would add the requirement that these standards codes be drawn up by a regulator such as Impress. This amendment would create further incentives for news publishers to join a press regulator if they are to benefit from the exclusion for recognised news publishers. This is similarly not compatible with our commitment to press freedom.
We believe the criteria set out in Clause 50 of the Bill are already sufficiently strong, and we have taken significant care to ensure that only established news publishers are captured, while limiting the opportunity for bad actors to benefit.
The noble Lord, Lord Allan, asked about protections against that abuse by bad actors. The Bill includes protections for journalism and news publishers, given the importance of a free press in a democratic society. However, it also includes safeguards to prevent the abuse of these protections by bad actors. Platforms will still be able to remove recognised news publisher content that breaches their terms and conditions as long as they notify recognised news publishers and offer a right of appeal first. This means that content will remain online while the appeal is considered, unless it constitutes a relevant offence under the Bill or the platform would incur criminal or civil liability by hosting it. This marks a significant improvement on the status quo whereby social media companies can remove journalistic content with no accountability and little recourse for journalists to appeal.
We are clear that sanctioned news outlets such as RT must not benefit from these protections. We are amending the criteria for determining which entities qualify as recognised news publishers explicitly to exclude entities that are subject to sanctions. The criteria also exclude any entity that is a proscribed organisation under the Terrorism Act 2000 or whose purpose is to support an organisation that is proscribed under that Act. To require Ofcom or another party to assess standards would be to introduce press regulation by the back door.
The noble Baroness, Lady Fox of Buckley, asked about protecting clipped or edited content. Given evolving news consumption habits, recognised news publishers may clip or edit content from their published or broadcast versions to cater to different audiences and to be used on different platforms. We want to ensure recognised news publisher content is protected in all its forms as long as that content is still created or generated by the news publisher. For example, if a broadcaster shares a link to its shorter, online-only version of a long-form TV news programme or documentary on an in-scope platform, this should still benefit from the protections that the Bill affords. The amendment that we have brought forward ensures that this content and those scenarios remain protected but removes the risk of platforms being forced to carry news publisher content that has been edited by a third party potentially to cause harm. I hope that clarifies that.
I am grateful to the noble Lord, Lord Lipsey, for making it clear that he does not intend to press his amendments to a Division, so I look forward to that. I am also grateful for the support for the Government’s amendments in this group.
My Lords, I rise very briefly to support the noble Baroness, Lady Merron, and to make only one point. As someone who has the misfortune of seeing a great deal of upsetting material of all kinds, I have to admit that it sears an image on your mind. I have had the misfortune to see the interaction of animal and human cruelty in the same sequences, again and again. In making the point that there is a harm to humans in witnessing and normalising this kind of material, I offer my support to the noble Baroness.
My Lords, Amendments 180 and 180A seek to require the Secretary of State to conduct a review of existing legislation and how it relates to certain animal welfare offences and, contingent on this review, to make them priority offences under the regulatory framework.
I am grateful for this debate on the important issue of protecting against animal cruelty online, and all of us in this House share the view of the importance of so doing. As the House has discussed previously, this Government are committed to strong animal welfare standards and protections. In this spirit, this Government recognise the psychological harm that animal cruelty content can cause to children online. That is why we tabled an amendment that lists content that depicts real or realistic serious violence or injury against an animal, including by fictional creatures, as priority content that is harmful to children. This was debated on the first day of Report.
In addition, all services will need proactively to tackle illegal animal cruelty content where this amounts to an existing offence such as extreme pornography. User-to-user services will be required swiftly to remove other illegal content that targets an individual victim once made aware of its presence.
The noble Baroness asked about timing. We feel it is important to understand how the protections against harm to animals already captured in the Bill will function before committing to the specific remedy proposed in the amendments.
As discussed in Committee, the Bill’s focus is rightly on ensuring that humans, in particular children, are protected online, which is why we have not listed animal offences in Schedule 7. As many have observed, this Bill cannot fix every problem associated with the internet. While we recognise the psychological harm that can be caused to adults by seeing this type of content, listing animal offences in Schedule 7 is likely to divert providers’ resources away from protecting humans online, which is the Bill’s main purpose.
However, I understand the importance of taking action on animal mistreatment when committed online, and I am sympathetic to the intention of these amendments. As discussed with the noble Baroness, Defra is confident that the Animal Welfare Act 2006 and its devolved equivalents can successfully bring prosecutions for the commission and action of animal torture when done online in the UK. These Acts do not cover acts of cruelty that take place outside the UK. I know from the discussion we have had in this House that there are real concerns that the Animal Welfare Act 2006 cannot tackle cross-border content, so I wish to make a further commitment today.
The Government have already committed to consider further how the criminal law can best protect individuals from harmful communications, alongside other communications offences, as part of changes made in the other place. To that end, we commit to include the harm caused by animal mistreatment communications as part of this assessment. This will then provide a basis for the Secretary of State to consider whether this offence should be added to Schedule 7 to the OSB via the powers in Clause 198. This work will commence shortly, and I am confident that this, in combination with animal cruelty content listed as priority harms to children, will safeguard users from this type of content online.
For the reasons set out, I hope the noble Baroness and the noble Lord will consider not pressing their amendments.
The Minister has not dealt with Amendment 180A at all.
That really is not good enough, if I may say so. Does the Minister not have any brief of any kind on Amendment 180A?
I am sorry if the noble Lord feels that I have not dealt with it at all.
The words “animal trafficking” have not passed his lips.
My Lords, I am sure the letter will be anticipated.
I am grateful to the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, for their support for Amendment 180. I appreciate the consideration that the Minister has given to the issue. I am in no doubt of his sympathy for the very important matters at stake here. However, he will not be surprised to hear that I am disappointed with the response, not least because, in the Minister’s proposal, a report will go to the Secretary of State and it will then be up to the Secretary of State whether anything happens, which really is not what we seek. As I mentioned at the outset, I would like to test the opinion of the House.
My Lords, child sexual exploitation or abuse is an abhorrent crime. Reporting allows victims to be identified and offenders apprehended. It is vital that in-scope companies retain the data included in reports made to the National Crime Agency. This will enable effective prosecutions and ensure that children can be protected.
The amendments in my name in this group will enable the Secretary of State to include in the regulations about the reporting of child sexual exploitation or abuse content a requirement for providers to retain data. This requirement will be triggered only by a provider making a report of suspected child sexual exploitation or abuse to the National Crime Agency. The provider will need to retain the data included in the report, along with any associated account data. This is vital to enabling prosecutions and to ensuring that children can be protected, because data in reports cannot be used as evidence. Law enforcement agencies request this data only when they have determined that the content is in fact illegal and that it is necessary to progress investigations.
Details such as the types of data and the period of time for which providers must retain this data will be specified in regulations. This will ensure that the requirement is future-proofed against new types of data and will prevent companies retaining types of data that may have become obsolete. The amendments will also enable regulations to include any necessary safeguards in relation to data protection. However, providers will be expected to store, process and share this personal data within the UK GDPR framework.
Regulations about child sexual exploitation or abuse reporting will undergo a robust consultation with relevant parties and will be subject to parliamentary scrutiny. This process will ensure that the regulations about retaining data will be well-informed, effective and fit for purpose. These amendments bring the child sexual exploitation and abuse reporting requirements into line with international standards. I beg to move.
My Lords, these seem very sensible amendments. I am curious about why they have arrived only at this stage, given this was a known problem and that the Bill has been drafted over a long period. I am genuinely curious as to why this issue has been raised only now.
On the substance of the amendments, it seems entirely sensible that, given that we are now going to have 20,000 to 25,000 regulated entities in scope, some of which will never have encountered child sexual exploitation or abuse material or understood that they have a legal duty in relation to it, it will be helpful for them to have a clear set of regulations that tell them how to treat their material.
Child sexual exploitation or abuse material is toxic in both a moral and a legal sense. It needs to be treated almost literally as toxic material inside a company, and sometimes that is not well understood. People feel that they can forward material to someone else, not understanding that in doing so they will break the law. I have had experiences where well-meaning people acting in a vigilante capacity sent material to me, and at that point you have to report them to the police. There are no ifs or buts. They have committed an offence in doing so. As somebody who works inside a company, your computer has to be quarantined and taken off and cleaned, just as it would be for any other toxic material, because we framed the law, quite correctly, to say that we do not want to offer people the defence of saying “I was forwarding this material because I’m a good guy”. Forwarding the material is a strict liability offence, so to have regulations that explain, particularly to organisations that have never dealt with this material, exactly how they have to deal with it in order to be legally compliant will be extremely helpful.
One thing I want to flag is that there are going to be some really fundamental cross-border issues that have to be addressed. In many instances of child sexual exploitation or abuse material, the material has been shared between people in different jurisdictions. The provider may not be in a UK jurisdiction, and we have got to avoid any conflicts of laws. I am sure the Government are thinking about this, but in drafting those regulations, what we cannot do, for example, is order a provider to retain data in a way that would be illegal in the jurisdiction from which it originates or in which it has its headquarters. The same would apply vice versa. We would not expect a foreign Government to order a UK company to act in a way that was against UK law in dealing with child sexual exploitation or abuse material. This all has to be worked out. I hope the Government are conscious of that.
I think the public interest is best served if the United Kingdom, the United States and the European Union, in particular, adopt common standards around this. I do not think there is anything between us in terms of how we would want to approach child sexual exploitation or abuse material, so the extent to which we end up having common legal standards will be extraordinarily helpful.
As a general matter, to have regulations that help companies with their compliance is going to be very helpful. I am curious as to how we have got there with the amendment only at this very late stage.
My Lords, from this side we certainly welcome these government amendments. I felt it was probably churlish to ask why it had taken until this late stage to comply with international standards, but that point was made very well by the noble Lord, Lord Allan of Hallam, and I look forward to the Minister’s response.
I am grateful to noble Lords for their support for these amendments and for their commitment, as expected, to ensuring that we have the strongest protections in the Bill for children.
The noble Lord, Lord Allan of Hallam, asked: why only now? It became apparent during the Government’s regular engagement with the National Crime Agency on issues such as this, engagement which he would expect, that these amendments would be necessary, so we are happy to bring them forward. They are vital amendments to enable law enforcement partners to prosecute offenders and keep children safe.
Reports received by the National Crime Agency are for intelligence only and so cannot be relied on as evidence. As a result, in some cases law enforcement agencies may be required to request that companies provide data in an evidential format. The submitted report will contain a limited amount of information from which law enforcement agencies will have to decide what action to take. Reporting companies may hold wider data that relate to the individuals featured in the report, which could allow law enforcement agencies to understand the full circumstances of the event or attribute identities to the users of the accounts.
The data retention period will provide law enforcement agencies with the necessary time to decide whether it is appropriate to request data in order to continue their investigations. I hope that explains the context of why we are doing this now and why these amendments are important ones to add to the Bill. I am very grateful for noble Lords’ support for them.
(1 year, 4 months ago)
Lords Chamber
My Lords, let me add to this miscellany by speaking to the government amendments that stand in my name as part of this group. The first is Amendment 288A, which we mentioned on the first group of amendments on Report because it relates to the new introductory clause, Clause 1, and responds to the points raised by the noble Lord, Lord Stevenson of Balmacara. I am very happy to say again that the Government recognise that people with multiple and combined characteristics suffer disproportionately online and are often at greater risk of harm. This amendment therefore adds a provision in the new interpretation clause, Clause 1, to put beyond doubt that all the references to people with “a certain characteristic” throughout the Bill include people with a combination of characteristics. We had a good debate about the Interpretation Act 1978, which sets that out, but we are happy to set it out clearly here.
In his Amendment 186A, my noble friend Lord Moylan seeks to clarify a broader issue relating to consumer rights and online platforms. He got some general support—certainly gratitude—for raising this issue, although there was a bit of a Committee-style airing of it and a mixture of views on whether this is the right way or the right place. The amendment seeks to make it clear that certain protections for consumers in the Consumer Rights Act 2015 apply when people use online services and do not pay for them but rather give up their personal data in exchange. The Government are aware that the application of the law in that area is not always clear in relation to free digital services and, like many noble Lords, express our gratitude to my noble friend for highlighting the issue through his amendment.
We do not think that the Bill is the right vehicle for attempting to provide clarification on this point, however. We share some of the cautions that the noble Lord, Lord Allan of Hallam, raised and agree with my noble friend Lady Harding of Winscombe that this is part of a broader question about consumer rights online beyond the services with which the Bill is principally concerned. It could be preferable that the principle that my noble friend Lord Moylan seeks to establish through his amendment should apply more widely than merely to category 1 services regulated under the Bill. I assure him that the Bill will create a number of duties on providers which will benefit users and clarify that they have existing rights of action in the courts. We discussed these new protections in depth in Committee and earlier on Report. He drew attention to Clause 65(1), which puts a requirement on all services, not just category 1 services, to include clear and accessible provisions in their terms of service informing users about their right to bring a claim for breach of contract. Therefore, while we are grateful, we agree with noble Lords who suggested that this is a debate for another day and another Bill.
Amendment 191A from the noble Baroness, Lady Kidron, would require Ofcom to issue guidance for coroners and procurators fiscal to aid them in submitting requests to Ofcom to exercise its power to obtain information from providers about the use of a service by a deceased child. While I am sympathetic to her intention, I do not think that her amendment is the right answer. It would be inappropriate for an agency of the Executive to issue guidance to a branch of the judiciary. As I explained in Committee, it is for the Chief Coroner to provide detailed guidance to coroners. This is written to assist coroners with the law and their legal duties and to provide commentary and advice on policy and practice.
The amendment tabled by the noble Baroness cuts across the role of the Chief Coroner and risks compromising the judicial independence of the coroner, as set out in the Constitutional Reform Act 2005. As she is aware, the Chief Coroner has agreed to consider issuing guidance to coroners on social media and to consider the issues covered in the Bill. He has also agreed to explore whether coroners would benefit from additional training, with the offer of consultation with experts including Ofcom and the Information Commissioner’s Office. I suggest that the better approach would be for Ofcom and the Information Commissioner’s Office to support the Chief Coroner in his consideration of these issues where he would find that helpful.
I agree with the noble Lord, Lord Allan, that coroners must have access to online safety expertise given the technical and fast-moving nature of this sector. As we have discussed previously, Amendment 273 gives Ofcom a power to produce a report dealing with matters relevant to an investigation or inquest following a request from a coroner which will provide that expertise. I hope that this reassures the noble Baroness.
I understand the report on a specific death, which is very welcome and part of the regime as we all see it. The very long list of things that the coroner may not know that they do not know, as I set out in the amendment, is the issue which I and other noble Lords are concerned about. If the Government could find a way to make that possible, I would be very grateful.
We are keen to ensure that coroners have access to the information and expertise that they need, while respecting both the independence of the judicial process, under which it is for coroners to decide what they do not know and would like to know more about, and the role of the Chief Coroner. It is a point that I have discussed a lot with the noble Baroness and with my noble friend Lady Newlove in her former role as Victims’ Commissioner. I am very happy to continue doing so because it is important that coroners have that access.
The noble Lord, Lord Stevenson, spoke to the amendments tabled by the noble Baroness, Lady Merron, about supposedly gendered language in relation to Clauses 141 and 157. As I made clear in Committee, I appreciate the intention—as does Lady Deben—of making clear that a person of either sex can perform the role of chairman, just as they can perform the role of ombudsman. We have discussed in Committee the semantic point there. The Government have used “chairman” here to be consistent with terminology in the Office of Communications Act 2002. I appreciate that this predates the Written Ministerial Statement which the noble Lord cited, but that itself made clear that the Government at the time recognised that in practice, parliamentary counsel would need to adopt a flexible approach to this change—for example, in at least some of the cases where existing legislation originally drafted in the former style is being amended.
The noble Lord may be aware of a further Written Ministerial Statement, made on 23 May last year, following our debates on gendered language on another Bill, when the then Lord President of the Council and Leader of the House of Commons said that the Office of the Parliamentary Counsel would update its drafting guidance in light of that. That guidance is still forthcoming. However, importantly, the term here will have no bearing on Ofcom’s decision-making on who would chair the advisory committees. It must establish that this could indeed be a person of either sex.
Amendment 253 seeks to enable co-operation, particularly via information-sharing, between Ofcom and other regulators within the UK. I reassure noble Lords that Section 393 of the Communications Act 2003 already includes provisions for sharing information between Ofcom and other regulators in the UK.
As has been noted, Ofcom already co-operates effectively with other domestic regulators. That has been strengthened by the establishment of the Digital Regulation Co-operation Forum. By promoting greater coherence, the forum helps to resolve potential tensions, offering clarity for people and the industry. It ensures collaborative work across areas of common interest to address complex problems. Its outputs have already delivered real and wide-ranging impacts, including landmark policy statements clarifying the interactions between digital regulatory regimes, research into cross-cutting issues, and horizon-scanning activities on new regulatory challenges. We will continue to assess how best to support collaboration between digital regulators and to ensure that their approaches are joined up. We therefore do not think that Amendment 253 is necessary.
My Lords, the Minister has not stated that there is a duty to collaborate. Is he saying that that is, in fact, the case in practice?
Yes, there is a duty, and the law should be followed. I am not sure whether the noble Lord is suggesting that it is not—
I am not sure that I follow the noble Lord’s question, but perhaps—
My Lords, the Minister is saying that, in practice, there is a kind of collaboration between regulators and that there is a power under the Communications Act, but is he saying that there is any kind of duty on regulators to collaborate?
If I may, I will write to the noble Lord setting that out; he has lost me with his question. We believe, as I think he said, that the forum has added to the collaboration in this important area.
The noble Baroness, Lady Finlay, raised important questions about avatars and virtual characters. The Bill broadly defines “content” as
“anything communicated by means of an internet service”,
meaning that it already captures the various ways through which users may encounter content. In the metaverse, this could therefore include things such as avatars or characters created by users. As part of user-to-user services’ risk assessments, providers will be required to consider more than just the risks relating to user-generated content, including aspects such as how the design and operation of their services, including their functionality and how they are used, might increase the risk of harm to children and the presence of illegal content. A user-to-user service will need to consider any feature which enables interaction of any description between users of the service when carrying out its risk assessments.
The Bill is focused on user-to-user and search services, as there is significant evidence to support the case for regulation based on the risk of harm to users and the current lack of regulatory and other accountability in this area. Hosting, sharing and the discovery of user-generated content and activity give rise to a range of online harms, which is why we have focused on those services. The Bill does not regulate content published by user-to-user service providers themselves; instead, providers are already liable for the content that they publish on their services themselves, and the criminal law is the most appropriate mechanism for dealing with services which publish illegal provider content.
The noble Baroness’s Amendment 275A seeks to require Ofcom to produce a wide-ranging report of behaviour facilitated by emerging technologies. As we discussed in Committee, the Government of course agree that Ofcom needs continually to assess future risks and the capacity of emerging technologies to cause harm. That is why the Bill already contains provisions which allow it to carry out broad horizon scanning, such as its extensive powers to gather information, to commission skilled persons’ reports and to require providers to produce transparency reports. Ofcom has already indicated that it plans to research emerging technologies, and the Bill will require it to update its risk assessments, risk profiles and codes of practice with the outcomes of this research where relevant.
As we touched on in Committee, Clause 56 requires regular reviews by Ofcom into the incidence of content that is harmful to children, and whether there should be changes to regulations setting out the kinds of content that are harmful to children. In addition, Clause 143 mandates that Ofcom should investigate users’ experience of regulated services, which are likely to cover user interactions in virtual spaces, such as the metaverse and those involving content generated by artificial intelligence.
I am most grateful to the Minister; perhaps I could just check something he said. There was a great deal of detail and I was trying to capture it. On the question of harms to children, we all understand that the harms to children are viewed more extensively than harms to others, but I wondered: what counts as unregulated services? The Minister was talking about regulated services. What happens if there is machine-generated content which is not generated by any user but by some random codes that are developed and then randomly incite problematic behaviours?
I am happy to provide further detail in writing and to reiterate the points I have made as it is rather technical. Content that is published by providers of user-to-user services themselves is not regulated by the Bill because providers are liable for the content they publish on the services themselves. Of course, that does not apply to pornography, which we know poses a particular risk to children online and is regulated through Part 5 of the Bill. I will set out in writing, I hope more clearly, for the noble Baroness what is in scope to reassure her about the way the Bill addresses the harms that she has rightly raised.
My Lords, this has indeed been a wide-ranging and miscellaneous debate. I hope that since we are considering the Bill on Report noble Lords will forgive me if I do not endeavour to summarise all the different speeches and confine myself to one or two points.
The first is to thank the noble Baroness, Lady Kidron, for her support for my amendment but also to say that having heard her argument in favour of her Amendment 191A, I think the difference between us is entirely semantic. Had she worded it so as to say that Ofcom should be under a duty to offer advice to the Chief Coroner, as opposed to guidance to coroners, I would have been very much happier with it. Guidance issued under statute has to carry very considerable weight and, as my noble friend the Minister said, there is a real danger in that case of an arm of the Executive, if you like, or a creature of Parliament—however one wants to regard Ofcom—interfering in the independence of the judiciary. Had she said “advice to the Chief Coroner and whoever is the appropriate officer in Scotland”, that would have been something I could have given wholehearted support to. I hope she will forgive me for raising that quibble at the outset, but I think it is a quibble rather than a substantial disagreement.
On my own amendment, I simply say that I am grateful to my noble friend for the brevity and economy with which he disposed of it. He was of course assisted in that by the remarks and arguments made by many other noble Lords in the House as they expressed their support for it in principle.
I think there is a degree of confusion about what the Bill is doing. There seemed to be a sense that somehow the amendment was giving individuals the right to bring actions in the courts against providers, but of course that already happens because that right exists and is enshrined in Clause 65. All the amendment would do is give some balance so that consumers actually had some protections in what is normally, in essence, an unequal contest: that of trying to ensure that a large company enforces the terms and contracts that it has written.
In particular, my amendment would give, as I think noble Lords know, the right to demand repeat performance—that is, in essence, the right to put things right, not monetary compensation—and it would frustrate any attempts by providers, in drafting their own terms and conditions, to limit their own liability. That is of course what they seek to do but the Consumer Rights Act frustrates them in their ability to do so.
We will say no more about that for now. With that, I beg leave to withdraw my amendment.
My Lords, transparency and accountability are at the heart of the regulatory framework that the Bill seeks to establish. It is vital that Ofcom has the powers it needs to require companies to publish online safety information and to scrutinise their systems and processes, particularly their algorithms. The Government agree about the importance of improving data sharing with independent researchers while recognising the nascent evidence base and the complexities of this issue, which we explored in Committee. We are pleased to be bringing forward a number of amendments to strengthen platforms’ transparency, which confer on Ofcom new powers to assess how providers’ algorithms work, which accelerate the development of the evidence base regarding researchers’ access to information and which require Ofcom to produce guidance on this issue.
Amendment 187 in my name makes changes to Clause 65 on category 1 providers’ duties to create clear and accessible terms of service and apply them consistently and transparently. The amendment tightens the clause to ensure that all the providers’ terms through which they might indicate that a certain kind of content is not allowed on its service are captured by these duties.
Amendment 252G is a drafting change, removing a redundant paragraph from the Bill in relation to exceptions to the legislative definition of an enforceable requirement in Schedule 12.
In relation to transparency, government Amendments 195, 196, 198 and 199 expand the types of information that Ofcom can require category 1, 2A and 2B providers to publish in their transparency reports. With thanks to the noble Lord, Lord Stevenson of Balmacara, for his engagement on this issue, we are pleased to table these amendments, which will allow Ofcom to require providers to publish information relating to the formulation, development and scope of user-to-user service providers’ terms of service and search service providers’ public statements of policies and procedures. This is in addition to the existing transparency provision regarding their application.
Amendments 196 and 199 would enable Ofcom to require providers to publish more information in relation to algorithms, specifically information about the design and operation of algorithms that affect the display, promotion, restriction, discovery or recommendation of content subject to the duties in the Bill. These changes will enable greater public scrutiny of providers’ terms of service and their algorithms, providing valuable information to users about the platforms that they are using.
As well as publicly holding platforms to account, the regulator must be able to get under the bonnet and scrutinise the algorithms’ functionalities and the other systems and processes that they use. Empirical tests are a standard method for understanding the performance of an algorithmic system. They involve taking a test data set, running it through an algorithmic system and observing the output. These tests may be relevant for assessing the efficacy and wider impacts of content moderation technology, age-verification systems and recommender systems.
Government Amendments 247A, 250A, 252A, 252B, 252C, 252D, 252E and 252F will ensure that Ofcom has the powers to enable it to direct and observe such tests remotely. This will significantly bolster Ofcom’s ability to assess how a provider’s algorithms work, and therefore to assess its compliance with the duties in the Bill. I understand that certain technology companies have voiced some concerns about these powers, but I reassure your Lordships that they are necessary and proportionate.
The powers will be subject to a number of safeguards. First, they are limited to viewing information. Ofcom will be unable to remotely access or interfere with the service for any other purpose when exercising the power. These tests would be performed offline, meaning that they would not affect the services’ provision or the experience of users. Assessing systems, processes, features and functionalities is the focus of the powers. As such, individual user data and content are unlikely to be the focus of any remote access to view information.
Additionally, the power can be used only where it is proportionate to do so in the exercise of Ofcom’s functions—for example, when investigating whether a regulated service has complied with relevant safety duties. A provider would have a right to bring a legal challenge against Ofcom if it considered that a particular exercise of the power was unlawful. Furthermore, Ofcom will be under a legal obligation to ensure that the information gathered from services is protected from disclosure, unless clearly defined exemptions apply.
The Bill contains no restriction on services making the existence and detail of the information notice public. Should a regulated service wish to challenge an information notice served to it by Ofcom, it would be able to do so through judicial review. In addition, the amendments place no restriction on the use of this power being made known to members of the public through a request, such as one under the Freedom of Information Act—noting that, under Section 393 of the Communications Act, Ofcom will not be able to disclose information it has obtained through its exercise of these powers without the provider’s consent, unless permitted for specific, defined purposes. These powers are necessary and proportionate and will ensure that Ofcom has the tools to understand features and functionalities and the risks associated with them, and therefore the tools to assess companies’ compliance with the Bill.
Finally, I turn to researchers’ access to data. We recognise the valuable work of researchers in improving our collective understanding of the issues we have debated throughout our scrutiny of the Bill. However, we are also aware that we need to develop the evidence base to ensure that any sharing of sensitive information between companies and researchers can be done safely and securely. To this end, we are pleased to table government Amendments 272B, 272C and 272D.
Government Amendment 272B would require Ofcom to publish its report into researcher access to information within 18 months, rather than two years. This report will provide the evidence base for government Amendments 272C and 272D, which would require Ofcom to publish guidance on this issue. This will provide valuable, evidence-based guidance on how to improve access for researchers safely and securely.
That said, we understand the calls for further action in this area. The Government will explore this issue further and report back to your Lordships’ House on whether further measures to support researchers’ access to data are required—and if so, whether they could be implemented through other legislation, such as the Data Protection and Digital Information Bill. I beg to move.
My Lords, Amendment 247B in my name was triggered by government Amendment 247A, which the Minister just introduced. I want to explain it, because the government amendment is quite late—it has arrived on Report—so we need to look in some detail at what the Government have proposed. The phrasing that has caused so much concern, which the Minister has acknowledged, is that Ofcom will be able to
“remotely access the service provided by the person”.
It is those words—“remotely access”—which are trigger words for anyone who lived through the Snowden disclosures, where everyone was so concerned about remote access by government agencies to precisely the same services we are talking about today: social media services.
My Lords, I am grateful to noble Lords for their contributions in this group. On the point made by the noble Lord, Lord Knight of Weymouth, on why we are bringing in some of these powers now, I say that the power to direct and observe algorithms was previously implicit within Ofcom’s information powers and, where a provider has UK premises, under powers of entry, inspection and audit under Schedule 12. However, the Digital Markets, Competition and Consumers Bill, which is set to confer similar powers on the Competition and Markets Authority and its digital markets unit, makes these powers explicit. We wanted to ensure that there was no ambiguity over whether Ofcom had equivalent powers in the light of that. Furthermore, the changes we are making ensure that Ofcom can direct and observe algorithmic assessments even if a provider does not have relevant premises or equipment in the UK.
I am grateful to the noble Lord, Lord Allan of Hallam, for inviting me to re-emphasise points and allay the concerns that have been triggered, as his noble friend Lord Clement-Jones put it. I am happy to set out again a bit of what I said in opening this debate. The powers will be subject to a number of safeguards. First, they are limited to “viewing information”. They can be used only where they are proportionate in the exercise of Ofcom’s functions, and a provider would have the right to bring a legal challenge against Ofcom if it considered that a particular exercise of the power was unlawful. Furthermore, Ofcom will be under a legal obligation to ensure that the information gathered from services is protected from disclosure, unless clearly defined exemptions apply.
These are not secret powers, as the noble Lord rightly noted. The Bill contains no restriction on services making the existence and detail of the information notice public. If a regulated service wished to challenge an information notice served to it by Ofcom, it would be able to do so through judicial review. I also mentioned the recourse that people have through existing legislation, such as the Freedom of Information Act, to give them safeguards, noting that, under Section 393 of the Communications Act, Ofcom will not be able to disclose information that it has obtained through its exercise of these powers without the provider’s consent unless that is permitted for specific, defined purposes.
The noble Lord’s Amendment 247B seeks to place further safeguards on Ofcom’s use of its new power to access providers’ systems remotely to observe tests. While I largely agree with the intention behind it, there are already a number of safeguards in place for the use of that power, including in relation to data protection, legally privileged material and the disclosure of information, as I have outlined. Ofcom will not be able to gain remote access simply for exploratory or fishing purposes, and indeed Ofcom expects to have conversations with services about how to provide the information requested.
Furthermore, before exercising the power, Ofcom will be required to issue an information notice specifying the information to be provided, setting out the parameters of access and why Ofcom requires the information, among other things. Following the receipt of an information notice, a notice requiring an inspection or an audit notice, if a company has identified that there is an obvious security risk in Ofcom exercising the power as set out in the notice, it may not be proportionate to do so. As set out in Ofcom’s duties, Ofcom must have regard to the principles under which regulatory activities should be proportionate and targeted only at cases where action is needed.
In line with current practice, we anticipate Ofcom will issue information notice requests in draft form to identify and address any issues, including in relation to security, before the information notice is issued formally. Ofcom will have a legal duty to exercise its remote access powers in a way that is proportionate, ensuring that undue burdens are not placed on businesses. In assessing proportionality in line with this requirement, Ofcom would need to consider the size and resource capacity of a service when choosing the most appropriate way of gathering information, and whether there was a less onerous method of obtaining the necessary information to ensure that the use of this power is proportionate. As I said, the remote access power is limited to “viewing information”. Under this power, Ofcom will be unable to interfere or access the service for any other purpose.
In practice, Ofcom will work with services during the process. It is required to specify, among other things, the information to be provided, which will set the parameters of its access, and why it requires the information, which will explain the link between the information it seeks and the online safety function that it is exercising or deciding whether to exercise.
As noble Lords know, Ofcom must comply with the UK’s data protection law. As we have discussed in relation to other issues, it is required to act compatibly with the European Convention on Human Rights, including Article 8 privacy rights. In addition, under Clause 91(7), Ofcom is explicitly prohibited from requiring the provision of legally privileged information. It will also be under a legal obligation to ensure that the information gathered from services is protected from disclosure unless clearly defined exemptions apply, such as those under Section 393(2) of the Communications Act 2003—for example, the carrying out of any of Ofcom’s functions. I hope that provides reassurance to the noble Lord, Lord Allan, and the noble Baroness, Lady Fox, who raised these questions.
I am grateful to the Minister. That was helpful, particularly the description of the process and the fact that drafts have to be issued early on. However, it still leaves open a couple of questions, one of which was very helpfully raised by the noble Lord, Lord Knight. We have in Schedule 12 this other set of protections that could be applied. There is a genuine question as to why this has been put in this place and not there.
The second question is to dig a little more into the question of what happens when there is a dispute. The noble Lord, Lord Moylan, pointed out that if you have created a backdoor then you have created a backdoor, and it is dangerous. If we end up in a situation where a company believes that what it is being asked to do by Ofcom is fundamentally problematic and would create a security risk, it will not be good enough to open up the backdoor and then have a judicial review. It needs to be able to say no at that stage, yet the Bill says that it could be committing a serious criminal offence by failing to comply with an information notice. We want some more assurances, in some form, about what would happen in a scenario where a company genuinely and sincerely believes that what Ofcom is asking for is inappropriate and/or dangerous and it wants not to have to offer it unless and until its challenge has been looked at, rather than having to offer it and then later judicially review a decision. The damage would already have been done by opening up an inappropriate backdoor.
A provider would have a right to bring a legal challenge against Ofcom if it considered that a particular exercise of the remote access power was unlawful. Given the serious nature of the issues under consideration, I am sure that would be looked at swiftly, but I will write to the noble Lord on the anticipated timelines while such a judicial review was pending.
Before the Minister sits down, to quote the way the Minister has operated throughout Report, there is consensus across the House that there are some concerns. The reason why there are concerns outside and inside the House on this particular amendment is that it is not entirely clear that those protections exist, and there are worries. I ask the Minister whether, rather than just writing, it would be possible to take this back to the department, table a late amendment and say, “Look again”. That has been done before. It is certainly not too late: if it was not too late to have this amendment then it is certainly not too late to take it away again and to adopt another amendment that provides some safeguards. Seriously, it is worth looking again.
I had not quite finished; the noble Baroness was quick to catch me before I sat down. I still have some way to go, but I will certainly take on board all the points that have been made on this group.
The noble Lord, Lord Knight, asked about Schedule 12. I will happily write with further information on that, but Schedule 12 is about UK premises, so it is probably not the appropriate place to deal with this, as we need to be able to access services in other countries. If there is a serious security risk then it would not necessarily be proportionate. I will write to him with further details.
I am grateful to the Minister for giving way so quickly. I think the House is asking him to indicate now that he will go away and look at this issue, perhaps with some of us, and that, if necessary, he would be willing to look at coming back with something at Third Reading. From my understanding of the Companion, I think he needs to say words to that effect to allow him to do so, if that is what he subsequently wants to do at Third Reading.
I am very happy to discuss this further with noble Lords, but I will reserve the right, pending that discussion, to decide whether we need to return to this at Third Reading.
Amendments 270 and 272, tabled by my noble friend Lady Fraser of Craigmaddie, to whom I am very grateful for her careful scrutiny of the devolved aspects of the Bill, seek to require Ofcom to include separate analyses of users’ online experiences in England, Wales, Scotland and Northern Ireland in the research about users’ experiences of regulated services and in Ofcom’s transparency reports. While I am sympathetic to her intention—we have corresponded on it, for which I am grateful—it is important that Ofcom has and retains the discretion to prioritise information requests that will best shed light on the experience of users across the UK.
My noble friend and other noble Lords should be reassured that Ofcom has a strong track record of using this discretion to produce data which are representative of people across the whole United Kingdom. Ofcom is committed to reflecting the online experiences of users across the UK and intends, wherever possible, to publish data at a national level. When conducting research, Ofcom seeks to gather views from a representative sample of the United Kingdom and seeks to set quotas that ensure an analysable sample within each of the home nations.
It is also worth noting the provisions in the Communications Act 2003 that require Ofcom to operate offices in each of the nations of the UK, to maintain advisory committees for each, and to ensure their representation on its various boards and panels—and, indeed, on the point raised by the noble Baroness, Lady Kidron, to capture the experiences of children and users of all ages. While we must give Ofcom the discretion it needs to ensure that the framework is flexible and remains future-proofed, I hope that I have reassured my noble friend that her point will indeed be captured, reported on and be able to be scrutinised, not just in this House but across the UK.
I am grateful to the Minister for giving way. My premise is that the reason Ofcom reports in a nation-specific way in broadcasting and in communications is because there is a high-level reference in both the Communications Act 2003 and the BBC charter that requires it to do so, because it feeds down into national quotas and so on. There is currently nothing of that equivalence in the Online Safety Bill. Therefore, we are relying on Ofcom’s discretion, whereas in the broadcasting and communications area we have a high-level reference to insisting that there is a breakdown by nation.
We think we can rely on Ofcom’s discretion, and point to its current practice. I hope that will reassure my noble friend that it will set out the information she seeks.
I was about to say that I am very happy to write to the noble Lord, Lord Stevenson, about the manner by which consent is given in Clause 53(5)(c), but I think his question is on something else.
I would be grateful if the Minister could repeat that immediately afterwards, when I will listen much harder.
Just to echo what the noble Baroness was saying, may we take it as an expectation that approaches that are signalled in legislation for broadcasting and communications should apply pari passu to the work of Ofcom in relation to the devolved Administrations?
Yes, and we can point to the current actions of Ofcom to show that it is indeed doing this already, even without that legislative stick.
I turn to the amendments in the name of my noble friend Lord Bethell and the noble Lord, Lord Clement-Jones, on researchers’ access to data. Amendment 237ZA would confer on the Secretary of State a power to make provisions about access to information by researchers. As my noble friend knows, we are sympathetic to the importance of this issue, which is why we have tabled our own amendments in relation to it. However, as my noble friend also knows, this is such a complex and sensitive area that we think it is premature to endow the Secretary of State with such broad powers to introduce a new framework. As we touched on in Committee, this is a complex and still nascent area, which is why it is different from the other areas to which the noble Lord, Lord Clement-Jones, pointed in his contribution.
The noble Baroness, Lady Harding, made the point that in other areas where the Minister has agreed to reviews or reports, there are backstop powers; for instance, on app stores. Of course, that was a negotiated settlement, so to speak, but why can the Minister not accede to that in the case of access for researchers, as he has with app stores? Indeed, there is one other example that escapes me, which the Minister has also agreed to.
We touched on the complexity of defining who and what is a researcher and making sure that we do not give rise to bad actors exploiting that. This is a complex area, as we touched on in Committee. As I say, the evidence base here is nascent. It is important first to focus on developing our understanding of the issues to ensure that any power or legislation is fit to address those challenges. Ofcom’s report will not only highlight how platforms can share data with researchers safely but will provide the evidence base for considering any future policy approaches, which we have committed to doing but which I think the noble Lord will agree are worthy of further debate and reflection in Parliament.
The benefit of having a period of time between the last day of Report on Wednesday and Third Reading is that that gives the Minister, the Bill team and parliamentary counsel the time to reflect on the kind of power that could be devised. The wording could be devised, and I would have thought that six weeks would be quite adequate for that, perhaps in a general way. After all, this is not a power that is immediately going to be used; it is a general power that could be brought into effect by regulation. Surely it is not beyond the wit to devise something suitable.
Sit down or stand up—I cannot remember.
I wonder whether the department has looked at the DSA and other situations where this is being worked out. I recognise that it takes a period of time, but it is not without some precedent that a pathway should be described.
We do not think that six weeks is enough time for the evidence base to develop sufficiently, our assessment being that to endow the Secretary of State with that power at this point is premature.
Amendment 262AA would require Ofcom to consider whether it is appropriate to require providers to take steps to comply with Ofcom’s researcher access guidance when including a requirement to take steps in a confirmation decision. This would be inappropriate because the researcher access provisions are not enforceable requirements; as such, compliance with them should not be subject to enforcement by the regulator. Furthermore, enforcement action may relate to a wide variety of very important issues, and the steps needed should be sufficient to address a failure to comply with an enforceable requirement. Singling out compliance with researcher access guidance alone risks implying that this will be adequate to address core failures.
Amendment 272AB would require Ofcom to give consideration to whether greater access to data could be achieved through legal requirements or incentives for regulated services. I reassure noble Lords that the scope of Ofcom’s report will already cover how greater access to data could be achieved, including through enforceable requirements on providers.
Amendment 272E would require Ofcom to take a provider’s compliance with Ofcom’s guidance on researcher access to data into account when assessing risks from regulated services and determining whether to take enforcement action and what enforcement action to take. However, we do not believe that this is a relevant factor for consideration of these issues. I hope noble Lords will agree that whether or not a company has enabled researcher access to its data should not be a mitigating factor against Ofcom requiring companies to deal with terrorism or child sexual exploitation or abuse content, for example.
On my noble friend Lord Bethell’s remaining Amendments 272BA, 273A and 273B, the first of these would require Ofcom to publish its report on researchers’ access to information within six months. While six months would not be deliverable given other priorities and the complexity of this issue, the government amendment to which I have spoken would reduce the timelines from two years to 18 months. That recognises the importance of the issue while ensuring that Ofcom can deliver the key priorities in establishing the core parts of the regulatory framework; for example, the illegal content and child safety duties.
Just on the timescale, one of the issues that we talked about in Committee was the fact that there needs to be some kind of mechanism created, with a code of practice with reference to data protection law and an approving body to approve researchers as suitable to take information; the noble Baroness, Lady Kidron, referred to the DSA process, which the European Union has been working on. I hope the Minister can confirm that Ofcom might get moving on establishing that. It is not dependent on there being a report in 18 months; in fact, you need to have it in place when you report in 18 months, which means you need to start building it now. I hope the Minister would want Ofcom, within its existing framework, to be encouraging the creation of that researcher approval body and code of practice, not waiting to start that process in 18 months’ time.
I will continue my train of thought on my noble friend’s amendments, which I hope will cover that and more.
My noble friend’s Amendment 273A would allow Ofcom to appoint approved independent researchers to access information. Again, given the nascent evidence base here, it is important to focus on understanding these issues before we commit to a researcher access framework.
Under the skilled persons provisions, Ofcom will already have the powers to appoint a skilled person to assess compliance with the regulatory framework; that includes the ability to leverage the expertise of independent researchers. My noble friend’s Amendment 273B would require Ofcom to produce a code of practice on access to data by researchers. The government amendments I spoke to earlier will require Ofcom to produce guidance on that issue, which will help to promote information sharing in a safe and secure way.
To the question asked by the noble Lord, Lord Allan: yes, Ofcom can start the process and do it quickly. The question here is really about the timeframe in which it does so. As I said in opening, we understand the calls for further action in this area.
I am happy to say to my noble friend Lord Bethell, to whom we are grateful for his work on this and the conversations we have had, that we will explore the issue further and report back on whether further measures to support researchers’ access to data are required and, if so, whether they can be implemented through other legislation, such as the Data Protection and Digital Information (No.2) Bill.
Before the Minister sits down—he has been extremely generous in taking interventions—I want to put on record my understanding of his slightly ambiguous response to Amendment 247A, so that he can correct it if I have got it wrong. My understanding is that he has agreed to go away and reflect on the amendment and that he will have discussions with us about it. Only if he then believes that it is helpful to bring forward an amendment at Third Reading will he do so.
Yes, but I do not want to raise the hopes of the noble Lord or others, with whom I look forward to discussing this matter. I must manage their expectations about whether we will bring anything forward. With that, I beg to move.
My Lords, the amendments in this group relate to provisions for media literacy in the Bill and Ofcom’s existing duty on media literacy under Section 11 of the Communications Act 2003. I am grateful to noble Lords from across your Lordships’ House for the views they have shared on this matter, which have been invaluable in helping us draft the amendments.
Media literacy remains a key priority in our work to tackle online harms; it is essential not only to keep people safe online but for them to understand how to make informed decisions which enhance their experience of the internet. Extensive work is currently being undertaken in this area. Under Ofcom’s existing duty, the regulator has initiated pilot work to promote media literacy. It is also developing best practice principles for platform-based media literacy measures and has published guidance on how to evaluate media literacy programmes.
While we believe that the Communications Act provides Ofcom with sufficient powers to undertake an ambitious programme of media literacy activity, we have listened to the concerns raised by noble Lords and understand the desire to ensure that Ofcom is given media literacy objectives which are fit for the digital age. We have therefore tabled the following amendments seeking to update Ofcom’s statutory duty to promote media literacy, in so far as it relates to regulated services.
Amendment 274B provides new objectives for Ofcom to meet in discharging its duty. The first objective requires Ofcom to take steps to increase the public’s awareness and understanding of how they can keep themselves and others safe when using regulated services, including building the public’s understanding of the nature and impact of harmful content online, such as disinformation and misinformation. To meet that objective, Ofcom will need to carry out, commission or encourage the delivery of activities and initiatives which enhance users’ media literacy in these ways.
It is important to note that, when fulfilling this new objective, Ofcom will need to increase the public’s awareness of the ways in which they can protect groups that disproportionately face harm online, such as women and girls. The updated duty will also compel Ofcom to encourage the development and use of technologies and systems that support users of regulated services to protect themselves and others. Ofcom will be required to publish a statement recommending ways in which others, including platforms, can take action to support their users’ media literacy.
Amendment 274C places a new requirement on Ofcom to publish a strategy setting out how it will fulfil its media literacy functions under Section 11, including the new objectives. Ofcom will be required to update this strategy every three years and report on progress made against it annually to provide assurance that it is fulfilling its duty appropriately. These reports will be supported by the post-implementation review of the Bill, which covers Ofcom’s media literacy duty in so far as it relates to regulated services. This will provide a reasonable point at which to establish the impact of Ofcom’s work, having given it time to take effect.
I am confident that, through this updated duty, Ofcom will be empowered to ensure that internet users become more engaged with media literacy and, as a result, are safer online. I hope that these amendments will find support from across your Lordships’ House, and I beg to move.
My Lords, I welcome this proposed new clause on media literacy and support the amendments in the names of the noble Lords, Lord Clement-Jones and Lord Knight of Weymouth. I will briefly press the Minister on two points. First, proposed new subsection (1C) sets out how Ofcom must perform its duty under proposed new subsection (1A), but it does not explicitly require Ofcom to work in partnership with existing bodies already engaged in and expert in provision of these kinds of activities. The potential for Ofcom to commission is explicit, but this implies quite a top-down relationship, not a collaboration that builds on best practice, enables scale-up where appropriate and generally avoids reinventing wheels. It seems like a wasted opportunity to fast-track delivery of effective programmes through partnership.
My second concern is that there is no explicit requirement to consider the distinct needs of specific user communities. In particular, I share the concerns of disability campaigners and charities that media literacy activities and initiatives need to take into account the needs of people with learning disabilities, autism and mental capacity issues, both in how activities are shaped and in how they are communicated. This is a group of people who have a great need to go online and engage, but we also know that they are at greater risk online. Thinking about how media literacy can be promoted, particularly among learning disability communities, is really important.
The Minister might respond by saying that Ofcom is already covered by the public sector equality duty and so is already obliged to consider the needs of people with protected characteristics when designing and implementing policies. But the unfortunate truth is that the concerns of the learning disability community are an afterthought in legislation compared with other disabilities, which are already an afterthought. The Petitions Committee in the other place, in its report on online abuse and the experience of disabled people, noted that there are multiple disabled people around the country with the skills and experience to advise government and its bodies but that there is a general unwillingness to engage directly with them. They are often described as hard to reach, which is kind of ironic because in fact most of these people use multiple services and so are very easy to reach, because they are on lots of databases and in contact with government bodies all the time.
The Minister may also point out that Ofcom’s duties in the Communications Act require it to maintain an advisory committee on elderly and disabled persons that includes
“persons who are familiar with the needs of persons with disabilities”.
But referring to an advisory committee is not the same as consulting people with disabilities, both physical and mental, and it is especially important to consult directly with people who may have difficulty understanding what is being proposed. Talking to people directly, rather than through an advisory committee, is very much the goal.
Unlike the draft Bill, which had media literacy as a stand-alone clause, the intention in this iteration is to deal with the issue by amending the Communications Act. It may be that in the web of interactions between those two pieces of legislation, my concerns can be set to rest. But I would find it very helpful if the Minister could confirm today that the intention is that media literacy programmes will be developed in partnership with—and build on best practice of—those organisations already delivering in this space and that the organisations Ofcom collaborates with will be fully inclusive of all communities, including those with disabilities and learning disabilities. Only in this way can we be confident that media literacy programmes will meet their needs effectively, both in content and in how they are communicated.
Finally, can the Minister confirm whether Ofcom considers people with lived experience of disability as subject matter experts on disability for the purpose of fulfilling its consultation duties? I asked this question during one of the helpful briefing sessions during the Bill’s progress earlier this year, but I did not get an adequate answer. Can the Minister clarify that for the House today?
My Lords, the Government have moved on this issue, and I very much welcome that. I am grateful to the Minister for listening and for the fact that we now have Section 11 of the Communications Act being brought into the digital age through the Government’s Amendments 274B and 274C. The public can now expect to be informed and educated about content-related harms, reliability and accuracy; technology companies will have to play their part; and Ofcom will have to regularly report on progress, and will commission and partner with others to fulfil those duties. That is great progress.
The importance of this was underscored at a meeting of the United Nations Human Rights Council just two weeks ago. Nada Al-Nashif, the UN Deputy High Commissioner for Human Rights, said in an opening statement that media and digital literacy empowered individuals and
“should be considered an integral part of education efforts”.
Tawfik Jelassi, the assistant director-general of UNESCO, in a statement attached to that meeting, said that
“media and information literacy was essential for individuals to exercise their right to freedom of opinion and expression”—
I put that in to please the noble Baroness, Lady Fox—and
“enabled access to diverse information, cultivated critical thinking, facilitated active engagement in public discourse, combatted misinformation, and safeguarded privacy and security, while respecting the rights of others”.
If only the noble Lord, Lord Moylan, was in his place to hear me use the word privacy. He continued:
“Together, the international community could ensure that media and information literacy became an integral part of everyone’s lives, empowering all to think critically, promote digital well-being, and foster a more inclusive and responsible global digital community”.
I thought those were great words, summarising why we needed to do this.
I am grateful to Members on all sides of the House for the work that they have done on media literacy. Part of repeating those remarks was that this is so much more about empowerment than it is about loading safety on to individuals, as the noble Baroness, Lady Kidron, rightly said in her comments.
Nevertheless, we want the Minister to reflect on a couple of tweaks. Amendment 269C in my name is around an advisory committee being set up within six months and in its first report assessing the need for a code on misinformation. I have a concern that, as the regime that we are putting in place with this Bill comes into place and causes some of the harmful content that people find engaging to be suppressed, the algorithms will go to something else that is engaging, and that something else is likely to be misinformation and disinformation. I have a fear that that will become a growing problem that the regulator will need to be able to address, which is why it should be looking at this early.
Incidentally, that is why the regulator should also look at provenance, as in Amendment 269AA from the noble Lord, Lord Clement-Jones. It was tempting in listening to him to see whether there was an AI tool that could trawl across all the comments that he has made during the deliberations on this Bill to see whether he has quoted the whole of the joint report—but that is a distraction.
My Amendment 269D goes to the need for media literacy on systems, processes and business models, not just on content. Time and again, we have emphasised the need for this Bill to be as much about systems as content. There are contexts where individual, relatively benign pieces of content can magnify if part of a torrent that then creates harm. The Mental Health Foundation has written to many of us to make this point. In the same way that the noble Baroness, Lady Bull, asked about ensuring that those with disability have their own authentic voice heard as these media literacy responsibilities are played out, so the Mental Health Foundation wanted the same kind of involvement from young people; I agree with both. Please can we have some reassurance that this will be very much part of the literacy duties on Ofcom and the obligations it places on service providers?
My Lords, I am grateful to noble Lords for their comments, and for the recognition from the noble Lord, Lord Knight, of the changes that we have made. I am particularly grateful to him for having raised media literacy throughout our scrutiny of this Bill.
His Amendments 269C and 269D seek to set a date by which the establishment of the advisory committee on misinformation and disinformation must take place and to set requirements for its first report. Ofcom recognises the valuable role that the committee will play in providing advice in relation to its duties on misinformation and disinformation, and has assured us that it will aim to establish the committee as soon as is reasonably possible, in recognition of the threats posed by misinformation and disinformation online.
Given the valuable role of the advisory committee, Ofcom has stressed how crucial it will be to have appropriate time to appoint the best possible committee. Seeking to prescribe a timeframe for its implementation risks impeding Ofcom’s ability to run the thorough and transparent recruitment process that I am sure all noble Lords want and to appoint the most appropriate and expert members. It would also not be appropriate for the Bill to be overly prescriptive on the role of the committee, including with regard to its first report, in order for it to maintain the requisite independence and flexibility to give us the advice that we want.
Amendment 269AA from the noble Lord, Lord Clement-Jones, seeks to add advice on content provenance to the duties of the advisory committee. The new media literacy amendments, which update Ofcom’s media literacy duties, already include a requirement for Ofcom to take steps to help users establish the reliability, accuracy and authenticity of content found on regulated services. Ofcom will have duties and mechanisms to be able to advise platforms on how they can help users to understand whether content is authentic; for example, by promoting tools that assist them to establish the provenance of content, where appropriate. The new media literacy duties will require Ofcom to take tangible steps to prioritise the public’s awareness of and resilience to misinformation and disinformation online. That may include enabling users to establish the reliability, accuracy and authenticity of content, but the new duties will not require the removal of content online; I am happy to reassure the noble Baroness, Lady Fox, on that.
The advisory committee is already required under Clause 141(4)(c) to advise Ofcom on its exercise of its media literacy functions, including its new duties relating to content authenticity. The Bill does not stipulate what tools service providers should use to fulfil their duties, but Ofcom will have the ability to recommend in its codes of practice that companies use tools such as provenance technologies to identify manipulated media which constitute illegal content or content that is harmful to children, where appropriate. Ofcom is also required to take steps to encourage the development and use of technologies that provide users with further context about content that they encounter online. That could include technologies that support users to establish content provenance. I am happy to reassure the noble Lord, Lord Clement-Jones, that the advisory committee will already be required to advise on the issues that he has raised in his amendment.
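As a minimal sketch of the kind of provenance tool mentioned above, the following assumes a simple signed manifest: the originator records a digest of the content and signs it, and a verifier later checks that the content still matches what was signed. The key, manifest format and names are hypothetical, and real provenance standards (such as C2PA-style schemes) are considerably richer; this illustrates only the underlying idea.

```python
import hashlib
import hmac

ORIGINATOR_KEY = b"hypothetical-shared-key"  # stands in for a real signing key

def make_manifest(content: bytes, origin: str) -> dict:
    """Record where the content came from and sign a digest of it."""
    digest = hashlib.sha256(content).hexdigest()
    signature = hmac.new(ORIGINATOR_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return {"origin": origin, "sha256": digest, "signature": signature}

def verify_provenance(content: bytes, manifest: dict) -> bool:
    """Check the content is unmodified and the manifest was validly signed."""
    digest = hashlib.sha256(content).hexdigest()
    expected = hmac.new(ORIGINATOR_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return (digest == manifest["sha256"]
            and hmac.compare_digest(expected, manifest["signature"]))

original = b"Authentic photograph, as captured"
manifest = make_manifest(original, origin="Example News Agency")
print(verify_provenance(original, manifest))                # True
print(verify_provenance(b"Manipulated version", manifest))  # False: provenance broken
```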
On media literacy more broadly, Ofcom retains its overall statutory duty to promote media literacy, which remains broad and non-prescriptive. The new duties in this Bill, however, are focused specifically on harm; that is because of the nature of the Bill, which seeks to make the UK the safest place in the world to be online and is necessarily focused on tackling harms. To ensure that Ofcom succeeds in the delivery of these new specific duties with regard to regulated services, it is necessary that the regulator has a clearly defined scope. Broadening the duties would risk overburdening Ofcom by making its priorities less clear.
The noble Baroness, Lady Bull—who has been translated to the Woolsack while we have been debating this group—raised media literacy for more vulnerable users. Under Ofcom’s existing media literacy programme, it is already delivering initiatives to support a range of users, including those who are more vulnerable online, such as people with special educational needs and people with disabilities. I am happy to reassure her that, in delivering this work, Ofcom is already working not just with expert groups including Mencap but with people with direct personal experiences of living with disabilities.
The noble Lord, Lord Clement-Jones, raised Ofsted. Effective regulatory co-ordination is essential for addressing the cross-cutting opportunities and challenges posed by digital technologies and services. Ofsted will continue to engage with Ofcom through its existing mechanisms, including engagement led by its independent policy team and those held with Ofcom’s online safety policy director. In addition to that, Ofsted is considering mechanisms through which it can work more closely with Ofcom where appropriate. These include sharing insights from inspections, in anonymised form, on areas of particular concern to Ofcom, which could entail reviews of its inspection bases and focus groups with inspectors. Ofsted is committed to working with Ofcom’s policy teams to work these plans up in more detail.
My Lords, could I ask the Minister a question? He has put his finger on one of the most important aspects of this Bill: how it will integrate with the Department for Education and all its responsibilities for schools. Again, talking from long experience, one of the worries is the silo mentality in Whitehall, which is quite often strongest in the Department for Education. Some real effort will be needed to make sure there is a crossover from the powers that Ofcom has to what happens in the classroom.
I hope what I have said about the way that Ofsted and Ofcom are working together gives the noble Lord some reassurance. He is right, and it is not just in relation to the Department for Education. In my own department, we have discussed in previous debates on media literacy the importance of critical thinking, equipping people with the sceptical, quizzical, analytic skills they need—which art, history and English literature do as well. The provisions in this Bill focus on reducing harm because the Bill is focused on making the UK the safest place to be online, but he is right that media literacy work more broadly touches on a number of government departments.
Amendment 274BA would require Ofcom to promote an understanding of how regulated services’ business models operate, how they use personal data and the operation of their algorithmic systems and processes. We believe that Ofcom’s existing duty under the Communications Act already ensures that the regulator can cover these aspects in its media literacy activities. The duty requires Ofcom to build public awareness of the processes by which material on regulated services is selected or made available. This enables Ofcom to address the platform features specified in this amendment.
The Government’s amendments include extensive new objectives for Ofcom, which apply to harmful ways in which a service is used as well as harmful content. We believe it important not to add further to this duty when the outcomes can already be achieved through the existing duty. We do not wish to limit, by implication, Ofcom’s media literacy duties in relation to other, non-regulated services.
We also judge that the noble Lord’s amendment carries a risk of confusing the remits of Ofcom and the Information Commissioner’s Office. UK data protection law already confers a right for people to be informed about how their personal data are being used, making this aspect of the amendment superfluous.
I do not believe that the Minister has dealt with the minimum standards issue.
I did deal with that point, though I do not think that the noble Lord was listening to it.
My Lords, I am grateful for the opportunity to set out the need for Clauses 158 and 159. The amendments in this group consider the role of government in two specific areas: the power for the Secretary of State to direct Ofcom about its media literacy functions in special circumstances and the power for the Secretary of State to issue non-binding guidance to Ofcom. I will take each in turn.
Amendment 219 relates to Clause 158, on the Secretary of State’s power to direct Ofcom in special circumstances. These include where there is a significant threat to public safety, public health or national security. This is a limited power to enable the Secretary of State to set specific objectives for Ofcom’s media literacy activity in such circumstances. It allows the Secretary of State to direct Ofcom to issue public statement notices to regulated service providers, requiring providers to set out the steps they are taking to address the threat. The regulator and online platforms are thereby compelled to take essential and transparent actions to keep the public sufficiently informed during crises. The powers ensure that the regulatory framework is future-proofed and well equipped to respond in such circumstances.
As the noble Lord, Lord Clement-Jones, outlined, I corresponded with him very shortly before today’s debate and am happy to set out a bit more detail for the benefit of the rest of the House. As I said to him by email, we expect the media literacy powers to be used only in exceptional circumstances, where it is right that the Secretary of State should have the power to direct Ofcom. The Government see the need for an agile response to risk in times of acute crisis, such as we saw during the Covid-19 pandemic or in relation to the war in Ukraine. There may be a situation in which the Government have access to information, through the work of the security services or otherwise, which Ofcom does not. This power enables the Secretary of State to make quick decisions when the public are at risk.
Our expectation is that, in exceptional circumstances, Ofcom would already be taking steps to address harm arising from the provision of regulated services through its existing media literacy functions. However, these powers will allow the Secretary of State to step in if necessary to ensure that the regulator is responding effectively to these sudden threats. It is important to note that, for transparency, the Secretary of State will be required to publish the reasons for issuing a direction to Ofcom in these circumstances. This requirement does not apply should the circumstances relate to national security, to protect sensitive information.
The noble Lord asked why we have the powers under Clause 158 when they do not exist in relation to broadcast media. We believe that these powers are needed with respect to social media because, as we have seen during international crises such as the Covid-19 pandemic, social media platforms can sadly serve as hubs for low-quality, user-generated information that is not required to meet journalistic standards, and that can pose a direct threat to public health. By contrast, Ofcom’s Broadcasting Code ensures that broadcast news, in whatever form, is reported with due accuracy and presented with due impartiality. Ofcom can fine, or ultimately revoke a licence to broadcast in the most extreme cases, if that code is breached. This means that regulated broadcasters can be trusted to strive to communicate credible, authoritative information to their audiences in a way that social media cannot.
We established in our last debate that the notion of a recognised news publisher will go much broader than a broadcaster. I put it to the Minister that we could end up in an interesting situation where one bit of the Bill says, “You have to protect content from these people because they are recognised news publishers”. Another bit, however, will be a direction to the Secretary of State saying that, to deal with this crisis, we are going to give a media literacy direction that says, “Please get rid of all the content from this same news publisher”. That is an anomaly that we risk setting up with these different provisions.
On the previous group, I raised the issue of legal speech that was labelled as misinformation and removed in the extreme situation of a public health panic. This was seemingly because the Government were keen that particular public health information was made available. Subsequently, we discovered that those things were not necessarily untrue and should not have been removed. Is the Minister arguing that this power is necessary for the Government to direct that certain things are removed on the basis that they are misinformation—in which case, that is a direct attempt at censorship? After we have had a public health emergency in which “facts” have been contested and shown to not be as black and white or true as the Government claimed, saying that the power will be used only in extreme circumstances does not fill me with great confidence.
I am happy to make it clear, as I did on the last group, that the power does not allow Ofcom to require platforms to remove content; it allows Ofcom only to require platforms to set out what they are doing in response to misinformation and disinformation—to make a public statement about what they are doing to tackle it. In relation to regulating news providers, we have brought the further amendments forward to ensure that those subject to sanctions cannot avail themselves of the special provisions in the Bill. Of course, the Secretary of State will be mindful of the law when issuing directions in the exceptional circumstances that these clauses set out.
While the Minister is describing that, can he explain exactly which media literacy power would be invoked by the kind of example I gave when I was introducing the amendment and in the circumstances he has talked about? Would he like to refer to the Communications Act?
It depends on the circumstances. I do not want to give one example for fear of being unnecessarily restrictive. In relation to the health misinformation and disinformation we saw during the pandemic, an example would be the suggestions of injecting oneself with bleach; that sort of unregulated and unhelpful advice is what we have in mind. I will write to the noble Lord, if he wants, to see what provisions of the Communications Act we would want invoked in those circumstances.
In relation to Clause 159, which is dealt with by Amendment 222, it is worth setting out that the Secretary of State guidance and the statement of strategic priorities have distinct purposes and associated requirements. The purpose of the statement of strategic priorities is to enable the Secretary of State to specifically set out priorities in relation to online safety. For example, in the future, it may be that changes in the online experience mean that the Government of the day wish to set out their high-level overarching priorities. In comparison, the guidance allows for clarification of what Parliament and Government intended in passing this legislation—as I hope we will—by providing guidance on specific elements of the Bill in relation to Ofcom’s functions. There are no plans to issue guidance under this power but, for example, we are required to issue guidance to Ofcom in relation to the fee regime.
On the respective requirements, the statement of strategic priorities requires Ofcom to explain in writing what it proposes to do in consequence of the statement and to publish an annual review of what it has done. By contrast, Ofcom must only “have regard” to the guidance, which does not itself create any statutory requirements.
This is a new regime and is different in its nature from other established areas of regulations, such as broadcasting. The power in Clause 159 provides a mechanism to provide more certainty, if that is considered necessary, about how the Secretary of State expects Ofcom to carry out its statutory functions. Ofcom will be consulted before guidance is issued, and there are checks on how often it can be issued and revised. The guidance document itself, as I said, does not create any statutory requirements, so Ofcom is required only to “have regard” to it.
This will be an open and transparent way to put forward guidance appropriately with safeguards in place. The independence of the regulator is not at stake here. The clause includes significant limitations on the power, and the guidance cannot fetter Ofcom’s operational independence. We feel that both clauses are appropriate for inclusion in the Bill, so I hope that the noble Lord will withdraw his amendment.
I thank the Minister for that more extended reply. It is a more reassuring response on Clause 159 than we have had before. On Clause 158, the impression I get is that the media literacy power is being used as a smokescreen for the Government telling social media what it should do, indirectly via Ofcom. That seems extraordinary. If the Government were telling the mainstream media what to do in circumstances like this, we would all be up in arms. However, it seems to be accepted as a part of the Bill and that we should trust the Government. The Minister used the phrase “special circumstances”. That is not the phraseology in the clause; it is that “circumstances exist”, and then it goes on to talk about national security and public health. The bar is very low.
I am sure everyone is getting hungry at this time of day, so I will not continue. However, we still have grave doubts about this clause. It seems an extraordinary indirect form of censorship which I hope is never invoked. In the meantime, I beg leave to withdraw my amendment.
My Lords, clearly, there is a limited number of speakers in this debate. We should thank the noble Lord, Lord Moylan, for tabling this amendment because it raises a very interesting point about the transparency—or not—of the Counter Disinformation Unit. Of course, it is subject to an Oral Question tomorrow as well, which I am sure the noble Viscount will be answering.
There is some concern about the transparency of the activities of the Counter Disinformation Unit. In its report, Ministry of Truth, which deals at some length with the activities of the Counter Disinformation Unit, Big Brother Watch says:
“Giving officials an unaccountable hotline to flag lawful speech for removal from the digital public square is a worrying threat to free speech”.
Its complaint is not only about oversight; it is about the activities. Others such as Full Fact have stressed the fact that there is little or no parliamentary scrutiny. For instance, freedom of information requests have been turned down and Written Questions which try to probe what the activities of the Counter Disinformation Unit are have had very little response. As it says, when the Government
“lobby internet companies about content on their platforms … this is a threat to freedom of expression”.
We need proper oversight, so I am interested to hear the Minister’s response.
My Lords, the Government share the view of my noble friend Lord Moylan about the importance of transparency in protecting freedom of expression. I reassure him and other noble Lords that these principles are central to the Government’s operational response to addressing harmful disinformation and attempts artificially to manipulate our information environment.
My noble friend and others made reference to the operational work of the Counter Disinformation Unit, which is not, as the noble Baroness, Lady Fox, said, the responsibility of my department but of the Department for Science, Innovation and Technology. The Government have always been transparent about the work of the unit; for example, recently publishing a factsheet on GOV.UK which sets out, among other things, how the unit works with social media companies.
I reassure my noble friend that there are existing processes governing government engagements with external parties and emphasise to him that the regulatory framework that will be introduced by the Bill serves to increase transparency and accountability in a way that I hope reassures him. Many teams across government regularly meet industry representatives on a variety of issues from farming and food to telecoms and digital infrastructure. These meetings are conducted within well-established transparency processes and frameworks, which apply in exactly the same way to government meetings with social media companies. The Government have been open about the fact that the Counter Disinformation Unit meets social media companies. Indeed, it would be surprising if it did not. For example, at the beginning of the Russian invasion of Ukraine, the Government worked with social media companies in relation to narratives which were being circulated attempting to deny incidents leading to mass casualties, and to encourage the promotion of authoritative sources of information. That work constituted routine meetings and was necessary in confirming the Government’s confidence in the preparedness and ability of platforms to respond to new misinformation and disinformation threats.
To require additional reporting on a sector-by-sector or department-by-department basis beyond the standardised transparency processes, as proposed in my noble friend’s amendment, would be a disproportionate and unnecessary response to what is routine engagement in an area where the Government have no greater powers or influence than in others. They cannot compel companies to alter their terms of service; nor can or do they seek to mandate any action on specific pieces of content.
I reassure the noble Baroness, Lady Fox, that the Counter Disinformation Unit does not monitor individual people, nor has it ever done so; rather, it tracks narratives and trends using publicly available information online to protect public health, public safety and national security. It has never tracked the activity of individuals, and there is a blanket ban on referring any content from journalists or parliamentarians to social media platforms. The Government have always been clear that the Counter Disinformation Unit refers content for consideration only where an assessment has been made that it is likely to breach the platform’s own terms of service. It has no role in deciding what action, if any, to take in response, which is entirely a matter for the platform concerned.
As I said, the Bill will introduce new transparency, accountability and freedom of expression duties for category 1 services which will make the process for any removal or restriction of user-generated content more transparent by requiring category 1 services to set terms of service which are clear, easy for users to understand and consistently enforced. Category 1 services will be prohibited from removing or restricting user-generated content or suspending or banning users where this does not align with those terms of service. Any referrals from government will not, and indeed cannot, supersede these duties in the Bill.
Although I know it will disappoint my noble friend that another of his amendments has not been accepted, I hope I have been able to reassure him about the Government’s role in these processes. As the noble Lord, Lord Clement-Jones, noted, my noble friend Lord Camrose is answering a Question on this in your Lordships’ House tomorrow, further underlining the openness and parliamentary accountability with which we go about this work. I hope my noble friend will, in a similarly post-prandial mood of generosity, suppress his disappointment and feel able to withdraw his amendment.
Before the Minister sits down, I think that it is entirely appropriate for him to say—I have heard it before—“Oh no, nothing was taken down. None of this is believable. No individuals were targeted”. However, that is not the evidence I have seen, and it might well be that I have been shown misinformation. But that is why the Minister has to acknowledge that one of the problems here is that indicated by Full Fact—which, as we know, is often endorsed by government Ministers as fact-checkers. It says that because the Government are avoiding any scrutiny for this unit, it cannot know. It becomes a “he said, she said” situation. I am afraid that, because of the broader context, it would make the Minister’s life easier, and be clearer to the public—who are, after all, worried about this—if he accepted the ideas in the amendment of the noble Lord, Lord Moylan. We would then be clear and it would be out in the open. If the FOIs and so on that have been constantly put forward were answered, would that not clear it up?
I have addressed the points made by the noble Baroness and my noble friend already. She asks the same question again and I can give her the same answer. We are operating openly and transparently here, and the Bill sets out further provisions for transparency and accountability.
My Lords, I see what my noble friend did there, and it was very cunning. He gave us a very worthwhile account of the activities of the Counter Disinformation Unit, a body I had not mentioned at all, as if the Counter Disinformation Unit was the sole locus of this sort of activity. I had not restricted it to that. We know, in fact, that other bodies within government have been involved in undertaking this sort of activity, and on those he has given us no answer at all, because he preferred to answer about one particular unit. He referred also to its standardised transparency processes. I can hardly believe that I am reading out words such as those. The standardised transparency process allows us all to know that encounters take place but still refuses to let us know what actually happens in any particular encounter, even though there is a great public interest in doing so. However, I will not press it any further.
My noble friend, who is genuinely a friend, is in danger of putting himself, at the behest of civil servants and his ministerial colleagues, in some danger. We know what happens in these cases. The Minister stands at the Dispatch Box and says “This has never happened; it never normally happens; it will not happen. Individuals are never spoken of, and actions of this character are never taken”. Then of course, a few weeks or months later, out pour the leaked emails showing that all these things have been happening all the time. The Minister then has to resign in disgrace and it is all very sad. His friends, like myself, rally round and buy him a drink, before we never see him again.
Anyway, I think my noble friend must be very careful that he does not put himself in that position. I think he has come close to doing so this evening, through the assurances he has given your Lordships’ House. Although I do not accept those assurances, I will none the less withdraw the amendment, with the leave of the House.
My Lords, this has been a good debate. It is very hard to see where one would want to take it. If it proves anything, it is that the decision to drop the legal but harmful provisions in the Bill was probably taken for the wrong reasons but was the right decision, since this is where we end up—in an impossible moral quandary which no amount of writing, legalistic or otherwise, will get us out of. This should be a systems Bill, not a content Bill.
My Lords, I start by saying that accurate systems and processes for content moderation are crucial to the workability of this Bill and keeping users safe from harm. Amendment 228 from the noble Lord, Lord Allan of Hallam, seeks to remove the requirement for platforms to treat content as illegal or fraudulent content if reasonable grounds for that inference exist. The noble Lord set out his concerns about platforms over-removing content when assessing illegality.
Under Clause 173(5), platforms will need to have reasonable grounds to determine whether content is illegal or a fraudulent advertisement. Only when a provider has reasonable grounds to infer that said content is illegal or a fraudulent advertisement must it then comply with the relevant requirements set out in the Bill. This would mean removing the content or preventing people from encountering it through risk-based and proportionate systems and processes.
232: Schedule 17, page 247, line 35, at end insert—
“(ba) section (Assessment duties: user empowerment) (assessments related to the adult user empowerment duty set out in section 12(2)), and”
Member’s explanatory statement
This amendment ensures that, during the transitional period when video-sharing platform services continue to be regulated by Part 4B of the Communications Act 2003, providers of such services are not exempt from the new duty in the new clause proposed after Clause 11 in my name to carry out assessments for the purposes of the user empowerment duties in Clause 12(2).
236A: After Clause 194, insert the following new Clause—
“Power to regulate app stores
(1) Subject to the following provisions of this section and section (Power to regulate app stores: supplementary), the Secretary of State may by regulations amend any provision of this Act to make provision for or in connection with the regulation of internet services that are app stores.
(2) Regulations under this section may not be made before OFCOM have published a report under section (OFCOM’s report about use of app stores by children)(report about use of app stores by children).
(3) Regulations under this section may be made only if the Secretary of State, having considered that report, considers that there is a material risk of significant harm to an appreciable number of children presented by either of the following, or by both taken together—
(a) harmful content present on app stores, or
(b) harmful content encountered by means of regulated apps available in app stores.
(4) Before making regulations under this section the Secretary of State must consult—
(a) persons who appear to the Secretary of State to represent providers of app stores,
(b) persons who appear to the Secretary of State to represent the interests of children (generally or with particular reference to online safety matters),
(c) OFCOM,
(d) the Information Commissioner,
(e) the Children’s Commissioner, and
(f) such other persons as the Secretary of State considers appropriate.
(5) In this section and in section (Power to regulate app stores: supplementary)—
“amend” includes repeal and apply (with or without modifications);
“app” includes an app for use on any kind of device, and “app store” is to be read accordingly;
“content that is harmful to children” has the same meaning as in Part 3 (see section 54);
“harmful content” means—
(a) content that is harmful to children,
(b) search content that is harmful to children, and
(c) regulated provider pornographic content;
“regulated app” means an app for a regulated service;
“regulated provider pornographic content” has the same meaning as in Part 5 (see section 70);
“search content” has the same meaning as in Part 3 (see section 51).
(6) In this section and in section (Power to regulate app stores: supplementary) references to children are to children in the United Kingdom.”
Member’s explanatory statement
This amendment provides that the Secretary of State may make regulations amending this Bill so as to bring app stores within its scope. The regulations may not be made until OFCOM have published their report about the use of app stores by children (see the new Clause proposed to be inserted after Clause 147 in my name).
My Lords, we have had some productive discussions on application stores, commonly known as “app stores”, and their role as a gateway for children accessing online services. I am grateful in particular to my noble friend Lady Harding of Winscombe for her detailed scrutiny of this area and the collaborative approach she has taken in relation to it and to her amendments, to which I will turn in a moment. These share the same goals as the amendments tabled in my name in seeking to add evidence-based duties on app stores to protect children.
The amendments in my name will do two things. First, they will establish an evidence base on the use of app stores by children and the role that app stores play in children encountering harmful content online. Secondly, following consideration of this evidence base, the amendments also confer a power on the Secretary of State to bring app stores into scope of the Bill should there be a material risk of significant harm to children on or through them.
On the evidence base, Amendment 272A places a duty on Ofcom to publish a report on the role of app stores in children accessing harmful content on the applications of regulated services. To help build a greater evidence base about the types of harmful content available on and through different kinds of app stores, the report will consider a broad range of these stores, which could include those available on various devices, such as smartphones, gaming devices and smart televisions. The report will also assess the use and effectiveness of age assurance on app stores and consider whether the greater use of age assurance or other measures could protect children further.
Publication of the report must be two to three years after the child safety duties come into force so as not to interfere with the Bill’s implementation timelines. This timing will also enable the report to take into account the impact of the regulatory framework that the Bill establishes.
Amendment 274A is a consequential amendment to include this report in the Bill’s broader confidentiality provisions, meaning that Ofcom will need to exclude confidential matters—for example, commercially sensitive information—from the report’s publication.
Government Amendments 236A, 236B and 237D provide the Secretary of State with a delegated power to bring app stores into the scope of regulation following consideration of Ofcom’s report. The power will allow the Secretary of State to make regulations putting duties on app stores to reduce the risks of harm presented to children from harmful content on or via app stores. The specific requirements in these regulations will be informed by the outcome of the Ofcom report I have mentioned.
As well as setting out the rules for app stores, the regulations may also make provisions regarding the duties and functions of Ofcom in regulating app stores. This may include information-gathering and enforcement powers, as well as any obligations to produce guidance or codes of practice for app store providers.
By making these amendments, our intention is to build a robust evidence base on the potential risks of app stores for children without affecting the Bill’s implementation more broadly. Should it be found that duties are required, the Secretary of State will have the ability to make robust and comprehensive duties, which will provide further layers of protection for children. I beg to move.
My Lords, before speaking to my Amendment 239A, I thank my noble friend the Minister, the Secretary of State and the teams in both the department and Ofcom for their collaborative approach in working to bring forward this group of amendments. I also thank my cosignatories. My noble friend Lady Stowell cannot be in her place tonight but she has been hugely helpful in guiding me through the procedure, as have been the noble Lords, Lord Stevenson, Lord Clement-Jones and Lord Knight, not to mention the noble Baroness, Lady Kidron. It has been a proper cross-House team effort. Even the noble Lord, Lord Allan, who started out quite sceptical, has been extremely helpful in shaping the discussion.
I also thank the NSPCC and Barnardo’s for their invaluable advice and support, as well as Snap and Match—two companies which have been willing to stick their heads above the parapet and challenge suppliers and providers on which they are completely dependent in the shape of the current app store owners, Apple and Google.
I reassure my noble friend the Minister—and everyone else—that I have no intention of dividing the House on my amendment, in case noble Lords were worried. I am simply seeking some reassurance on a number of points where my amendments differ from those tabled by the Government—but, first, I will highlight the similarities.
As my noble friend the Minister has referred to, I am delighted that we have two packages of amendments that in both cases recognise that this was a really significant gap in the Bill as drafted. Ignoring the elements of the ecosystem that sell access to regulated services, decide age guidelines and have the ability to do age assurance was a substantial gap in the framing of the Bill. But we have also recognised together that it is very important that this is an “and” not an “or”—it is not instead of regulating user-to-user services or search but in addition to. It is an additional layer that we can bring to protect children online, and it is very important that we recognise that—and both packages do.
My Lords, I am very grateful for the strength of support and echo the tributes that have been paid to my noble friend Lady Harding—the winsome Baroness from Winscombe —for raising this issue and working with us so collaboratively on it. I am particularly glad that we were able to bring these amendments on Report; as she knows, it involved some speedy work by the Bill team and some speedy drafting by the Office of the Parliamentary Counsel, but I am glad that we were able to do it on Report, so that I can take it off my list of things to do over the summer, which was kindly written for me by the noble Lord, Lord Clement-Jones.
My noble friend’s amendments were laid before the Government’s, so she rightly asked a couple of questions on where they slightly differ. Her amendment seeks to ensure that other websites or online marketplaces that allow users to download apps are also caught by these duties. I reassure her that the Government’s amendments would capture these types of services. We have intentionally not provided detail about what constitutes an app store to ensure that the Bill remains future-proof. I will say a bit more about that in a moment. Regulations made by the Secretary of State under this power will be able to specify thresholds for which app stores are in scope, giving clarity to providers and users about the application of the duties.
On questions of definition, we are intentionally choosing not to define app stores in these amendments. The term is generally understood as meaning a service that makes applications available, which means that the Secretary of State will be able to impose duties on any such service. Any platform that enables apps to be downloaded can therefore be considered an app store for the purpose of this duty, regardless of whether or not it calls itself one. Regulations will clearly set out which providers are in scope of the duties. The ability to set threshold conditions will also ensure that any duties capture only those that pose the greatest risk of children accessing harmful content.
We touched on the long-running debate about content and functionality. We have made our position on that clear; it will be caught by references to content. I am conscious that we will return to this on Wednesday, when we will have a chance to debate it further.
On timing, as I said, I am glad that we were able to bring these amendments forward at this stage. The publication date for Ofcom’s report is to ensure that Ofcom can prioritise the implementation of the child safety duties and put in place the Bill’s vital protections for children before turning to its research on app stores.
That timing also allows the Secretary of State to base his or her decision on commencement on the effectiveness of the existing framework and to use the research of Ofcom’s report to set out a more granular approach to issues such as risk assessment and safety duties. It is necessary to await the findings of Ofcom’s report before those duties are commenced.
To the questions posed by the noble Baroness, Lady Kidron, and others about the consultation for that report by Ofcom, we expect Ofcom to consult widely and with all relevant parties when producing its report. We do not believe that there is a need for a specific list of consultees given Ofcom’s experience and expertise in this area as well as the great experience it will have through its existing enforcement and wider consultation requirements. In addition, the Secretary of State, before making regulations, will be required to consult a range of key parties, such as the Children’s Commissioner and the Information Commissioner, and those who represent the interests of children, as well as providers of app stores. That can include children themselves.
On the questions asked by the noble Lord, Lord Knight, on loot boxes, he is right that this piece of work is being led by my department. We want to see the games industry take the lead in strengthening protections for children and adults to mitigate the risk of harms. We are pursuing that through a DCMS-led technical working group, and we will publish an update on progress in the coming months. I again express my gratitude to my noble friend Lady Harding and other noble Lords who have expressed their support.
(1 year, 4 months ago)
Lords Chamber
My Lords, the government amendments in this group relate to content reporting and complaints procedures. The Bill’s existing duties on each of these topics are a major step forward and will provide users with effective methods of redress. There will now be an enforceable duty on Part 3 services to offer accessible, transparent and easy-to-use complaints procedures. This is an important and significant change from which users and others will benefit directly.
Furthermore, Part 3 services’ complaints procedures will be required to provide for appropriate action to be taken in response to complaints. The duties here will fundamentally alter how complaints systems are operated by services, and providers will have to make sure that their systems are up to scratch. If services do not comply with their duties, they will face strong enforcement measures.
However, we have listened to concerns raised by your Lordships and others, and share the desire to ensure that complaints are handled effectively. That is why we have tabled Amendments 272AA and 274AA, to ensure that the Bill’s provisions in this area are the subject of a report to be published by Ofcom within two years of commencement.
Amendment 272AA places a requirement on Ofcom to undertake a report about Part 3 services reporting and complaints procedures. The report will assess the measures taken or in use by providers of Part 3 services to enable users and others to report content and make complaints. In assessing the content reporting and complaints measures in place, the report must take into account users’ and others’ experiences of those procedures—including how easy to use and clear they are for reporting content and making complaints, and whether providers are taking appropriate and timely action in response.
In this report, Ofcom must provide advice to the Secretary of State about whether she should use her power set out in Amendment 236C to make regulations imposing an alternative dispute resolution duty on category 1 services. Ofcom may also make wider recommendations about how the complaints and user redress provisions can be strengthened, and how users’ experiences with regard to complaints can be improved more broadly. Amendment 274AA is a consequential amendment ensuring that the usual confidentiality provisions apply to matters contained in that report.
These changes will ensure that the effectiveness of the Bill’s content reporting and complaints provisions can be thoroughly assessed by Ofcom two years after the commencement of the provision, providing time for the relevant reporting and complaints procedures to bed in.
Amendment 236C then provides that the Secretary of State will have a power to make regulations to amend the Act in order to impose an alternative dispute resolution duty on providers of category 1 services. This power can be used after the Secretary of State has published a statement in response to Ofcom’s report. This enables the Secretary of State to impose via regulations a duty on the providers of category 1 services to arrange for and engage in an impartial, out-of-court alternative dispute resolution procedure in respect of complaints. This means that, if the Bill’s existing user redress provisions are found to be insufficient, this requirement can quickly be imposed to strengthen the Bill.
This responds directly to concerns which noble Lords raised about cases where users or parents may feel that they have nowhere to turn if they are dissatisfied with a service’s response to their complaint. We believe that the existing provisions will remedy this, but, if they do not, these new requirements will ensure that there is an impartial, alternative dispute resolution procedure which will work towards the effective resolution of the complaint between the service and the complainant.
At the same time, it will avoid creating a single ombudsman, person or body which may be overwhelmed either through the volume of complaints from multiple services or by the complexity of applying such disparate services’ varying terms of service. Instead, if required, this power will put the onus on the provider to arrange for and engage in an impartial dispute resolution procedure.
Amendment 237D requires that, if regulations are made requiring category 1 services to offer an alternative dispute resolution procedure, such regulation must be subject to the affirmative parliamentary procedure. This ensures that Parliament will continue to have oversight of this process.
I hope that noble Lords are reassured that the Bill not only requires services to provide users and others with effective forms of redress but that these further amendments will ensure that the Bill’s provisions in this area will be thoroughly reviewed and that action can be taken quickly if it is needed. I beg to move.
My Lords, I am grateful to hear what the Minister has just announced. The scheme that was originally prefigured in the pre-legislative scrutiny report has now got some chance of being delivered. I think the process and procedures are quite appropriate; it does need review and thought. There needs to be account taken of practice on the ground, how people have found the new system is working, and whether or not there are gaps that can be filled this way. I give my full support to the proposal, and I am very glad to see it.
Having got to the Dispatch Box early, I will just appeal to our small but very important group. We are on the last day on Report. We are reaching a number of issues where lots of debate has taken place in Committee. I think it would be quite a nice surprise for us all if we were to get through this quickly. The only way to do that is by restricting our contributions.
My Lords, I declare an interest as chair of Trust Alliance Group, which operates the energy and communications ombudsman schemes, so I have a particular interest in the operation of these ADR schemes. I thank the Minister for the flexibility that he has shown in the provision about the report by Ofcom and in having backstop powers for the Secretary of State to introduce such a scheme.
Of course, I understand that the noble Baroness, Lady Newlove, and the UK Safer Internet Centre are very disappointed that this is not going to come into effect immediately, but there are advantages in not setting out the scheme at this very early point before we know what some of the issues arising are. I believe that Ofcom will definitely want to institute such a scheme, but it may be that, in the initial stages, working out the exact architecture is going to be necessary. Of course, I would have preferred to have a mandated scheme, in the sense that the report will look not at the “whether” but the “how”, but I believe that at the end of the day it will be absolutely obvious that there needs to be such an ADR scheme in order to provide the kind of redress the noble Baroness, Lady Harding, was talking about.
I also agree with the noble Baroness, Lady Morgan, that the kinds of complaints that this would cover should include fraudulent adverts. I very much hope that the Minister will be able to answer the questions that both noble Baronesses asked. As my noble friend said, will he reassure us that the department and Ofcom will not take their foot off the pedal, whatever the Bill may say?
I am grateful to noble Lords for their warm support and for heeding the advice of the noble Lord, Lord Stevenson, on brevity. We must finish our Report today. The noble Lord, Lord Allan, is right to mention my noble friend Lady Newlove, who I have spoken to about this issue, as well as the noble Lord, Lord Russell of Liverpool, who has raised some questions here.
Alongside the strong duties on services to offer content reporting and complaints procedures, our amendments will ensure that the effectiveness of these provisions can be reviewed after they have had sufficient time to bed in. The noble Lord, Lord Allan, asked about timing in more detail. Ofcom must publish the report within the two-year period beginning on the day on which the provision comes into force. That will allow time for the regime to bed in before the report takes place, ensuring that its conclusions are informed by how the procedures work in practice. If necessary, our amendments will allow the Secretary of State to impose via regulations a duty on the providers of category 1 services to arrange for and engage in an impartial, out-of-court alternative dispute resolution procedure, providing the further strengthening which I outlined in opening.
I can reassure my noble friend Lady Morgan of Cotes that reporting mechanisms to facilitate providers’ removal of fraudulent advertisements are exactly the kinds of issues that Ofcom’s codes of practice will cover, subject to consultation and due process. As companies have duties to remove fraudulent advertising once they are alerted to it, we expect platforms will need the necessary systems and processes in place to enable users to report fraudulent adverts so that providers can remove them.
My noble friend Lady Harding asked the question which was posed a lot in Committee about where one goes if all avenues are exhausted. We have added further avenues for people to seek redress if they do not get it but, as I said in Committee, the changes that we are bringing in through this Bill will mark a significant change for people. Rather than focusing on the even-further-diminished possibility of their not having their complaints adequately addressed through the additional amendments we are bringing today, I hope she will see the provisions in the Bill and in these amendments as bringing in the change we all want to see to improve users’ safety online.
My Lords, Amendments 238A and 238D seek to change the parliamentary process for laying—oh, I am skipping ahead with final day of Report enthusiasm.
As noble Lords know, companies will fund the costs of Ofcom’s online safety functions through annual fees. This means that the regime which the Bill ushers in will be cost neutral to the taxpayer. Once the fee regime is operational, regulated providers with revenue at or above a set threshold will be required to notify Ofcom and to pay a proportionate fee. Ofcom will calculate fees with reference to the provider’s qualifying worldwide revenue.
The Delegated Powers and Regulatory Reform Committee of your Lordships’ House has made two recommendations relating to the fee regime which we have accepted, and the amendments we are discussing in this group reflect this. In addition, we are making a further change to definitions to ensure that Ofcom can collect proportionate fees.
A number of the amendments in my name relate to qualifying worldwide revenue. Presently, the Bill outlines that this should be defined in a published statement laid before Parliament. Your Lordships’ committee advised that it should be defined through regulations subject to the affirmative procedure. We have agreed with this and are proposing changes to Clause 76 so that Ofcom can make provisions about qualifying worldwide revenue by regulations which, as per the committee’s recommendations, will be subject to the affirmative procedure.
Secondly, the committee recommended that we change the method by which the revenue threshold is defined. Presently, as set out in the Bill, it is set by the Secretary of State in a published statement laid before Parliament. The committee recommended that the threshold be set through regulations subject to the negative procedure and we are amending Clause 77 to make the recommended change.
Other amendments seek to make a further change to enable Ofcom to collect proportionate fees from providers. A provider of a regulated service the qualifying worldwide revenue of which is equal to, or greater than, the financial threshold will be required to notify Ofcom and pay an annual fee, calculated by reference to its qualifying worldwide revenue. Currently, this means that that fee calculation can be based only on the revenue of the regulated provider. The structure of some technology companies, however, means that how they accrue revenue is not always straightforward. The entity which meets the definition of a provider may therefore not be the entity which generates revenue referable to the regulated service.
Regulations to be made by Ofcom about the qualifying worldwide revenue will therefore be able to provide that the revenue accruing to certain entities in the same group as a provider of a regulated service can be taken into account for the purposes of determining qualifying worldwide revenue. This will enable Ofcom, when making such regulations, to make provisions, if necessary, to account for instances where a provider has a complex group structure; for example, where the regulated provider might accrue only a portion of the revenue referable to the regulated service, the rest of which might be accrued by other entities in the group’s structure. These amendments to Clause 76 address these issues by allowing Ofcom to make regulations which provide that the revenue from certain other entities within the provider’s group structure can be taken into account. I beg to move.
My Lords, we have not talked much about fees in our consideration of the Bill, and I will not talk much about them today, but there are some important questions. We should not skip too lightly over the fact that we will be levying revenues from online providers. That might have a significant impact on the markets. I have some specific questions about this proposed worldwide revenue method but I welcome these amendments and that we will now be getting a better procedure. This will also allow the Minister to say, “All these detailed points can be addressed when these instruments come before Parliament”. That is a good development. However, there are three questions that are worth putting on the record now so that we have time to think about them.
First, what consideration will be given to the impact on services that do not follow a classic revenue model but instead rely on donations and other sorts of support? I know that we will come back to this question in a later group, but there are some very large internet service providers that do not follow the classic advertising-funded model, relying instead on foundations and other sources of support. They will have significant questions about what we would judge their qualifying worldwide revenue to be, given that they operate under these very different models.
The second question concerns the impact on services that may have a very large footprint outside the UK, and significant worldwide revenues, but which do very little business within the UK. The amendment that the Minister has tabled about group revenues is also relevant here. You can imagine an entity which may be part of a very large worldwide group making very significant revenues around the world. It has a relatively small subsidiary that is offering a service in the UK, with relatively low revenues. There are some important questions there around the potential impact of the fees on decision-making within that group. We have discussed how we do not want to end up with less choice for consumers of services in the UK. There is an interesting question there as to whether getting the fee level wrong might lead to worldwide entities saying, “If you’re going to ask me to pay a fee based on my qualifying worldwide revenue, the UK market is just not worth it”. That may be particularly true if, for example, the European Union and other markets are also levying a fee. You can see a rational business choice of, “We’re happy to pay the fee to the EU but not to Ofcom if it is levied at a rate that is disproportionate to the business that we do here”.
The third and very topical question is about the Government’s thinking about services with declining revenues but whose safety needs are not reducing and may even be increasing. I hope as I say this that people have Twitter in mind, which has very publicly told us that its revenue is going down significantly. It has also very publicly fired most of its trust and safety staff. You can imagine a model within which, because its revenue is declining, it is paying less to Ofcom precisely when Ofcom needs to do more supervision of it.
I hope that we can get some clarity around the Government’s intentions in these circumstances. I have referenced three areas where the worldwide qualifying revenue calculation may go a little awry. The first is where the revenue is not classic commercial income but comes from other sources. The second is where the footprint in the UK is very small but it is otherwise a large global company which we might worry will withdraw from the market. The third, and perhaps most important, is what the Government’s intention is where a company’s revenue is declining and it is managing its platform less well while its need for Ofcom supervision increases, and what we would expect to happen to the fee level in those circumstances.
My Lords, there is very little to add to that. These are important questions. I simply was struck by the thought that the amount of work, effort and thought that has gone into this should not be kept within this Bill. I wonder whether the noble Lord has thought of offering his services to His Majesty’s Treasury, which has difficulty in raising tax from these companies. It would be nice to see that problem resolved.
I am looking forward to returning to arts and heritage; I will leave that to my noble friend Lady Penn.
The noble Lord, Lord Allan, asked some good questions. He is right: the provisions and the parliamentary scrutiny allow for the flexibility for all these things to be looked at and scrutinised in the way that he set out. I stress that the fee regime is designed to be fair to industry; that is central to the approach we have taken. The Bill stipulates that Ofcom must charge only proportionate and justifiable fees to industry. The provisions that Ofcom can make via regulation about the qualifying worldwide revenue aim to ensure that fees are truly representative of the revenue relating to the regulated service and that they will encourage financial transparency. They also aim to aid companies with complex structures which would otherwise struggle to segregate revenues attributable to the provider and its connected entities.
The revenue of the group undertaking can be considered in scope of a provider’s qualifying worldwide revenue if the entity was a member of the provider’s group during any part of the qualifying period and the entity receives during the qualifying period any amount referable to a regulated service. The regulations provide Ofcom with a degree of flexibility as to whether or not to make such provisions, because Ofcom will aim to keep the definition of qualifying worldwide revenue simple.
I am grateful for noble Lords’ support for the amendments and believe that they will help Ofcom and the Government to structure a fair and transparent fee regime which charges proportionate fees to fund the cost of the regulatory regime that the Bill brings in.
My Lords, as I was eagerly anticipating, government Amendments 238A and 238D seek to change the parliamentary process for laying the first regulations specifying the category 1 threshold conditions from the negative to the affirmative procedure. I am pleased to bring forward this change in response to the recommendation of your Lordships’ Delegated Powers and Regulatory Reform Committee.
The change will ensure that there are adequate levels of parliamentary scrutiny of the first regulations specifying the category 1 threshold conditions. This is appropriate given that the categorisation of category 1 services will lead to the most substantial duties on the largest and most influential services. As noble Lords are aware, these include the duties on user empowerment, user identity verification, journalistic and news publisher content, content of democratic importance, and fraudulent advertising.
Category 2A services will have only additional transparency and fraudulent advertising duties, and category 2B services will be subject only to additional transparency reporting duties. The burden of these duties is significantly less than the additional category 1 duties, and we have therefore retained the use of the negative resolution procedure for these regulations, as they require less parliamentary scrutiny.
Future changes to the category 1 threshold conditions will also use the negative procedure. This will ensure that the regime remains agile in responding to change, which I know was of particular concern to noble Lords when we debated the categorisation group in Committee. Keeping the negative procedure for such subsequent uses will avoid the risk of future changes being subject to delays because of parliamentary scheduling. I beg to move.
My Lords, I shall speak to Amendment 245. I would like to thank my noble friend the Minister, and also the Minister on leave, for the conversations that I have had with them about this amendment and related issues. As we have already heard, the platform categorisation is extremely important. So far, much of it is unknown, including which sites are actually going to be in which categories. For example, we have not yet seen any proposed secondary regulations. As my noble friend has just outlined, special duties apply, especially for those sites within category 1—user empowerment in particular, but also other duties relating to content and fraudulent advertisements.
Clause 85 and Schedule 11 set out the thresholds for determining which sites will be in category 1, category 2A or category 2B. I am very mindful of the exhortation of the noble Lord, Lord Stevenson, about being brief, but it is amazing how much you have to say about one word to explain this amendment. This amendment proposes to change an “and” to an “or” in relation to determining which sites would fall within category 1. It would move from a test of size “and” functionality to a test of size “or” functionality. This would give Ofcom more flexibility to decide which platforms really need category 1 designation. Category 1 should not be decided just on size; it should also be possible to determine it on the basis of functionality.
Functionality is defined in the Bill in Clause 208. We will get to those amendments shortly, but there is no doubt from what the Government have already conceded, or agreed with those of us who have been campaigning passionately on the Bill for a number of years, that functionality can make a platform harmful. It is perfectly possible to have small platforms that both carry highly harmful content and themselves become harmful in the way that they are designed. We have heard many examples and I will not detain the House with them, but I draw attention to two particular sites which capture how broad this is. The perpetrators of offline hate crimes are often linked to these small platforms. For example, the perpetrator of the 2018 Tree of Life synagogue mass shooting had an online presence on the right-wing extremist social network Gab. In the UK, Jake Davison, the self-proclaimed incel who killed five people in Plymouth in 2021, frequented smaller incel forums after he was banned from Reddit in the days leading up to the mass shooting.
I also want to share with noble Lords an email that I received just this week from a family who had been to see their Member of Parliament, Matt Rodda MP, and also the noble Baroness, Lady Kidron, who I know is very regretful that she cannot be here today. I thank Victoria and Jean Eustace for sharing the story of their sister and daughter. Victoria wrote: “I am writing to you regarding the Online Safety Bill, as my family and I are concerned it will not sufficiently protect vulnerable adults from harm. My sister, Zoe Lyalle, killed herself on 26 May 2020, having been pointed towards a method using an online forum called Sanctioned Suicide. Zoe was 18 years old at the time of her death and as such technically an adult, but she was autistic, so she was emotionally less mature than many 18 year-olds. She found it difficult to critically analyse written content”. She says that “The forum in question is not large and states on its face that it does not encourage suicide, although its content does just that”. I was even more shocked by the next part: “Since Zoe’s death, we have accessed her email account. The forum continues to email Zoe, providing her with updates on content she may have missed while away from the site, as well as requesting donations. One recent email included a link to a thread on the forum containing tips on how best to use the precise method that Zoe had employed”.
In her note to me, the Minister on leave said that she wanted to catch some of the platforms we are talking about with outsized influence. In my reply, I said that those sites on which people are encouraged to take their own lives or become radicalised and therefore take the harms they are seeing online into the real world undoubtedly exercise influence and should be tackled.
It is also perfectly possible for us to have large but safe platforms. I know that my noble friend Lord Moylan may want to discuss this in relation to sites that he has talked about already on this Bill. The risk of the current drafting is a flight of users from these large platforms, newly categorised as category 1, to the small, non-category 1 platforms. What if a platform becomes extremely harmful very quickly? How will it be recategorised speedily but fairly, and with parliamentary oversight?
The Government have run a variety of arguments as to why the “and” in the Bill should not become an “or”. They say that it creates legal uncertainty. Every Bill creates legal uncertainty; that is why we have an army of extremely highly paid lawyers, not just in this country but around the world. They say that what we are talking about is broader than illegal content or content related to children’s safety, but they have already accepted an earlier amendment on safety by design and, in subsections (10) to (12) of Clause 12, that specific extra protections should be available for content related to
“suicide or an act of deliberate self-injury, or … an eating disorder or behaviours associated with an eating disorder”
or abusive content relating to race, religion, sex, sexual orientation, disability or gender reassignment and that:
“Content is within this subsection if it incites hatred against people”.
The Government have already breached some of their own limits on content that is not just illegal or relates to child safety duties. In fact, they have agreed that that content should have enhanced triple-shield protection.
The Government have also said that they want to avoid burdens on small but low-harm platforms. I agree with that, but with an “or” it would be perfectly possible for Ofcom to decide by looking at size or functionality and to exclude those smaller platforms that do not present the harm we all care about. The Minister may also offer me a review of categorisation; however, it is a review of the tiers of categorisation and not the sites within the categories, which I think many of us will have views on over the years.
I come to what we should do on this final day of Report. I am very thankful to those who have had many conversations on this, but there is a fundamental difference of opinion in this House on these matters. We will talk about functionality shortly and I am mindful of the pre-legislative scrutiny committee’s recommendation that this legislation should adopt
“a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model”.
There should be other factors. Ofcom should have the ability to decide whether it takes one factor or another, and not have a series of all the thresholds to be passed, to give it the maximum flexibility. I will listen very carefully to what my noble friend the Minister and other noble Lords say, but at this moment I intend to test the opinion of the House on this amendment.
My Lords, I have good news and bad news for the Minister. The good news is that we have no problem with his amendments. The bad news, for him, is that we strongly support Amendment 245 from the noble Baroness, Lady Morgan of Cotes, which, as others have said, we think is a no-brainer.
The beauty of the simple amendment has been demonstrated; it just changes the single word “and” to “or”. It is of course right to give Ofcom leeway—or flexibility, as the noble Baroness, Lady Finlay, described it—in the categorisation and to bring providers into the safety regime. What the noble Baroness, Lady Morgan, said about the smaller platforms, the breadcrumbing relating to the Jake Davison case and the functionality around bombarding Zoe Lyalle with those emails told the story that we needed to hear.
As it stands, the Bill requires Ofcom to always be mindful of size. We need to be more nuanced. From listening to the noble Lord, Lord Allan of Hallam—with his, as ever, more detailed analysis of how things work in practice—my concern is that in the end, if it is all about size, Ofcom will end up having to bring a much larger number of services into scope through the size threshold in order to cover all the platforms that it is worried about. If we could give flexibility around size or functionality, that would make the job considerably easier.
We on this side think categorisation should happen with a proportionate, risk-based approach. We think the flexibility should be there; the Minister is reasonable—come on, what’s not to like?
My Lords, I shall explain why the simple change of one word is not as simple as it may at first seem. My noble friend’s Amendment 245 seeks to amend the rule that a service must meet both a number-of-users threshold and a functionality threshold to be designated as category 1 or 2B. It would instead allow the Secretary of State by regulation to require a service to have to meet only one or other of the two requirements. That would mean that smaller user-to-user services could be so categorised by meeting only a functionality threshold.
In practical terms, that would open up the possibility of a future Secretary of State setting only a threshold condition about the number of users, or alternatively about functionality, in isolation. That would create the risk that services with a high number of users but limited functionality would be caught in scope of category 1. That could be of particular concern to large websites that operate with limited functionality for public interest reasons, and I am sure my noble friend Lord Moylan can think of one that fits that bill. On the other hand, it could capture a vast array of low-risk smaller services merely because they have a specific functionality—for instance, local community fora that have livestreaming capabilities. So we share the concerns of the noble Lord, Lord Allan, but come at it from a different perspective from him.
My noble friend Lady Morgan mentioned the speed of designation. The Bill’s approach to the pace of designation for the category 1 watchlist and register is flexible—deliberately so, to allow Ofcom to act as quickly as is proportionate to each emerging service. Ofcom will have a duty proactively to identify, monitor and evaluate emerging services, which will afford it early visibility when a service is approaching the category 1 threshold. It will therefore be ready to act accordingly to add services to the register should the need arise.
The approach set out in my noble friend’s Amendment 245 would not allow the Secretary of State to designate individual services as category 1 if they met one of the threshold conditions. Services can be designated as category 1 only if they meet all the relevant threshold conditions set out in the regulations made by the Secretary of State. That is the case regardless of whether the regulations set out one condition or a combination of several conditions.
The noble Baroness, Lady Finlay, suggested that the amendment would assist Ofcom in its work. Ofcom itself has raised concerns that amendments such as this—to introduce greater flexibility—could increase the risk of legal challenges to categorisation. My noble friend Lady Morgan was part of the army of lawyers before she came to Parliament, and I am conscious that the noble Lord, Lord Clement-Jones, is one as well. I hope they will heed the words of the regulator; this is not a risk that noble Lords should take lightly.
I will say more clearly that small companies can pose significant harm to users—I have said it before and I am happy to say it again—which is why there is no exemption for small companies. The very sad examples that my noble friend Lady Morgan gave in her speech related to illegal activity. All services, regardless of size, will be required to take action against illegal content, and to protect children if they are likely to be accessed by children. This is a proportionate regime that seeks to protect small but excellent platforms from overbearing regulation. However, I want to be clear that a small platform that is a font of illegal content cannot use its size as an excuse for not dealing with it.
Category 1 services are those services that have a major influence over our public discourse online. Again, I want to be clear that designation as a category 1 service is not based only on size. The thresholds for category 1 services will be based on the functionalities of a service as well as the size of the user base. The thresholds can also incorporate other characteristics that the Secretary of State deems relevant, which could include factors such as a service’s business model or its governance. Crucially, Ofcom has been clear that it will prioritise engagement with high-risk or high-impact services, irrespective of their categorisation, to understand their existing safety systems and how they plan to improve them.
I associate myself with the comments of my noble friend Lady Stowell on this whole issue, and I refer to my register of interests. One question we should be asking, which goes wider than this Bill, is: who regulates the regulators? It is a standard problem in political science and often known as principal agent theory, whereby the principals delegate powers to the agents for many reasons, and you see agency slack, whereby they develop their own powers beyond what was perhaps originally intended. For that reason, I completely associate myself with my noble friend Lady Stowell’s comments—and not because she chairs a committee on which I sit and I hope to get a favour of more speaking time on that committee. It is simply because, on its merit, we should all be asking who regulates the regulators and making sure that they are accountable. We are asking the same question of the Secretary of State, and quite rightly, the Secretary of State should be accountable for any measures they propose, but we should also be asking it of regulators.
My Lords, I have always felt rather sorry for the first Viscount Addison, because what we refer to as the Salisbury convention is really the Salisbury-Addison convention. So while I am grateful to the noble Lord, Lord Stevenson, for his flattering speech, I shall insist on calling it the “Parkinson-Stevenson rule”, not least in the hope that that mouthful will encourage people to forget its name more swiftly.
I am grateful to the noble Lord for his attention to this matter and the useful discussions that we have had. His Amendment 239 would go beyond the existing legislative process for the delegated powers in the Bill by providing for parliamentary committees to be, in effect, inserted into the secondary legislative process. The delegated powers in the Bill are crucial for implementing the regime effectively and for ensuring that it keeps pace with changes in technology. Regulation-making powers are an established part of our legislative practice, and it would not be appropriate to deviate from existing processes.
However, I agree that ongoing parliamentary scrutiny of the regime will be crucial in helping to provide noble Lords and Members in another place with the reassurance that the implementation of the regime is as we intended. As the noble Lord noted, the establishment of the Science, Innovation and Technology Select Committee in another place means that there is a new dedicated committee looking at this important area of public policy. That provides an opportunity for cross-party scrutiny of the online safety regime and broader issues. While it will be, as he said, for respective committees to decide their priorities, we welcome any focus on online safety, and certainly welcome committees in both Houses co-operating effectively on this matter. I am certain that the Communications and Digital Committee of your Lordships’ House will continue to play a vital role in the scrutiny of the online safety regime.
We would fully expect these committees to look closely at the codes of practice, the uses of regulation-making powers and the powers of direction in a way that allows them to focus on key issues of interest. To support that, I can commit that the Government will do two things. First, where the Bill places a consultation requirement on the Government, we will ensure that the relevant committees have every chance to play a part in that consultation by informing them that the process is open. Secondly, while we do not wish to see the implementation process delayed, we will, where possible, share draft statutory instruments directly with the relevant committees ahead of the formal laying process. The timelines for this will be decided on a case-by-case basis, considering what is appropriate and reasonably practicable. It will be for the committees to decide how they wish to engage with the information that we provide but, to avoid delaying implementation, this will not create an additional approval process. I am grateful to my noble friend Lady Stowell of Beeston for her words of caution and wisdom on that point, as both chairman of your Lordships’ committee and a former Leader of your Lordships’ House.
I hope that the noble Lord will be satisfied by what I have set out and will be willing to withdraw his amendment so that our rule might enter into constitutional history more swiftly.
I am very grateful to everyone who has contributed to the debate, despite my injunction that no one other than the key persons was to speak—but it was nice to hear views around the House in support of this proposal, albeit with caution. The noble Baroness, Lady Stowell, was right to be clear that we have to be focused on where we are going on this; there is quite a lot at stake here, and it is a much bigger issue than simply this Bill and these particular issues. Her willingness to take this on in a wider context is most welcome, and I look forward to hearing how that goes. I am also very grateful for the unexpected but very welcome support from the noble Baroness, Lady Fox. It was nice that she finally agreed to meet on one piece of territory, if we cannot agree on some of the others. The noble Lord, Lord Kamall, is right to say that we need to pick up the much broader question of who regulates those who regulate us. This is not the answer, but it certainly takes us a step in that direction.
I was grateful to the Minister for suggesting that the “Parkinson rule” could take flight, but I shall continue to call it by a single name—double-barrelled names are not appropriate here. We will see the results of that in the consultation; the things that already have to be consulted about will be offered to the committees, and it is up to them to respond on that, but it is a very good start. The idea that drafts and issues that are being prepared for future regulation will be shown ahead of the formal process is exactly where I wanted to be on this, so I am very grateful for that. I withdraw the amendment.
My Lords, if I may, I shall speak very briefly, in the absence of my noble friend Lady Kidron, and because I am one of the signatories of this amendment, alongside the noble Lord, Lord Stevenson, and the right reverend Prelate the Bishop of Oxford. Amendment 240, together with a number of amendments that we will be debating today, turns on a fundamental issue that we have not yet resolved.
I came in this morning being told that we would be voting on this amendment and that other amendments later today would be consequential—I am a novice at this level of parliamentary procedure, so forgive me if I have got myself confused during the day—but I now understand that my noble friend considers this amendment to be consequential but, strangely, the amendments right at the end of the day are not. I just wanted to flag to the House that they all cover the same fundamental issue of whether harms can be unrelated to content, whether the harms of the online world can be to do with functionality—the systems and processes that drive the addiction that causes so much harm to our children.
It is a fundamental disagreement. I pay tribute to the amount of time the department, the Secretary of State and my noble friend have spent on it, but it is not yet resolved and, although I understand that I should now say that I beg leave to move the amendment formally, I just wanted to mark, with apologies, the necessity, most likely, of having to bring the same issue back to vote on later today.
My Lords, His Majesty’s Government indeed agree that this is consequential on the other amendments, including Amendment 35, which the noble Baroness, Lady Kidron, previously moved on Report. We disagreed with them, but we lost that vote; this is consequential, and we will not force a Division on it.
We will have further opportunity to debate the fundamental issues that lie behind it, to which my noble friend Lady Harding just referred. The noble Baroness, Lady Kidron, tabled some of the amendments on which we may divide later after defeating the Government the other day, so we cannot treat them as consequential. We look forward to debating them; I will urge noble Lords not to vote for them, but we will have opportunity to discuss them later.
My Lords, these amendments are concerned with Ofcom’s powers under Clause 111 to issue notices to deal with terrorism content and child sexual exploitation and abuse content.
I acknowledge the concerns which have been aired about how these powers work with encrypted services. I want to make it clear that the Bill does not require companies to break or weaken encryption, and we have built in strong safeguards to ensure that users’ privacy is protected. Encryption plays an important role online, and the UK supports its responsible use. I also want to make it clear that we are not introducing a blanket requirement for companies to monitor all content for all harms, at all times. That would not be proportionate.
However, given the serious risk of harm to children from sexual abuse and exploitation online, the regulator must have appropriate, tightly targeted powers to compel companies to take the most effective action to tackle such reprehensible illegal activity which is taking place on their services. We must ask companies to do all that is technically feasible to keep children safe, subject to stringent legal safeguards.
The powers in the Bill are predicated on risk assessments. If companies are managing the risks on their platform appropriately, Ofcom will not need to use its powers. As a last resort, however, where there is clear evidence of child sexual abuse taking place on a platform, Ofcom will be able to direct companies either to use, or to make best efforts to develop or source, accredited and accurate technology to identify and remove this illegal content. To be clear, these powers will not enable Ofcom or our law enforcement agencies to obtain any automatic access to the content detected. It is simply a matter of making private companies take effective action to prevent child sexual abuse on their services.
Ofcom must consider a wide range of matters when deciding whether a notice is necessary and proportionate, including the impacts on privacy and freedom of expression of using a particular technology on a particular service. Ofcom will only be able to require the use of technology accredited as highly accurate in detecting illegal child sexual abuse or terrorism content, greatly reducing the risk that content is wrongly identified.
In addition to these safeguards, as a public body, Ofcom is bound through the Human Rights Act 1998 by the European Convention on Human Rights, including Articles 8 and 10. Ofcom has an obligation not to act in a way which unduly interferes with the right to privacy and freedom of expression when carrying out its duties, for which it is held to account.
If appropriate technology does not exist which meets these requirements, Ofcom cannot require its use. That is why the powers include the ability for Ofcom to require companies to make best endeavours to develop or source a solution. It is right that we can require technology companies to use their considerable resources and expertise to develop the best possible protections for children in encrypted environments.
Despite the breadth of the existing safeguards, we recognise that concerns remain about these powers, and we have listened to the points that noble Lords raised in Committee about privacy and technical feasibility. That is why we are introducing additional safeguards. I am grateful for the constructive engagement I have had with noble Lords across your Lordships’ House on this issue, and I hope that the government amendments alleviate their concerns.
I turn first to our Amendments 250B, 250C, 250D, 255A, 256A, 257A, 257B, 257C and 258A, which require that Ofcom obtain a skilled person’s report before issuing a warning notice and exercising its powers under Clause 111. This independent expert scrutiny will supplement Ofcom’s own expertise to ensure that it has a full understanding of relevant technical issues to inform its decision-making. That will include issues specific to the service in question, such as its design and relevant factors relating to privacy.
I am grateful to noble Lords for their further scrutiny of this important but complex area, and for the engagement that we have had in the days running up to it as well. We know how child sexual exploitation and abuse offenders sadly exploit private channels, and the great danger that this poses, and we know how crucial these channels are for secure communication. That is why, where necessary and proportionate, and where all the safeguards are met, it is right that Ofcom can require companies to take all technically feasible measures to remove this vile and illegal content.
The government amendments in this group will go further to ensure that a notice is well informed and targeted and does not unduly restrict users’ rights. Privacy and safety are not mutually exclusive—we can and must have both. The safety of our children depends on it.
I make it clear again that the Bill does not require companies to break or weaken end-to-end encryption on their services. Ofcom can require the use of technology on an end-to-end encrypted service only when it is technically feasible and has been assessed as meeting minimum standards of accuracy. When deciding whether to issue a notice, Ofcom will engage in continual dialogue with the company and identify reasonable, technically feasible solutions to the issues identified. As I said in opening, it is right that we require technology companies to use their considerable resources and expertise to develop the best possible protections to keep children safe in encrypted environments. They are well placed to innovate to find solutions that protect both the privacy of users and the safety of children.
Just to be clear, am I right to understand my noble friend as saying that there is currently no technology that would be technically acceptable for tech companies to do what is being asked of them? Did he say that tech companies should be looking to develop the technology to do what may be required of them but that it is not currently available to them?
For clarification, if the answer to that is that the technology does not exist—which I believe is correct, although there are various snake oil salespeople out there claiming that it does, as the noble Baroness, Lady Fox of Buckley, said—my noble friend seems to be saying that the providers and services should develop it. This seems rather circular, as the Bill says that they must adopt an approved technology, which suggests a technology that has been imposed on them. What if they cannot and still get such a notice? Is it possible that these powers will never be capable of being used, especially if they do not co-operate?
To answer my noble friend Lady Stowell first, it depends on the type of service. It is difficult to give a short answer that covers the range of services that we want to ensure are covered here, but we are seeking to keep this and all other parts of the Bill technology neutral so that, as services develop, technology changes and criminals, unfortunately, seek to exploit that, technology companies can continue to innovate to keep children safe while protecting the privacy of their users. That is a long-winded answer to my noble friend’s short question, but necessarily so. Ofcom will need to make its assessments on a case-by-case basis and can require a company to use its best endeavours to innovate if no effective and accurate technology is currently available.
While I am directing my remarks towards my noble friend, I will also answer a question she raised earlier on general monitoring. General monitoring is not a legally defined concept in UK law; it is a term in European Union law that refers to the generalised monitoring of user activity online, although its parameters are not clearly defined. The use of automated technologies is already fundamental to how many companies protect their users from the most abhorrent harms, including child sexual abuse. It is therefore important that we empower Ofcom to require the use of such technology where it is necessary and proportionate and ensure that the use of these tools is transparent and properly regulated, with clear and appropriate safeguards in place for users’ rights. The UK’s existing intermediary liability regime remains in place.
Amendment 255 from my noble friend Lord Moylan seeks to prevent Ofcom imposing any requirement in a notice that would weaken or remove end-to-end encryption. He is right that end-to-end encryption should not be weakened or removed. The powers in the Bill will not do that. These powers are underpinned by proportionality and technical feasibility; if it is not proportionate or technically feasible for companies to identify child sexual exploitation and abuse content on their platform while upholding users’ right to privacy, Ofcom cannot require it.
I agree with my noble friend and the noble Baroness, Lady Fox, that encryption is a very important and popular feature today. However, with technology evolving at a rapid rate, we cannot accept amendments that would risk this legislation quickly becoming out of date. Naming encryption in the Bill would risk that happening. We firmly believe that the best approach is to focus on strong safeguards for upholding users’ rights and ensuring that measures are proportionate to the specific situation, rather than on general features such as encryption.
The Bill already requires Ofcom to consider the risk that technology could result in a breach of any statutory provision or rule of law concerning privacy and whether any alternative measures would significantly reduce the amount of illegal content on a service. As I have said in previous debates, Ofcom is also bound by the Human Rights Act not to act inconsistently with users’ rights.
Will the Minister write to noble Lords who have been here in Committee and on Report in response to the fact that it is not just encryption companies saying that the demands of this clause will lead to the breaching of encryption, even though the Minister and the Government keep saying that it will not? As I have indicated, a wide range of scientists and technologists are saying that, whatever is said, demanding that Ofcom insists that technology notices are used in this way will inadvertently lead to the breaking of encryption. It would be useful if the Government at least explained scientifically and technologically why those experts are wrong and they are right.
I am very happy to put in writing what I have said from the Dispatch Box. The noble Baroness may find that it is the same, but I will happily set it out in further detail.
I should make it clear that the Bill does not permit law enforcement agencies to access information held on platforms, including access to private channels. The National Crime Agency will be responsible for receiving reports from in-scope services via secure transmission, processing these reports and, where appropriate, disseminating them to other UK law enforcement bodies and our international counterparts. The National Crime Agency will process only information provided to it by the company; where it determines that the content is child sexual abuse content and meets the threshold for criminality, it can request further information from the company using existing powers.
I am glad to hear that my noble friend Lord Moylan does not intend to divide on his amendment. The restrictions it sets out are not ones we should impose on the Bill.
Amendments 256, 257 and 259 in the name of the noble Lord, Lord Stevenson of Balmacara, would require a notice to be approved by a judicial commissioner appointed under the Investigatory Powers Act 2016 and would remove Ofcom’s power to require companies to make best endeavours to develop or source new technology to address child sexual exploitation and abuse content.
I appreciate the tone of the Minister’s comments very much, but they are not entirely reassuring me. There is a debate going on out there: there are people saying, “We’ve got these fabulous technologies that we would like Ofcom to order companies to install” and there are companies saying, “That would be disastrous and break encryption if we had to install them”. That is a dualistic situation where there is a contest going on. My amendment seeks to make sure the conflict can be properly resolved. I do not think Ofcom on its own can ever do that, because Ofcom will always be defending what it is doing and saying “This is fine”. So, there has to be some other mechanism whereby people can say it is not fine and contest that. As I say, in this debate we are ignoring the fact that they are already out there: people saying “We think you should deploy this” and companies saying “It would be disastrous if we did”. We cannot resolve that by just saying “Trust Ofcom”.
To meet the expectation the noble Lord voiced earlier, I will indeed point out that Ofcom can consult the ICO as a skilled person if it wishes to. It is important that we square the circle and look at these issues. The ICO will be able to be involved in the way I have set out as a skilled person.
Before I conclude, I want to address my noble friend Lady Harding’s questions on skilled persons. Given that notices will be issued on a case-by-case basis, and Ofcom will need to look at the specific service design and existing systems of a provider to work out how a particular technology would interact with that design and those systems, a skilled person’s report better fits this process by requiring Ofcom to obtain tailored advice rather than general technical advice from an advisory board. The skilled person’s report will be largely focused on the technical side of Ofcom’s assessment: that is to say, how the technology would interact with the service’s design and existing systems. In this way, it offers something similar to, but more tailored than, a technical advisory board. Ofcom already has a large and expert technology group, whose role it is to advise policy teams on new and existing technologies, to anticipate the impact of technologies and so on. It already has strong links with academia and with external researchers. A technical advisory board would duplicate that function. I hope that reassures my noble friend that the points she raised have been taken into account.
So I hope the noble Lord, Lord Allan, will not feel the need to divide—
Before the Minister finishes, I posed the question about whether, given the debate and issues raised, he felt completely satisfied that we had arrived at the right solution, and whether there was a case for withdrawing the amendment at this stage and bringing it back at Third Reading, having had further discussions and debate where we could all agree. I take it his answer is “no”.
I am afraid it is “no”, and if the noble Lord, Lord Allan, does seek to divide, we will oppose his amendment. I commend the amendments standing in my name in this group to the House.
My Lords, in moving Amendment 262A, I will speak also to the other government amendments in the group. These amendments address the Bill’s enforcement powers. Government Amendments 262A, 262B, 262C, 264A and 266A, Amendments 265, 266 and 267, tabled by my noble friend Lord Bethell, and Amendment 268 tabled by the noble Lord, Lord Stevenson of Balmacara, relate to senior management liability. Amendment 268C from the noble Lord, Lord Weir of Ballyholme, addresses interim service restriction orders.
In Committee, we amended the Bill to create an offence of non-compliance with steps set out in confirmation decisions that relate to specific children’s online safety duties, to ensure that providers and individuals can be held to account where their non-compliance risks serious harm to children. Since then, we have listened to concerns raised by noble Lords and others, in particular that the confirmation decision offence would not tackle child sexual exploitation and abuse. That is why the government amendments in this group will create a new offence of a failure to comply with a child sexual exploitation and abuse requirement imposed by a confirmation decision. This will mean that providers and senior managers can be held liable if they fail to comply with requirements to take specific steps as set out in Ofcom’s confirmation decision in relation to child sexual exploitation and abuse on their service.
Ofcom must designate a step in a confirmation decision as a child sexual exploitation and abuse requirement, where that step relates, whether or not exclusively, to a failure to comply with specific safety duties in respect of child sexual exploitation and abuse content. Failure to comply with such a requirement will be an offence. This approach is necessary, given that steps may relate to multiple or specific kinds of illegal content, or systems and process failures more generally. This approach will ensure that services know from the confirmation decision when they risk criminal liability, while providing sufficient legal certainty via the specified steps to ensure that the offence can be prosecuted effectively.
The penalty for this offence is up to two years in prison, a fine or both. Through Clause 182, where an offence is committed with the consent or connivance of a senior manager, or attributable to his or her neglect, the senior manager, as well as the entity, will have committed the offence and can face up to two years in prison, a fine or both.
I thank my noble friend Lord Bethell, as well as our honourable friends Miriam Cates and Sir William Cash in another place, for their important work in raising this issue and their collaborative approach as we have worked to strengthen the Bill in this area. I am glad that we have reached a position that will help to keep children safe online and drive a change in culture in technology companies. I hope this amendment reassures them and noble Lords that the confirmation decision offence will tackle harms to children effectively by ensuring that technology executives take the necessary steps to keep children safe online. I beg to move.
My Lords, I will briefly comment positively on the Minister’s explanation of how these offences might work, particularly the association of the liability with the failure to comply with a confirmation decision, which seems entirely sensible. At an earlier stage of the debate, there was a sense that we might associate liability with more general failures to enforce a duty of care. That would have been problematic, because the duty of care is very broad and requires a lot of pieces to be put in place. Associating the offences with the confirmation decision makes absolute sense. Having been in that position, I can say that if, as an executive in a tech company, I received a confirmation decision that said, “You must do these things”, and I chose wilfully to ignore that decision, it would be entirely reasonable for me to be held potentially criminally liable for that. That association is a good step forward.
My Lords, we welcome the government amendments in this group to bring child sexual exploitation and abuse failures into the scope of the senior manager liability and enforcement regime but consider that they do not go far enough. On the government amendments, I have a question for the Minister about whether, through Clause 122, it would be possible to require a company that was subject to action to do some media literacy as part of its harm reduction; in other words, would it be possible for Ofcom to use its media literacy powers as part of the enforcement process? I offer that as a helpful suggestion.
We share the concerns expressed previously by the noble Lord, Lord Bethell, about the scope of the senior manager liability regime, which does not cover all the child safety duties in the Bill. We consider that Amendment 268, in the name of my noble friend Lord Stevenson, would provide greater flexibility, giving the possibility of expanding the list of duties covered in the future. I have a couple of brief questions to add to my first question. Will the Minister comment on how the operation of the senior manager liability regime will be kept under review? This has, of course, been something of a contentious issue in the other place, so could the Minister perhaps tell your Lordships’ House how confident he is that the current position is supported there? I look forward to hearing from the Minister.
I did not quite finish writing down the noble Baroness’s questions. I will do my best to answer them, but I may need to follow up in writing because she asked a number at the end, which is perfectly reasonable. On her question about whether confirmation decision steps could include media literacy, yes, that is a good idea; they could.
Amendment 268, tabled by the noble Lord, Lord Stevenson of Balmacara, seeks to enable the Secretary of State, through regulation, to add to the list of duties which are linked to the confirmation decision offence. We are very concerned at the prospect of allowing an unconstrained expansion of the confirmation decision offence. In particular, as I have already set out, we would be concerned about expansion of those related to search services. There is also concern about unconstrained additions of any other duties related to user-to-user services as well.
We have chosen specific duties which will tackle effectively key issues related to child safety online and tackling child abuse while ensuring that the confirmation decision offence remains targeted. Non-compliance with a requirement imposed by a confirmation decision in relation to such duties warrants the prospect of criminal enforcement on top of Ofcom’s extensive civil enforcement powers. Making excessive changes to the offence risks shifting the regime towards a more punitive and disproportionate enforcement model, which would represent a significant change to the framework as a whole. Furthermore, expansion of the confirmation decision offence could lead to services taking an excessively cautious approach to content moderation to avoid the prospect of criminal liability. We are also concerned that such excessive expansion could significantly increase the burden on Ofcom.
I am grateful to the noble Lord, Lord Weir of Ballyholme, and the noble Baroness, Lady Benjamin, for the way they set out their Amendment 268C. We are concerned about this proposal because it is important that Ofcom can respond to issues on a case-by-case basis: it may not always be appropriate or proportionate to use a specific enforcement power in response to a suspected breach. Interim service restriction orders are some of the strongest enforcement powers in the Bill and will have a significant impact on the service in question. Their use may be disproportionate in cases where there is only a minor breach, or where a service is taking steps to deal with a breach following a provisional notice of contravention.
My Lords, I think the upshot of this brief debate is that the noble Lord, Lord Knight—how he was tracked down in a Pret A Manger, I have no idea; he is normally too fast-moving for that—in his usual constructive and creative way is asking the Government to engage constructively to find a solution, which he discussed in that Pret A Manger, involving a national helpline, the NSPCC and the Children’s Commissioner, for the very reasons that he and my noble friend Lord Allan have put forward. In no way would this be some kind of quango, in the words of the noble Baroness, Lady Fox.
This is really important stuff. It could be quite a game-changer in the way that the NSPCC and the Children’s Commissioner collaborate on tackling the issues around social media, the impact of the new rights under the Bill and so on. I very much hope that the Government will be able to engage positively on this and help to bring the parties together to, in a sense, deliver something which is not in the Bill but could be of huge importance.
My Lords, first, I reassure noble Lords that the Government are fully committed to making sure that the interests of children are both represented and protected. We believe, however, that this is already achieved through the provisions in the Bill.
Rather than creating a single advocacy body to research harms to children and advocate on their behalf, as the noble Lord’s amendment suggests, the Bill achieves the same effect through a combination of Ofcom’s research functions, the consultation requirements and the super-complaints provisions. Ofcom will be fully resourced with the capacity and technological ability to assess and understand emerging harms and will be required to research children’s experiences online on an ongoing basis.
For the first time, there will be a statutory body in place charged with protecting children from harm online. As well as its enforcement functions, Ofcom’s research will ensure that the framework remains up to date and that Ofcom itself has the latest, in-depth information to aid its decision-making. This will ensure that new harms are not just identified in retrospect when children are already affected by them and complaints are made; instead, the regulator will be looking out for new issues and working proactively to understand concerns as they develop.
Children’s perspectives will play a central role in the development of the framework, as Ofcom will build on its strong track record of qualitative research to ensure that children are directly engaged. For example, Ofcom’s ongoing programme, Children’s Media Lives, involves engaging closely with children and tracking their views and experiences year on year.
Alongside its own research functions, super-complaints will ensure that eligible bodies can make complaints on systemic issues, keeping the regulator up to date with issues as they emerge. This means that if Ofcom does not identify a systemic issue affecting children for any reason, it can be raised and then dealt with appropriately. Ofcom will be required to respond to the super-complaint, ensuring that its subsequent decisions are understood and can be scrutinised. Complaints by users will also play a vital role in Ofcom’s horizon scanning and information gathering, providing a key means by which new issues can be raised.
The extensive requirements for Ofcom to consult on codes of practice and guidance will further ensure that it consistently engages with groups focused on the interests of children as the codes and guidance are developed and revised. Children’s interests are embedded in the implementation and delivery of this framework.
The Children’s Commissioner will play a key and ongoing role. She will be consulted on codes of practice and any further changes to those codes. The Government are confident that she will use her statutory duties and powers effectively to understand children’s experiences of the digital world. Her primary function as Children’s Commissioner for England is to promote and protect the rights of children in England, and to promote and protect the rights of children across the United Kingdom where those rights are or may be affected by reserved matters. As the codes of practice and the wider Bill relate to a reserved area of law—namely, internet services—the Children’s Commissioner for England will be able to represent the interests of children from England, Scotland, Wales and Northern Ireland when she is consulted on the preparation of codes of practice. That will ensure that children’s voices are represented right across the UK. The Children’s Commissioner for England and her office also regularly speak to the other commissioners about ongoing work on devolved and reserved matters. Whether she does that in branches of Pret A Manger, I do not know, but she certainly works with her counterparts across the UK.
I am very happy to take back the idea that the noble Lord has raised and discuss it with the commissioner. There are many means by which she can carry out her duties, so I am very happy to take that forward. I cannot necessarily commit to putting it in legislation, but I shall certainly commit to discussing it with her. On the proposals in the noble Lord’s amendment, we are concerned that a separate child user advocacy body would duplicate the functions that she already has, so I hope with that commitment he will be happy to withdraw.
My Lords, I am grateful to those who have spoken in this quick debate and for the support from the noble Lord, Lord Allan of Hallam, and the noble Baroness, Lady Fox, about children’s voices being heard. I think that we are getting to the point when there will not be a quango or indeed a minefield, so that makes us all happy. The Minister almost derailed me, because so much of his speaking note was about the interests of children and I am more interested in the voice of children being heard directly rather than people acting on their behalf and representing their interests, but his final comments around being happy to take the idea forward means that I am very happy to withdraw my amendment.
My Lords, as we have heard, the noble Baroness, Lady Harding, made a very clear case in support of these amendments, tabled in the name of the noble Baroness, Lady Kidron, and supported by noble Lords from across the House. The noble Baroness, Lady Morgan, gave wise counsel to the Minister, as did the noble Lord, Lord Clement-Jones, that it is worth stepping back and seeing where we are in order to ensure that the Bill is in the right place. I urge the Minister to find the time and the energy that I know he has—he certainly has the energy and I am sure he will match it with the time—to speak to noble Lords over the coming Recess to agree a way to incorporate systems and functionality into the Bill, for all the reasons we have heard.
On Monday, my noble friend Lord Knight spoke of the need for a review about loot boxes and video games. When we checked Hansard, we saw that the Minister had promised that such a review would be offered in the coming months. In an unusual turn of events, the Minister beat the timescale. We did not have to hear the words “shortly”, “in the summer” or “spring” or anything like that, because it was announced the very next day that the department would keep legislative options under review.
I make that point simply to thank the Minister for the immediate response to my noble friend Lord Knight. But, if we are to have such a review, does this not point very much to the fact that functionality and systems should be included in the Bill? The Minister has a very nice hook to hang this on and I hope that he will do so.
My Lords, this is not just a content Bill. The Government have always been clear that the way in which a service is designed and operated, including its features and functionalities, can have a significant impact on the risk of harm to a user. That is why the Bill already explicitly requires providers to ensure their services are safe by design and to address the risks that arise from features and functionalities.
The Government have recognised the concerns which noble Lords have voiced throughout our scrutiny of the Bill, and those which predated the scrutiny of it. We have tabled a number of amendments to make it even more explicit that these elements are covered by the Bill. We have tabled the new introductory Clause 1, which makes it clear that duties on providers are aimed at ensuring that services are safe by design. It also highlights that obligations on services extend to the design and operation of the service. These obligations ensure that the consideration of risks associated with the business model of a service is a fundamental aspect of the Bill.
My noble friend Lady Harding of Winscombe worried that we had made the Bill worse by adding this. The new clause was a collaborative one, which we have inserted while the Bill has been before your Lordships’ House. Let me reassure her and other noble Lords, as we conclude Report, that we have not made it worse by so doing. The Bill will require services to take a safety by design approach to the design and operation of their services. We have always been clear that this will be crucial to compliance with the legislation. The new introductory Clause 1 makes this explicit as an overarching objective of the Bill. The introductory clause does not introduce any new concepts; it is an accurate summary of the key provisions and objectives of the Bill and, to that end, the framework and introductory statement are entirely compatible.
We also tabled amendments—which we debated last Monday—to Clause 209. These make it clear that functionalities contribute to the risk of harm to users, and that combinations of functionality may cumulatively drive up the level of risk. Amendment 281BA would amend the meaning of “functionality” within the Bill, so that it includes any system or process which affects users. This presents a number of concerns. First, such a broad interpretation would mean that any service in scope of the Bill would need to consider the risk of any feature or functionality, including ones that are positive for users’ online experience. That could include, for example, processes designed for optimising the interface depending on the user’s device and language settings. The amendment would increase the burden on service providers under the existing illegal content and child safety duties and would dilute their focus on genuinely risky functionality and design.
Second, by duplicating the reference to systems, processes and algorithms elsewhere in the Bill, it implies that the existing references in the Bill to the design of a service or to algorithms must be intended to capture matters not covered by the proposed new definition of “functionality”. This would suggest that references to systems and processes, and algorithms, mentioned elsewhere in the Bill, cover only systems, processes or algorithms which do not have an impact on users. That risks undermining the effectiveness of the existing duties and the protections for users, including children.
Amendment 268A introduces a further interpretation of features and functionality in the general interpretation clause. This duplicates the overarching interpretation of functionality in Clause 208 and, in so doing, introduces legal and regulatory uncertainty, which in turn risks weakening the existing duties. I hope that sets out for my noble friend Lady Harding and others our legal concerns here.
Amendment 281FA seeks to add to the interpretation of harm in Clause 209 by clarifying the scenarios in which harm may arise, specifically from services, systems and processes. This has a number of concerning effects. First, it states that harm can arise solely from a system and process, but a design choice does not in isolation harm a user. For example, the decision to use algorithms, or even the algorithm itself, is not what causes harm to a user—it is the fact that harmful content may be pushed to a user, or content pushed in such a manner that is harmful, for example repeatedly and in volume. That is already addressed comprehensively in the Bill, including in the child safety risk assessment duties.
Secondly, noble Lords should be aware that the drafting of the amendment has the effect of saying that harm can arise from proposed new paragraphs (a), (b) and (c)—
Can I just double-check what my noble friend has just said? I was lulled into a possibly false sense of security until we got to the point where he said “harmful” and then the dreaded word “content”. Does he accept that there can be harm without there needing to be content?
This is the philosophical question on which we still disagree. Features and functionality can be harmful but, to manifest that harm, there must be some content which they are functionally, or through their feature, presenting to the user. We therefore keep talking about content, even when we are talking about features and functionality. A feature on its own which has no content is not what the noble Baroness, Lady Kidron, my noble friend Lady Harding and others are envisaging, but to follow the logic of the point they are making, it requires some content for the feature or functionality to cause its harm.
Yes, even if the content is not harmful. We keep saying “content” because it is the way the content is disseminated, as the Bill sets out, but the features and functionalities can increase the risks of harm as well. We have addressed this through looking at the cumulative effects and in other ways.
This is the key question. For example, let us take a feature that is pushing something at you constantly; if it was pushing poison at you then it would obviously be harmful, but if it was pushing marshmallows then they would be singularly not harmful but cumulatively harmful. Is the Minister saying that the second scenario is still a problem and that the surfeit of marshmallows is problematic and will still be captured, even if each individual marshmallow is not harmful?
Yes, because the cumulative harm—the accumulation of marshmallows in that example—has been addressed.
Noble Lords should also be aware that the drafting of Amendment 281FA has the effect of saying that harm can arise from proposed new paragraphs (a), (b) and (c)—for example, from the
“age or characteristics of the likely user group”.
In effect, being a child or possessing a particular characteristic may be harmful. This may not be the intention of the noble Baronesses who tabled the amendment, but it highlights the important distinction between something being a risk factor that influences the risk of harm occurring and something being harmful.
The Government are clear that these aspects should properly be treated as risk factors. Other parts of the Bill already make it clear that the ways in which a service is designed and used may impact on the risk of harm suffered by users. I point again to paragraphs (e) to (h) of Clause 10(6): paragraph (e) talks about the level of risk of functionalities of the service, paragraph (f) talks about the different ways in which the service is used, and so on.
We have addressed these points in the Bill, though clearly not to the satisfaction of my noble friend, the noble Baroness, Lady Kidron, and others. As we conclude Report, I recognise that we have not yet convinced everyone that our approach achieves what we all seek, though I am grateful for my noble friend’s recognition that we all share the same aim in this endeavour. As I explained to the noble Baroness, Lady Kidron, on her Amendment 35, I was asking her not to press it because, if she did, the matter would have been dealt with on Report and we would not be able to return to it at Third Reading.
As the Bill heads towards another place with this philosophical disagreement still bubbling away, I am very happy to commit to continuing to talk to your Lordships—particularly when the Bill is in another place, so that noble Lords can follow the debates there. I am conscious that my right honourable friend Michelle Donelan, who has had a busy maternity leave and has spoken to a number of your Lordships while on leave, returns tomorrow in preparation for the Bill heading to her House. I am sure she will be very happy to speak even more when she is back fully at work, but we will both be happy to continue to do so.
I think it is appropriate, in some ways, that we end on this issue, which remains an area of difference. With that promise to continue these discussions as the Bill moves towards another place, I hope that my noble friend will be content not to press these amendments, recognising particularly that the noble Baroness, Lady Kidron, has already inserted this thinking into the Bill for consideration in the other House.
(1 year, 2 months ago)
Lords Chamber
My Lords, I will make a brief statement on the devolution status of the Bill. I am pleased to inform your Lordships’ House that both the Scottish Parliament and Senedd Cymru have voted to grant consent for all the relevant provisions. For Scotland, these provisions are the power to amend the list of exempt educational institutions, the power to amend the list of child sexual exploitation and abuse offences and the new offence of encouraging or assisting serious self-harm. For Wales, the provisions are the power to amend the list of exempt educational institutions, the false communications offence, the threatening communications offence, the flashing images offences and the offence of encouraging or assisting serious self-harm.
As noble Lords will be aware, because the Northern Ireland Assembly is adjourned the usual process for seeking legislative consent in relation to Northern Ireland has not been possible. In the absence of legislative consent from the Northern Ireland Assembly, officials from the relevant UK and Northern Ireland departments have worked together to ensure that the Bill considers and reflects the relevant aspects of devolved legislation so that we may extend the following provisions to Northern Ireland: the power to amend the list of exempt educational institutions, the false communications offence, the threatening communications offence and the offence of encouraging or assisting serious self-harm. His Majesty’s Government have received confirmation in writing from the relevant Permanent Secretaries in Northern Ireland that they are content that nothing has been identified which would cause any practical difficulty in terms of the existing policy and legislative landscape. Historically, this area of legislation in Northern Ireland has mirrored that in Great Britain, and we believe that legislating without the consent of the Northern Ireland Assembly is justified in these exceptional circumstances and mitigates the risk of leaving Northern Ireland without the benefit of the Bill’s important reforms and legislative parity.
We remain committed to ensuring sustained engagement on the Bill with all three devolved Administrations as it progresses through Parliament. I beg to move that the Bill be read a third time.
Clause 44: Secretary of State’s powers of direction
Amendment 1
My Lords, His Majesty’s Government have listened carefully to the views expressed in Committee and on Report and have tabled amendments to the Bill to address concerns raised by noble Lords. Let me first again express my gratitude to my noble friend Lady Stowell of Beeston for her constructive engagement on the Secretary of State’s powers of direction. As I said during our previous debate on this topic, I am happy to support her Amendments 139 and 140 from Report. The Government are therefore bringing forward two amendments to that effect today.
Noble Lords will recall that, whenever directing Ofcom about a code, the Secretary of State must publish that direction. Amendment 1 means that, alongside this, in most cases a direction will now need to be laid before Parliament. There may be some cases where it is appropriate for the Secretary of State to withhold information from a laid direction: for example, if she thinks that publishing it would be against the interests of national security. In these cases, Amendment 2 will instead require the Secretary of State to lay a statement before Parliament setting out that a direction has been given, the kind of code to which the direction relates and the reasons for not publishing it. Taken together, these amendments will ensure that your Lordships and Members of another place are always made aware as soon as a direction has been made and, wherever possible, understand the contents of that direction. I hope noble Lords will agree that, after the series of debates we have had, we have reached a sensible and proportionate position on these clauses and one which satisfies your Lordships’ House.
I am also grateful to the noble Baroness, Lady Kennedy of The Shaws, for her determined and collaborative work on the issue of threatening communications. Following the commitment I made to her on Report, I have tabled an amendment to make it explicit that the threatening communications offence captures threats where the recipient fears that someone other than the person sending the message will carry out the threat. I want to make it clear that the threatening communications offence, like other existing offences related to threats, already captures threats that could be carried out by third parties. This amendment does not change the scope of the offence, but the Government understand the desire of the noble Baroness and others to make this explicit in the Bill, and I am grateful to her for her collaboration.
Regarding Ofcom’s power of remote access, I am grateful to the noble Lords, Lord Knight of Weymouth and Lord Allan of Hallam, my noble friend Lord Moylan and the noble Baroness, Lady Fox of Buckley, who unavoidably cannot be with us today, for raising their concerns about the perceived breadth of the power and their desire for further safeguards to ensure that it is used appropriately by the regulator.
I am also grateful to technology companies for the constructive engagement they have had with officials over the summer. As I set out on Report, the intention of our policy is to ensure clarity about Ofcom’s ability to observe empirical tests, which are a standard method for understanding algorithms and consequently for assessing companies’ compliance with the duties in the Bill. They involve taking a test data set, running it through an algorithmic system and observing the output.
My Lords, I do not know how everyone has spent their summer, but this feels a bit like we have been working on a mammoth jigsaw puzzle and we are now putting in the final pieces. At times, through the course of this Bill, it has felt like doing a puzzle in the metaverse, where we have been trying to control an unreliable avatar that is actually assembling the jigsaw—but that would be an unfair description of the Minister. He has done really well in reflecting on what we have said, influencing his ministerial colleagues in a masterclass of managing upwards, and coming up with reasonable resolutions to previously intractable issues.
We are trusting that some of the outcome of that work will be attended to in the Commons, as the noble Baroness, Lady Morgan, has said, particularly the issues that she raised on risk, that the noble Baroness, Lady Kidron, raised on children’s safety by design, and that my noble friend Lady Merron raised on animal cruelty. We are delighted at where we think these issues have got to.
For today, I am pleased that the concerns of the noble Baroness, Lady Stowell, on Secretary of State powers, which we supported, have been addressed. I also associate myself with her comments on parliamentary scrutiny of the work of the regulator. Equally, we are delighted that the Minister has answered the concerns of my noble friend Lady Kennedy and that he has secured the legislative consent orders which he informed us of at the outset today. We would be grateful if the Minister could write to us answering the points of my noble friend Lord Rooker, which were well made by him and by the Delegated Powers Committee.
I am especially pleased to see that the issues which we raised on Report about remote access have been addressed. I feel smug, as I had to press quite hard for the Minister to leave the door open to come back at this stage on this. I am delighted that he is now walking through the door. Like the noble Lord, Lord Allan, I have just a few things on which I would like clarification: the proportionate use of the powers; Ofcom taking into account user privacy, especially regarding live user data; and the duration of the powers being time-limited.
Finally, I thank parliamentarians on all sides for an exemplary team effort. With so much seemingly falling apart around us, it is encouraging that, when we have common purpose, we can achieve a lot, as we have with this Bill.
My Lords, let me first address the points made by the noble Lord, Lord Rooker. I am afraid that, like my noble friend Lady Stowell of Beeston, I was not aware of the report of your Lordships’ committee. Unlike her, I should have been. I have checked with my private office and we have not received a letter from the committee, but I will ask them to contact the clerk to the committee immediately and will respond to this today. I am very sorry that this was not brought to my attention, particularly since the members of the committee met during the Recess to look at this issue. I have corresponded with my noble friend Lord McLoughlin, who chairs the committee, on each of its previous reports. Where we have disagreed, we have done so explicitly and set out our reasons. We have agreed with most of its previous recommendations. I am very sorry that I was not aware of this report and have not had the opportunity to provide answers for your Lordships’ House ahead of the debate.
The report was published on 31 August. It so happens that the committee has been forced to meet in an emergency session tomorrow morning because of government amendments that have been tabled to the levelling-up Bill, which will be debated next Wednesday, that require a report on the delegated powers, so we will have the opportunity to see what the Minister has said. I am very grateful for his approach.
The committee will have a reply from me before it meets tomorrow. Again, I apologise. It should not be up to the committee to let the Minister know; I ought to have known about it.
I am very grateful to noble Lords for their support of the amendments that we have tabled in this group, which reflect the collaborative nature of the work that we have done and the thought which has been put into this by my ministerial colleagues and me, and by the Bill team, over the summer. I will have a bit more to say on that when I move that the Bill do now pass in a moment, but I am very grateful to those noble Lords who have spoken at this stage for highlighting the model of collaborative working that the Bill has shown.
The noble Baroness, Lady Ritchie of Downpatrick, asked for an update on timetables. Some of the implementation timetables which Ofcom has assessed depend a little on issues which may still change when the Bill moves to another place. If she will permit it, once they have been resolved I will write with the latest assessments from Ofcom, and, if appropriate, from us, on the implementation timelines. They are being recalculated in the light of amendments that have been made to the Bill and which may yet further change. However, everybody shares the desire to implement the Bill as swiftly as possible, and I am grateful that your Lordships’ work has helped us do our scrutiny with that in mind.
The noble Lord, Lord Allan, asked some questions about the remote viewing power. On proportionality, Ofcom will have a legal duty to exercise its power to view information remotely in a way that is proportionate, ensuring, as I said, that undue burdens are not placed on businesses. In assessing proportionality in line with this requirement, Ofcom would need to consider the size and resource capacity of a service when choosing the most appropriate way of gathering information. To comply with this requirement, Ofcom would also need to consider whether there was a less onerous method of obtaining the necessary information.
On the points regarding that and intrusion, Ofcom expects to engage with providers as appropriate about how to obtain the information it needs to carry out its functions. Because of the requirement on Ofcom to exercise its information-gathering powers proportionately, it would need to consider less onerous methods. As I said, that might include an audit or a skilled persons report, but we anticipate that, for smaller services in particular, those options could be more burdensome than Ofcom remotely viewing information.
Will my noble friend draw attention to the part of Clause 122 that says that Ofcom cannot issue a requirement which is not technically feasible, as he has just said? That does not appear in the text of the clause, and it creates a potential conflict. Even if the requirement is not technically feasible—or, at least, if the platform claims that it is not—Ofcom’s power to require it is not mitigated by the clause. It still has the power, which it can exercise, and it can presumably take some form of enforcement action if it decides that the company is not being wholly open or honest. The technical feasibility is not built into the clause, but my noble friend has just added it, as with quite a lot of other stuff in the Bill.
It has to meet minimum standards of accuracy and must have privacy safeguards in place. The clause talks about those in a positive sense, which sets out the expectation. I am happy to make clear, as I have, what that means: if the appropriate technology does not exist that meets these requirements, then Ofcom will not be able to use Clause 122 to require its use. I hope that that satisfies my noble friend.
My Lords, in begging to move that the Bill do now pass, I add my words of thanks to all noble Lords who have been involved over many years and many iterations of the Bill, particularly during my time as the Minister and in the diligent scrutiny we have given it in recent months. The Bill will establish a vital legislative framework, making the internet safer for all, particularly for children. We are now closer than ever to achieving that important goal. In a matter of months from Royal Assent, companies will be required to put in place protections to tackle illegal content on their services or face huge fines. I am very grateful to noble Lords for the dedication, attention and time they have given to the Bill while it has been before your Lordships’ House.
The Bill will mark a significant change in children’s safety online. Last month, data from UK police forces showed that 6,350 offences relating to sexual communications with a child were recorded last year alone. These are horrifying statistics which underline the importance of the Bill in building a protective shield for our children online. We cannot let perpetrators of such abhorrent crimes stalk children online and hide behind their screens, nor let companies continue to turn a blind eye to the harm being done to children on their services. We are working closely with Ofcom to make sure that the protections for children established by the Bill are enforced as soon as possible, and we have been clear that companies should not wait for the legislation to come into force before taking action.
The aim of keeping children safe online is woven throughout the Bill, and the changes that we have made throughout its passage in your Lordships’ House have further bolstered it. To provide early and clear guidance to companies and Ofcom regarding the content from which children must be protected, the categories of primary priority and priority content which is harmful to children will now be set out in the Bill, rather than being addressed later via secondary legislation.
Following another amendment made during your Lordships’ scrutiny, providers of the largest services will also be required to publish summaries of their risk assessments for illegal content and content which is harmful to children. Further changes to the Bill have also made sure that technology executives must take more responsibility for the safety of those who use their websites. Senior managers will face criminal liability if they fail to comply with steps set by Ofcom following enforcement action to keep children safe on their platforms, with the offence punishable with up to two years in prison.
Noble Lords have rightly raised concerns about what the fast-changing technological landscape will mean for children. The Bill faces the future and is designed to keep pace with emerging technological changes such as AI-generated pornography.
Child sexual exploitation and abuse content generated by AI is illegal, regardless of whether it depicts a real child or not, and the Bill makes it clear that technology companies will be required to identify this content proactively and remove it. Whatever the future holds, the Bill will ensure that guard rails are in place to allow our children to explore it safely online.
I have also had the pleasure of collaborating with noble Lords from across your Lordships’ House who have championed the important cause of strengthening protections for women and girls online, who we know disproportionately bear the brunt of abhorrent behaviour on the internet. Following changes made earlier to the Bill, Ofcom will be required to produce and publish guidance which summarises in one clear place measures that should be taken to reduce the risk of harm to women and girls online. The amendment will also oblige Ofcom to consult when producing the guidance, ensuring that it reflects the voices of women and girls as well as the views of experts on this important issue.
The Bill strikes a careful balance: it tackles criminal activity online and protects our children while enshrining freedom of expression in its legislative framework. A series of changes to the Bill has ensured that adults are provided with greater control over their online experience. All adult users of the largest services will have access to tools which, if they choose to use them, will allow them to filter out content from non-verified users and to reduce the likelihood of encountering abusive content. These amendments, which have undergone careful consideration and consultation, will ensure that the Bill remains proportionate, clear and future-proof.
I am very grateful to noble Lords who have helped us make those improvements and many more. I am conscious that a great number of noble Lords who have taken part in our debates were part of the pre-legislative scrutiny some years ago. They know the Bill very well and they know the issues well, which has helped our debates be well informed and focused. It has helped the scrutiny of His Majesty’s Government, and I hope that we have risen to that.
I am very grateful to all noble Lords who have made representations on behalf of families who have suffered bereavements because of the many terrible experiences online of their children and other loved ones. There are too many for me to name now, and many more who have not campaigned publicly but who I know have been following the progress of the Bill carefully, and we remember them all today.
Again, there are too many noble Lords for me to single out all those who have been so vigilant on this issue. I thank my colleagues on the Front Bench, my noble friends Lord Camrose and Lord Harlech, and on the Front Bench opposite the noble Lords, Lord Knight and Lord Stevenson, and the noble Baroness, Lady Merron. On the Liberal Democrat Benches, I thank the noble Lords, Lord Clement-Jones and Lord Allan of Hallam—who has been partly on the Front Bench and partly behind—who have been working very hard on this.
I also thank the noble Baroness, Lady Kidron, whom I consider a Front-Bencher for the Cross Benches on this issue. She was at the vanguard of many of these issues long before the Bill came to your Lordships’ House and will continue to be long after. We are all hugely impressed by her energy and personal commitment, following the debates not only in our own legislature but in other jurisdictions. I am grateful to her for the collaborative nature of her work with us.
I will not single out other noble Lords, but I am very grateful to them from all corners of the House. They have kicked the tyres of the Bill and asked important questions; they have given lots of time and energy to it and it is a better Bill for that.
I put on record my thanks to the huge team in my department and the Department for Science, Innovation and Technology, who, through years of work, expertise and determination, have brought the Bill to this point. I am grateful to the staff of your Lordships’ House and to colleagues from the Office of the Parliamentary Counsel, in particular Maria White and Neil Shah, and, at the Department for Science, Innovation and Technology, Sarah Connolly, Orla MacRae, Caroline Bowman and Emma Hindley, as well as their huge teams, including those who have worked on the Bill over the years but are not currently working on it. They have worked extremely hard and have been generous with their time to noble Lords throughout our work.
The Bill will make a vital difference to people’s safety online, especially children’s safety. It has been a privilege to play a part in it. I was working as a special adviser at the Home Office when this area of work was first mooted. I remember that, when this Bill was suggested in the 2017 manifesto, people suggested that regulating the internet was a crazy idea. The biggest criticism now is that we have not done it sooner. I am very grateful to noble Lords for doing their scrutiny diligently but speedily, and I hope to see the Bill on the statute book very soon. I beg to move that the Bill do now pass.
My Lords, I am grateful to the Minister for his very kind words to everybody, particularly my Front Bench and me. I also wish him a speedy recovery from his recent illness, although I was less sympathetic when I discovered how much he has been “managing upwards”—in the words of my noble friend Lord Knight—and achieving for us in the last few days. He has obviously been recovering and I am grateful for that. The noble Lord has steered the Bill through your Lordships’ House with great skill and largely single-handedly. It has been a pleasure to work with him, even when he was turning down our proposals and suggestions for change, which he did in the nicest possible way but absolutely firmly.
I rise briefly to raise the question of access to data by academics and research organisations. Before I do so, I want to express profound thanks to noble Lords who have worked so collaboratively to create a terrific Bill that will completely transform and hold to account those involved in the internet, and make it a safer place. That was our mission and we should be very proud of that. I cannot single out noble Peers, with the exception of the noble Baroness, Lady Kidron, with whom I worked collaboratively both on age assurance and on harms. It was a partnership I valued enormously and hope to take forward. Others from all four corners of the House contributed to the parts of the Bill that I was particularly interested in. As I look around, I see so many friends who stuck their necks out and spoke so movingly, for which I am enormously grateful.
The question of data access is one of the loose ends that did not quite make it into the Bill. I appreciate the efforts of my noble friend the Minister, the Secretary of State and the Bill team in this matter and their efforts to try and wangle it in; I accept that it did not quite make it. I would like to hear reassurance from my noble friend that this is something that the Government are prepared to look at in future legislation. If he could provide any detail on how and in which legislation it could be revisited, I would be enormously grateful.
My Lords, I will be brief and restrict myself to responding to the questions which have been raised. I will hold to my rule of not trying to thank all noble Lords who have played their part in this scrutiny, because the list is indeed very long. I agree with what the noble Lord, Lord Clement-Jones, said about this being a Back-Bench-driven Bill, and there are many noble Lords from all corners of the House and the Back Benches who have played a significant part in it. I add my thanks to the noble Baroness, Lady Benjamin, not just for her kind words, but for her years of campaigning on this, and to my noble friend Lord Bethell who has worked with her—and others—closely on the issues which she holds dear.
I also thank my noble friend Lord Moylan who has often swum against the tide of debate, but very helpfully so, and on important matters. In answer to his question about Wikipedia, I do not have much to add to the words that I have said a few times now about the categorisation, but on his concerns about the parliamentary scrutiny for this I stress that it is the Secretary of State who will set the categorisation thresholds. She is, of course, a Member of Parliament, and accountable to it. Ofcom will designate services based on those thresholds, so the decision-making can be scrutinised in Parliament, even if not in the way he would have wished.
I agree that we should all be grateful to the noble Lord, Lord Allan of Hallam, because he addressed some of the questions raised by my noble friend Lady Stowell of Beeston. In brief, the provision is flexible for where the technological solutions do not currently exist, because Ofcom can require services to develop or source new solutions.
This close to the gracious Speech, I will not point to a particular piece of legislation in which we might revisit the issue of researchers’ access, as raised by my noble friend Lord Bethell, but I am happy to say that we will certainly look at that again, and I know that he will take the opportunity to raise it.
Noble Lords on the Front Benches opposite alluded to the discussions which are continuing—as I committed on Report to ensure that noble Lords are able to be part of discussions as the Bill heads to another place—on functionalities and on the amendment of my noble friend Lady Morgan on category 1 services. She is one of a cavalcade of former Secretaries of State who have been so helpful in scrutinising the Bill. It is for another place to debate them, but I am grateful to noble Lords who have given their time this week to have the discussions which I committed to have and will continue to have as the Bill heads there, so that we can follow those issues hopefully to a happy resolution.
I thank my noble friend Lady Harding of Winscombe for the concessions that she wrought on Report, and for the part that she has played in discussions. She has also given a great deal of time outside the Chamber.
We should all be very grateful to the noble Lord, Lord Grade of Yarmouth, who has sat quietly throughout most of our debates—understandably, in his capacity as chairman of Ofcom—but he has followed them closely and taken those points to the regulator. Dame Melanie Dawes and all the team there stand ready to implement this work and we should be grateful to the noble Lord, Lord Grade of Yarmouth, and to all those at Ofcom who are ready to put it into action.
(1 year, 2 months ago)
Commons Chamber
I beg to move amendment (a) to Lords amendment 182.
With this it will be convenient to discuss the following:
Lords amendment 349, and Government amendments (a) and (b).
Lords amendment 391, Government amendment (a), and Government consequential amendment (a).
Lords amendment 17, Government motion to disagree, and Government amendments (a) and (b) in lieu.
Amendment (i) to Government amendment (a) in lieu of Lords amendment 17.
Lords amendment 20, and Government motion to disagree.
Lords amendment 22, and Government motion to disagree.
Lords amendment 81, Government motion to disagree, and Government amendments (a) to (c) in lieu.
Lords amendment 148, Government motion to disagree, and Government amendment (a) in lieu.
Lords amendment 1, and amendments (a) and (b).
Lords amendments 2 to 16, 18, 19, 21, 23 to 80, 82 to 147, 149 to 181 and 183 to 188.
Lords amendment 189, and amendment (a) in lieu.
Lords amendments 190 to 216.
Lords amendment 217, and amendment (a).
Lords amendments 218 to 227.
Lords amendment 228, and amendment (a).
Lords amendments 229 and 230.
Lords amendment 231, and amendment (a).
Lords amendments 232 to 319.
Lords amendment 320, and amendment (a).
Lords amendment 321, and amendment (a).
Lords amendments 322 to 348, 350 to 390 and 392 to 424.
As we know from proceedings in this place, the Online Safety Bill is incredibly important. I am delighted that it is returning to the Commons in great shape, having gone through extensive and thorough scrutiny in the Lords. The Bill is world-leading, and the legislative framework established by it will lead to the creation of a profoundly safer online environment in this country. It will kickstart change where that is sorely needed, and ensure that our children are better protected against pornography and other content that is harmful to them. The Bill will also guard children against perpetrators of abhorrent child sexual exploitation and abuse, and ensure that tech companies take responsibility for tackling such content on their platforms, or be held criminally accountable.
As I am sure my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) will agree, may I say how much we appreciate what the Government have done in relation to the matter just referred to? As the Minister knows, we withdrew our amendment in the House of Commons after discussion, and we had amazingly constructive discussions with the Government right the way through, and also in the House of Lords. I shall refer to that if I am called to speak later, but I simply wanted to put on record our thanks, because this will save so many children’s lives.
I thank my hon. Friend and my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) for all their work on this. I hope that this debate will show that we have listened and tried to work with everybody, including on this important part of the Bill. We have not been able to capture absolutely everything that everybody wants, but we are all determined to ensure that the Bill gets on the statute book as quickly as possible, to ensure that we start the important work of implementing it.
We have amended the Bill to bolster its provisions. A number of topics have been of particular interest in the other place. Following engagement with colleagues on those issues, we have bolstered the Bill’s protections for children, including a significant package of changes relating to age assurance. We have also enhanced protections for adult users.
My hon. Friend will know that Ministers and officials in his Department have worked extensively—I thank them for that—with me, Baroness Kidron, and the Bereaved Families for Online Safety group, on the amendment that will make it easier for coroners to have access to data from online companies in the tragic cases where that might be a cause of a child’s death. He will also know that there will still be gaps in legislation, but such gaps could be closed by further measures in the Data Protection and Digital Information Bill. His ministerial colleague in the other place has committed the Government to that, so may I invite my hon. Friend to set out more about the Government’s plans for doing just that?
I thank my right hon. Friend for his work on this, and Baroness Kidron for her work. I will cover that in more detail in a moment, but we remain committed to exploring measures that would facilitate better access to data for coroners under specific circumstances. We are looking for the best vehicle to do that, which includes those possibilities in the Data Protection and Digital Information Bill. We want to ensure that the protections for adult users afford people greater control over their online experience.
The Minister is setting out a powerful case for how the Government have listened to the overtures in this place and the other place. Further to the interventions from my hon. Friend the Member for Stone (Sir William Cash) and my right hon. Friend the Member for Bromsgrove (Sajid Javid), the former Culture Secretary, will the Minister be clear that the risk here is under-regulation, not over-regulation? Although the internet may be widely used by perfectly good people, the people who run internet companies are anything but daft and more likely to be dastardly.
This is a difficult path to tread in approaching this issue for the first time. In many ways, these are things that we should have done 10 or 15 years ago, as social media platforms and people’s engagement with them proliferated over that period. Regulation has to be done gently, but it must be done. We must act now and get it right, to ensure that we hold the big technology companies in particular to account, while also understanding the massive benefits that those technology companies and their products provide.
I agree with the Minister that this is a groundbreaking Bill, but we must be clear that there are still gaps. Given what he is saying about the requirements for regulation of online social media companies and other platforms, how will he monitor, over a period of time, whether the measures that we have are as dynamic as they need to be to catch up with social media as it develops?
The hon. Lady asks an important question, and that is the essence of what we are doing. We have tried to make this Bill flexible and proportionate. It is not technology specific, so that it is as future-proofed as possible. We must obviously lean into Ofcom as it seeks to operationalise the Act once the Bill gains Royal Assent. Ofcom will come back with its reporting, so not only will Government and the Department be a check on this, but Parliament will be able to assess the efficacy of the Bill as the system beds in and as technology and the various platforms move on and develop.
I talked about the offences, and I will just finalise my point about criminal liability. Those offences will be punishable with up to two years in prison.
Further to that point about the remaining gaps in the Bill, I appreciate what the Minister says about this area being a moving target. Everybody—not just in this country, but around the world—is having to learn as the internet evolves.
I thank the Minister for Government amendment 241, which deals with provenance and understanding where information posted on the web comes from, and allows people therefore to check whether they want to see it, if it comes from dubious sources. That is an example of a collective harm—of people posting disinformation and misinformation online and attempting to subvert our democratic processes, among other things. I park with him, if I may, the notion that we will have to come back to that area in particular. It is an area where the Bill is particularly weak, notwithstanding all the good stuff it does elsewhere, notably on the areas he has mentioned. I hope that everyone in this House accepts that that area will need to be revisited in due course.
Undoubtedly we will have to come back to that point. Not everything needs to be in the Bill at this point. We have industry initiatives, such as Adobe’s content security policy, which are good initiatives in themselves, but as we better understand misinformation, disinformation, deepfakes and the proliferation and repetition of fake images, fake text and fake news, we will need to keep ensuring we can stay ahead of the game, as my hon. Friend said. That is why we have made the legislation flexible.
I have two things to ask. First, will the Minister spell out more clearly how Parliament will be able to monitor the implementation? What mechanisms do we have to do that? Secondly, on director liability, which I warmly welcome—I am pleased that the Government have listened to Back Benchers on this issue—does he not agree that the example we have set in the Bill should be copied in other Bills, such as the Economic Crime and Corporate Transparency Bill, where a similar proposal exists from Back Benchers across the House?
The right hon. Lady raises some interesting points. We have conversed about harms, so I totally get her point about making sure that we tackle this issue in Parliament and be accountable in Parliament. As I have said, that will be done predominantly by monitoring the Bill through Ofcom’s reporting on what harms it is having to deal with. We have regular engagement with Ofcom, not only here and through the Select Committees, but through the Secretary of State.
On criminal liability, we conversed about that and made sure that we had a liability attached to something specific, rather than the general approach proposed at the beginning. It therefore means that we are not chilling innovation. People can understand, as they set up their approaches and systems, exactly what they are getting into in terms of risk for criminal liability, rather than having the general approach that was suggested at the beginning.
The review mechanism strikes me as one of the places where the Bill falls down and is weakest, because there is not a dedicated review mechanism. We have needed this legislation for more than 30 years, and we have now got to the point of legislating. Does the Minister understand why I have no faith that future legislation will happen in a timely fashion, when it has taken us so long even to get to this point? Can he give us some reassurance that a proper review will take place, rather than just having Ofcom reports that may or may not be read?
I have talked about the fact that we have to keep this legislation under review, because the landscape is fast-moving. At every stage that I have been dealing with this Bill, I have said that inevitably we will have to come back. We can make the Bill as flexible, proportionate and tech-unspecific as we can, but things are moving quickly. With all our work on AI, for example, such as the AI summit, the work of the Global Partnership on Artificial Intelligence, the international response, the Hiroshima accord and all the other areas that my hon. Friend the Member for Weston-super-Mare (John Penrose) spoke about earlier, we will have to come back, review it and look at whether the legislation remains world-beating. It is not just about the findings of Ofcom as it reports back to us.
I need to make a bit of progress, because I hope to have time to sum up a little bit at the end. We have listened to concerns about ensuring that the Bill provides the most robust protections for children from pornography and on the use of age assurance mechanisms. We are now explicitly requiring relevant providers to use highly effective age verification or age estimation to protect children from pornography and other primary priority content that is harmful to children. The Bill will also ensure a clear privacy-preserving and future-proofed framework governing the use of age assurance, which will be overseen by Ofcom.
There has been coverage in the media about how the Bill relates to encryption, which has often not been accurate. I take the opportunity to set the record straight. Our stance on challenging sexual abuse online remains the same. Last week in the other place, my noble Friend Lord Parkinson, the Parliamentary Under-Secretary of State for Arts and Heritage, shared recent data from UK police forces that showed that 6,350 offences related to sexual communication with a child were recorded last year alone. Shockingly, 5,500 of those offences took place against primary school-age children. Those appalling statistics illustrate the urgent need for change. The Government are committed to taking action against the perpetrators and stamping out these horrific crimes. The information that social media companies currently give to UK law enforcement contributes to more than 800 arrests or voluntary attendances of suspected child sexual offenders on average every month. That results in an estimated 1,200 children being safeguarded from child sexual abuse.
There is no intention by the Government to weaken the encryption technology used by platforms. As a last resort, on a case-by-case basis, and only when stringent privacy safeguards have been met, Ofcom will have the power to direct companies to make best efforts to develop or source technology to identify and remove illegal child sexual abuse content. We know that this technology can be developed. Before it can be required by Ofcom, such technology must meet minimum standards of accuracy. If appropriate technology does not exist that meets these requirements, Ofcom cannot require its use. That is why the powers include the ability for Ofcom to require companies to make best endeavours to develop or source a new solution.
Does my hon. Friend agree that the companies already say in their terms of service that they do not allow illegal use of their products, yet they do not say how they will monitor whether there is illegal use and what enforcement they take? What the Bill gives us, for the first time, is the right for Ofcom to know the answers to those questions and to know whether the companies are even enforcing their own terms of service.
My hon. Friend makes an important point, and I thank him for the amazing work he has done in getting the Bill to this point and for his ongoing help and support in making sure that we get it absolutely right. This is not about bashing technology companies; it is about not only holding them to account, but bringing them closer, to make sure that we can work together on these issues to protect the children I was talking about.
Despite the breadth of existing safeguards, we recognise the concerns expressed about privacy and technical feasibility in relation to Ofcom’s power to issue CSE or terrorism notices. That is why we introduced additional safeguards in the Lords. First, Ofcom will be required to obtain a skilled person’s report before issuing any warning notice and exercising its powers under clause 122. Ofcom must also provide a summary of the report to the relevant provider when issuing a warning notice. We are confident that in addition to Ofcom’s existing routes of evidence gathering, this measure will help to provide the regulator with the necessary information to determine whether to issue a notice and the requirements that may be put in place.
We also brought forth amendments requiring Ofcom to consider the impact that the use of technology would have on the availability of journalistic content and the confidentiality of journalistic sources when considering whether to issue a notice. That builds on the existing safeguards in clause 133 regarding freedom of expression and privacy.
We recognise the disproportionate levels of harm that women and girls continue to face online, and that is why the Government have made a number of changes to the Bill to strengthen protections for women and girls. First, the Bill will require Ofcom to produce guidance on online harms that disproportionately affect women and girls and to provide examples of best practice to providers, and it will require providers to bring together in one clear place all the measures that they take to tackle online abuse against women and girls on their platforms. The Bill will also require Ofcom to consult the Victims’ Commissioner and the Domestic Abuse Commissioner, in addition to the Children’s Commissioner, while preparing codes of practice. That change to the Bill will ensure that the voices of victims of abuse are brought into the consultation period.
I am grateful for the amendment, which I think is important. Will the Minister make it clear that he will not accept the amendments tabled by the hon. Member for Yeovil (Mr Fysh)?
Indeed, we will not be accepting those amendments, but I will cover more of that later on, after I have listened to the comments that I know my hon. Friend wants to make.
As a result of the amendment, we have also made a small change to clause 98—the emerging category 1 services list—to ensure that it makes operational sense. Prior to Baroness Morgan’s amendment, a service had to meet the functionality threshold for content and 75% of the user number threshold to be on the emerging services list. Under the amended clause, there is now a plausible scenario where a service could meet the category 1 threshold without meeting any condition based on user numbers, so we had to make the change to ensure that the clause worked in that scenario.
We have always been clear that the design of a service, its functionalities and its other features are key drivers of risk that impact on the risk of harm to children. Baroness Kidron’s amendments 17, 20, 22 and 81 seek to treat those aspects as sources of harm in and of themselves. Although we agree with the objective, we are concerned that they do not work within the legislative framework and risk legal confusion and delaying the Bill. We have worked closely with Baroness Kidron and other parliamentarians to identify alternative ways to make the role that design and functionalities play more explicit. I am grateful to colleagues in both Houses for being so generous with their time on this issue. In particular, I thank again my right hon. and learned Friend the Member for Kenilworth and Southam for his tireless work, which was crucial in enabling the creation of an alternative and mutually satisfactory package of amendments. We will disagree to Lords amendments 17, 20, 22 and 81 and replace them with amendments that make it explicit that providers are required to assess the impact that service design, functionalities and other features have on the risk of harm to children.
On Report, my hon. Friend the Member for Crawley (Henry Smith) raised animal abuse on the internet and asked how we might address such harmful content. I am pleased that the changes we have since made to the Bill fully demonstrate the Government’s commitment to tackling criminal activity relating to animal torture online. It is a cause that Baroness Merron has championed passionately. Her amendment in the other place sought to require the Secretary of State to review certain offences and, depending on the review’s outcome, to list them as priority offences in schedule 7. To accelerate measures to tackle such content, the Government will remove clause 63—the review clause—and instead immediately list section 4(1) of the Animal Welfare Act 2006 as a priority offence. Officials at the Department for Environment, Food and Rural Affairs have worked closely with the Royal Society for the Prevention of Cruelty to Animals and are confident that the offence of unnecessary suffering will capture a broad swathe of behaviour. I hope the whole House will recognise our efforts and those of Baroness Merron and support the amendment.
You will be pleased to know, Mr Deputy Speaker, that I will conclude my remarks. I express my gratitude to my esteemed colleagues both here and in the other place for their continued and dedicated engagement with this complicated, complex Bill during the course of its parliamentary passage. I strongly believe that the Bill, in this form, strikes the right balance in providing the strongest possible protections for both adults and children online while protecting freedom of expression. The Government have listened carefully to the views of Members on both sides of the House, stakeholders and members of the public. The amendments we have made during the Bill’s progress through the Lords have further enhanced its robust and world-leading legislative framework. It is groundbreaking and will ensure the safety of generations to come. I ask Members of the House gathered here to support the Government’s position on the issues that I have spoken about today.
I want to speak briefly about Lords amendments 195 and 153, which would allow Ofcom, coroners and bereaved parents to acquire information and support relating to a child’s use of social media in the event of that child’s tragic death. Specifically, I want to speak about Archie Battersbee, who lived in my constituency but lost his life tragically last year, aged only 12. Archie’s mum, Hollie, was in the Public Gallery at the beginning of the debate, and I hope that she is still present. Hollie found Archie unconscious on the stairs with a ligature around his neck. The brain injury Archie suffered put him into a four-month coma from which, sadly, doctors were unable to save him.
To this day, Hollie believes that Archie may have been taking part in some form of highly dangerous online challenge, but, unable to access Archie’s online data beyond 90 days of his search history, she has been unable to put this devastating question to rest. Like the parents of Molly, Breck, Isaac, Frankie and Sophia, for the last year Hollie has been engaged in a cruel uphill struggle against faceless corporations in her attempt to determine whether her child’s engagement with a digital service contributed to his death. Despite knowing that Archie viewed seven minutes of content and received online messages in the hour and a half prior to his death, she has no way of knowing what may have been said or exactly what he may have viewed, and the question of his online engagement and its potential role in his death remains unsolved.
Lords amendment 195, which will bolster Ofcom’s information-gathering powers, will I hope require a much more humane response from providers in such tragic cases as this. This is vital and much-needed legislation. Had it been in place a year ago, it is highly likely that Hollie could have laid her concerns to rest and perhaps received a pocket of peace in what has been the most traumatic time any parent could possibly imagine.
I also welcome Lords amendment 153, which will mandate the largest providers to put in place a dedicated helpline so that parents who suffer these tragic events will have a direct line and a better way of communicating with social media providers, but the proof of the pudding will obviously be in the eating. I very much hope that social media providers will man that helpline with real people who have the appropriate experience to deal with parents at that tragic time in their lives. I believe that Hollie and the parents of many other children in similar tragic cases will welcome the Government’s amendments that allow Ofcom, coroners and bereaved parents to access their children’s online data via the coroner directing Ofcom.
I pay tribute to the noble Baroness Kidron, to my right hon. Friend the Member for Bromsgrove (Sajid Javid) and to the Bereaved Families for Online Safety group, who have done so much fantastic work in sharing their heartrending stories and opening our eyes to what has been necessary to improve the Online Safety Bill. I also, of course, pay tribute to Ian Russell, to Hollie and to all the other bereaved parents for their dedication to raising awareness of this hugely important issue.
If I could just say one last thing, I have been slipped from the Education Committee to attend this debate today and I would like to give an advert for the Committee’s new inquiry, which was launched on Monday, into the effects of screen time on education and wellbeing. This Bill is not the end of the matter—in many ways it is just the beginning—and I urge all Members please to engage with this incredibly important inquiry by the Education Committee.
I thank all right hon. and hon. Members for their contribution to the debate today and, indeed, right through the passage of this complex Bill.
First, let me turn to the amendments tabled by my hon. Friend the Member for Yeovil (Mr Fysh). I understand that the intention of his amendments is to restrict the reach of the new online safety regulatory regime in a number of ways. I appreciate his concern to avoid unnecessarily burdening businesses, and I am sympathetic to his point that the Bill should not inhibit sectors such as the life sciences sector. I reassure him that such sectors are not the target of this regime and that the new regulatory framework is proportionate, risk-based and pro-innovation.
The framework has been designed to capture a range of services where there is a risk of significant harm to users, and the built-in exemptions and categorisations will ensure it is properly targeted. The alternative would be a narrow scope, which would be more likely to inadvertently exempt risky services or to displace harm on to services that are out of scope. The extensive discussion on this point in both Houses has made it clear that such a position is unlikely to be acceptable.
The amendments to the overarching statement that would change the services in scope would introduce unclear and subjective terms, causing issues of interpretation. The Bill is designed so that low-risk services will have to put in place only proportionate measures that reflect the risk of harm to their users and the service provider’s size and capacity, ensuring that small providers will not be overly burdened unless the level of risk requires it.
The amendment that would ensure Ofcom cannot require the use of a proactive technology that introduces weaknesses or vulnerabilities into a provider’s systems duplicates existing safeguards. It also introduces vague terms that could restrict Ofcom’s ability to require platforms to use the most effective measures to address abhorrent illegal activity.
Ofcom must act proportionately, and it must consider whether a less intrusive measure could achieve the same effect before requiring the use of proactive technology. Ofcom also has duties to protect both privacy and private property, including algorithms and code, under the Human Rights Act 1998.
I thank the Minister for engaging with us on access to private property and for setting up, with his officials, a consultation on the right to access a person’s phone after they are deceased or incapacitated. I thank him for incorporating some of those thoughts in what he and the Government are doing today. I hope this is the start of something and that these big digital companies will no longer be able to bully people. The boot will be on the other foot, and the public will own what they have on their digital devices.
The hon. Gentleman is talking about the access of coroners, families and others to information, following the sad death of Molly Russell. Again, I pay tribute to Ian Russell and all the campaigners. I am glad that we have been able to find an answer to a very complex situation, not only because of its international nature but because of data protection, et cetera.
The measures I have outlined will ensure that risks relating to security vulnerabilities are managed. The Bill is also clear that Ofcom cannot require companies to use proactive technology on privately communicated content, in order to comply with their safety duties, which will provide further safeguards for user privacy and data security.
Will the Minister make it clear that we should expect the companies to use proactive technology, because they already use it to make money by recommending content to people, which is a principal reason for the Bill? If they use proactive technology to make money, they should also use it to keep people safe.
My hon. Friend absolutely nails it. He said earlier that businesses are already collecting this data. Since I was first involved with the Bill, it has primarily been about getting businesses to adhere to their own terms and conditions. The data they use should be used in that way.
The amendment to the definition of “freedom of expression” in part 12 would have no effect as these concepts are already covered by the existing definition. Changing the definition of “automated tool” would introduce untested terms and would have an unclear and confusing impact on the duties.
My hon. Friend the Member for Yeovil also asked for clarification of how Ofcom’s power to view information remotely will be used, and whether the power is sufficiently safeguarded. I assure the House that this power is subject to strict safeguards that mean it cannot be used to undermine a provider’s systems.
On Third Reading in the other place, the Government introduced amendments that defined the regulator’s power to view information remotely, whereas previously the Bill spoke of access. As such, there are no risks to system security, as the power does not enable Ofcom to access the service. Ofcom also has a duty to act proportionately and must abide by its privacy obligations under the Human Rights Act. Ofcom has a stringent restriction on disclosing businesses’ commercially sensitive and other information without consent.
My hon. Friend also asked for clarification on whether Ofcom will be able to view live user data when using this power. Generally, Ofcom would expect to require a service to use a test dataset. However, there may be circumstances where Ofcom asks a service to execute a test using data that it holds, for example, in testing how content moderation systems respond to certain types of content on a service as part of an assessment of the systems and processes. In that scenario, Ofcom may need to use a provider’s own test dataset containing content that has previously violated its own terms of service. However, that would be subject to Ofcom’s privacy obligations and data protection law.
Lords amendment 17 seeks to explicitly exempt low-risk functionality from aspects of user-to-user services’ children’s risk assessment duties. I am happy to reassure my hon. Friend that the current drafting of the Government’s amendment in lieu of Lords amendment 17 places proportionate requirements on providers. It explicitly excludes low-risk functionality from the more stringent duty to identify and assess the impact that higher-risk functionalities have on the level of risk of harm to children. Proportionality is further baked into this duty through Ofcom’s risk assessment guidance. Ofcom is bound by the principle of proportionality as part of its general duties under the Communications Act 2003, as updated by the Bill. As such, it would not be able to recommend that providers should identify and assess low-risk functionality.
The amendment to Lords amendment 217 tabled by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) would introduce a new safeguard that requires Ofcom to consider whether technology required under a clause 122 notice would circumvent end-to-end encryption. I wish to reassure him and others who have raised the question that the amendment is unnecessary because it is duplicative of existing measures that restrict Ofcom’s use of its powers. Under the Bill’s safeguards, Ofcom cannot require platforms to weaken or remove encryption, and must already consider the risk that specified technology can result in a breach of any statutory provision or the rule of law concerning privacy. We have intentionally designed the Bill so that it is technology neutral and futureproofed, so we cannot accept amendments that risk the legislation quickly becoming out of date. That is why we focused on safeguards that uphold user rights and ensure measures that are proportionate to the specific risks, rather than focusing on specific features such as encryption. For the reasons I have set out, I cannot accept the amendment and hope it will not be pressed to a vote.
The amendment tabled by my hon. Friend the Member for Stroud (Siobhan Baillie) would create an additional reporting requirement on Ofcom to review, as part of its report on the use of age assurance, whether the visibility of a user’s verification status improves the effectiveness of age assurance, but that duplicates existing review requirements in the Bill. The Bill already provides for a review of user verification; under clause 179, the Secretary of State will be required to review the operation of the online safety regulatory framework as a whole. This review must assess how effective the regulatory framework is at minimising the risk of harm that in-scope services pose to users in the UK. That may include a review of the effectiveness of the current user verification and non-verified users duty. I also thank my hon. Friend for raising the issue of user verification and the visibility of verification status. I am pleased to confirm that Ofcom will have the power to set out guidance on user verification status being visible to all users. With regard to online fraud or other illegal activity, mandatory user verification and visibility of verification status is something Ofcom could recommend and require under legal safety duties.
Let me quickly cover some of the other points raised in the debate. I thank my hon. Friend the Member for Gosport (Dame Caroline Dinenage), a former Minister, for all her work. She talked about young people, and the Bill contains many measures—for example, on self-harm and suicide content—that will help to protect them. On the comments made by the hon. Member for Aberdeen North (Kirsty Blackman) and indeed the shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), whom I am glad to see back in her place, there are a number of review points. Clause 179 requires the Secretary of State to review how the Bill is working in practice, and there will be a report resulting from that, which will be laid before Parliament. We also have the annual Ofcom report that I talked about, and most statutory instruments in the Bill will be subject to the affirmative procedure. The Bill refers to a review after two to five years—Ministers can dictate when it takes place within that period—but that is based on allowing a long enough time for the Bill to bed in and be implemented. It is important that we have the ability to look at that in Parliament. The principles of the UN convention on the rights of the child are already reflected in the Bill; although the Bill does not cite the convention by name, its principles are all covered.
My hon. Friend the Member for Folkestone and Hythe (Damian Collins) did an amazing job in his time in my role, and before and afterwards as Chair of the Joint Committee responsible for the pre-legislative scrutiny of the Online Safety Bill. When he talked about scrutiny, I had the advantage of seeing the wry smile of the officials in the Box behind him. That scrutiny has been going on since 2021. Sarah Connolly, one of our amazing team of officials, has been involved with the Bill since it was just a concept.
As Carnegie UK Trust observed online, a child born on the day the Government first published their original internet safety strategy would now be in its second year of primary school.
I do not think I need to respond to that, but it goes to show, does it not?
My hon. Friend talked about post-legislative scrutiny. Now that we have the new Department for Science, Innovation and Technology, we have extra capacity within Committees to look at various aspects, and not just online safety, as important as that is. It also gives us the ability to have sub-Committees. Clearly, we want to make sure that this and all the decisions that we make are scrutinised well. We are always open to looking at what is happening. My hon. Friend talked about Ofcom being able to appoint skilled persons for research—I totally agree, and he absolutely made the right point.
My right hon. Friend the Member for Basingstoke (Dame Maria Miller) and the hon. Member for Caithness, Sutherland and Easter Ross (Jamie Stone) talked about cyber-flashing. As I have said, that has come within the scope of the Bill, but we will also be implementing a broader package of offences that will cover the taking of intimate images without consent. To answer my right hon. Friend’s point, yes, we will still look further at that matter.
The hon. Member for Leeds East (Richard Burgon) talked about Joe Nihill. Will he please send my best wishes and thanks to Catherine and Melanie for their ongoing work in this area? It is always difficult, but it is admirable that people can turn a tragedy into such a positive cause. My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) made two points with which I absolutely agree. They are very much covered in the Bill and in our thinking as well, so I say yes to both.
My right hon. Friend the Member for Chelmsford (Vicky Ford) and my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) talked about pornography. Clearly, we must build on the Online Safety Bill. We have the pornography review as well, which explores regulation, legislation and enforcement. We very much want to make sure that this is the first stage, but we will look at pornography and the enforcement around that in a deeper way over the next 12 months.
It has just crossed my mind that the Minister might be saying that he agreed with everything that I said, which cannot be right. Let me be clear about the two points. One was in relation to whether, when we look at design harms, both proportionality and balancing duties are relevant—I think that he is saying yes to both. The other point that I raised with him was around encryption, and whether I put it in the right way in terms of the Government’s position on encryption. If he cannot deal with that now, and I would understand if he cannot, will he write to me and set out whether that is the correct way to see it?
I thank my right hon. Friend for that intervention. Indeed, end-to-end encrypted services are in the scope of the Bill. Companies must assess the level of risk and meet their duties no matter what their design is.
Can the Minister confirm whether the letter I received from the Minister of State, Ministry of Justice, my right hon. Friend the Member for Charnwood (Edward Argar) is accurate?
I was just coming to that. I thank my right hon. Friend for the rest of her speech. She always speaks so powerfully on eating disorders—on anorexia in particular—and I can indeed confirm the intent behind the Minister’s letter about the creation and use of algorithms.
Finally, I shall cover two more points. My hon. Friend the Member for Stone (Sir William Cash) always speaks eloquently about this. He talked about Brexit, but I will not get into the politics of that. Suffice to say, it has allowed us—as in other areas of digital and technology—to be flexible and not prescriptive, as we have seen in measures that the EU has introduced.
I also ask my hon. Friend the Member for Southend West (Anna Firth) to pass on my thanks and best wishes to Hollie whom I met to talk about Archie Battersbee.
On the small high-harm platforms that are now in the scope of the Bill, will the Minister join me in thanking Hope Not Hate, the Antisemitism Policy Trust and CST, which have campaigned heavily on this point? While we have been having this debate, the CST has exposed BitChute, one of those small high-harm platforms, for geoblocking some of the hate to comply with legislation but then advertising loopholes and ways to get around that on the platform. Can the Minister confirm that the regulator will be able to take action against such proceedings?
I will certainly look at that. Our intention is that in all areas, especially relating to children and their protection, that might not fall within the user enforcement duties, we will look to make sure that the work of those organisations is reflected in what we are trying to achieve in the Bill.
We have talked about the various Ministers that have looked after the Bill during its passage, and the Secretary of State was left literally holding the baby in every sense of the word because she continued to work on it while she was on maternity leave. We can see the results of that with the engagement that we have had. I urge all Members on both sides of the House to consider carefully the amendments I have proposed today in lieu of those made in the Lords. I know every Member looks forward eagerly to a future in which parents have surety about the safety of their children online. That future is fast approaching.
I reiterate my thanks to esteemed colleagues who have engaged so passionately with the Bill. It is due to their collaborative spirit that I stand today with amendments that we believe are effective, proportionate and agreeable to all. I hope all Members will feel able to support our position.
Amendment (a) made to Lords amendment 182.
Lords amendment 182, as amended, agreed to.
Amendments (a) and (b) made to Lords amendment 349.
Lords amendment 349, as amended, agreed to.
Amendment (a) made to Lords amendment 391.
Lords amendment 391, as amended, agreed to.
Government consequential amendment (a) made.
Lords amendment 17 disagreed to.
Government amendments (a) and (b) made in lieu of Lords amendment 17.
Lords amendment 20 disagreed to.
Lords amendment 22 disagreed to.
Lords amendment 81 disagreed to.
Government amendments (a) to (c) made in lieu of Lords amendment 81.
Lords amendment 148 disagreed to.
Government amendment (a) made in lieu of Lords amendment 148.
Lords amendments 1 to 16, 18, 19, 21, 23 to 80, 82 to 147, 149 to 181, 183 to 348, 350 to 390, and 392 to 424 agreed to, with Commons financial privileges waived in respect of Lords amendments 171, 180, 181, 317, 390 and 400.
Ordered, That a Committee be appointed to draw up Reasons to be assigned to the Lords for disagreeing to their amendments 20 and 22;
That Paul Scully, Steve Double, Alexander Stafford, Paul Howell, Alex Davies-Jones, Taiwo Owatemi and Kirsty Blackman be members of the Committee;
That Paul Scully be the Chair of the Committee;
That three be the quorum of the Committee;
That the Committee do withdraw immediately.—(Mike Wood.)
Committee to withdraw immediately; reasons to be reported and communicated to the Lords.
(1 year, 2 months ago)
Lords Chamber
That this House do not insist on its Amendment 17 and do agree with the Commons in their Amendments 17A and 17B in lieu.
My Lords, I beg to move Motion A and, with the leave of the House, I shall also speak to Motions B to H.
I am pleased to say that the amendments made in your Lordships’ House to strengthen the Bill’s provisions were accepted in another place. His Majesty’s Government presented a number of amendments in lieu of changes proposed by noble Lords, which are before your Lordships today.
I am grateful to my noble friend Lady Morgan of Cotes for her continued engagement on the issue of small but high-risk platforms. The Government were happy to accept her proposed changes to the rules for determining the conditions that establish which services will be designated as category 1 or 2B services. In making the regulations, the Secretary of State will now have the discretion to decide whether to set a threshold based on either the number of users or the functionalities offered, or on both factors. Previously, the threshold had to be based on a combination of both.
It remains the expectation that services will be designated as category 1 services only where it is appropriate to do so, to ensure that the regime remains proportionate. We do not, for example, expect to apply these duties to large companies with very limited functionalities. This change, however, provides greater flexibility to bring smaller services with particular functionalities into scope of category 1 duties, should it be necessary to do so. As a result of this amendment, we have also made a small change to Clause 98—the emerging services list—to ensure that it makes operational sense. Before my noble friend’s amendment, a service would be placed on the emerging services list if it met the functionality condition and 75% of the user number threshold. Under the clause as amended, a service could be designated as category 1 without meeting both a functionality and a user condition. Without this change, Ofcom would, in such an instance, be required to list only services which meet the 75% condition.
We have heard from both Houses about the importance of ensuring that technology platforms are held to account for the impact of their design choices on children’s safety. We agree and the amendments we proposed in another place make it absolutely clear that providers must assess the impact of their design choices on the risk of harm to children, and that they deliver robust protections for children on all areas of their service. I thank in particular the noble Baroness, Lady Kidron, the noble Lords, Lord Stevenson of Balmacara and Lord Clement-Jones, my noble friend Lady Harding of Winscombe and the right reverend Prelate the Bishop of Oxford for their hard work to find an acceptable way forward. I also thank Sir Jeremy Wright MP for his helpful contributions to this endeavour.
Noble Lords will remember that an amendment from the noble Baroness, Lady Merron, sought to require the Secretary of State to review certain offences relating to animals and, depending on the outcome of that review, to list these as priority offences. To accelerate protections in this important area, the Government have tabled an amendment in lieu listing Section 4(1) of the Animal Welfare Act 2006 as a priority offence. This will mean that users can be protected from animal torture material more swiftly. Officials at the Department for Environment, Food and Rural Affairs have worked closely with the RSPCA and are confident that the Section 4 offence, unnecessary suffering of an animal, will capture a broad swathe of illegal activity. Adding this offence to Schedule 7 will also mean that linked inchoate offences, such as encouraging or assisting this behaviour, are captured by the illegal content duties. I am grateful to the noble Baroness for raising this matter, for her discussions on it with my noble friend Lord Camrose and for her support for the amendment we are making in lieu.
To ensure the speedy implementation of the Bill’s regime, we have added Clauses 116 to 118, which relate to the disclosure of information by Ofcom, and Clauses 170 and 171, which relate to super-complaints, to the provisions to be commenced immediately on Royal Assent. These changes will allow Ofcom and the Government to hold the necessary consultations as quickly as possible after Royal Assent. As noble Lords know, the intention of the Bill is to make the UK the safest place in the world to be online, particularly for children. I firmly believe that the Bill before your Lordships today will do that, strengthened by the changes made in this House and by the collaborative approach that has been shown, not just in all quarters of this Chamber but between both Houses of Parliament. I beg to move.
My Lords, I thank the Minister very warmly for his introduction today. I shall speak in support of Motions A to H inclusive. Yes, I am very glad that we have agreement at this final milestone of the Bill before Royal Assent. I pay tribute to the Minister and his colleagues, to the Secretary of State, to the noble Baronesses, Lady Morgan, Lady Kidron and Lady Merron, who have brought us to this point with their persistence over issues such as functionalities, categorisation and animal cruelty.
This is not the time for rehearsing any reservations about the Bill. The Bill must succeed and implementation must take place swiftly. So, with many thanks to the very many, both inside and outside this House, who have worked so hard on the Bill for such a long period, we on these Benches wish the Bill every possible success. He is in his place, so I can say that it is over to the noble Lord, Lord Grade, and his colleagues at Ofcom, in whom we all have a great deal of confidence.
My Lords, I too thank the Minister for his swift and concise introduction, which very carefully covered the ground without raising any issues that we have to respond to directly. I am grateful for that as well.
The noble Lord, Lord Clement-Jones, was his usual self. The only thing that I missed, of course, was the quotation that I was sure he was going to give from the pre-legislative scrutiny report on the Bill, which has been his constant prompt. I also think that the noble Baroness, Lady Finlay, was very right to remind us of those outside the House whom we must remember as we reach the end of this stage.
Strangely, although we are at the momentous point of allowing this Bill to go forward for Royal Assent, I find that there is actually very little that needs to be said. In fact, everything has been said by many people over the period; trying to make any additional points would be meretricious persiflage. So I will make two brief points to wind up this debate.
First, is it not odd to reflect on the fact that this historic Parliament, with all our archaic rules and traditions, has the capacity to deal with a Bill that is regulating a technology which most of us have difficulty in comprehending, let alone keeping up with? However, we have done a very good job and, as a result, I echo the words that have already been said; I think the internet will now be a much safer place for children to enjoy and explore, and the public interest will be well served by this Bill, even though we accept that it is likely to be only the first of a number of Bills that will be needed in the years to come.
Secondly, I have been reflecting on the offer I made to the Government at Second Reading, challenging them to work together with the whole House to get the best Bill that we could out of what the Commons had presented to us. That of course could have turned out to be a slightly pointless gesture if nobody had responded positively—but they did. I particularly thank the Minister and the Bill team for rising to the challenge. There were problems initially, but we got there in the end.
More widely, there was, I know, a worry that committing to working together would actually stifle debate and somehow limit our crucial role of scrutiny. But actually I think it had the opposite effect. Some of the debates we had in Committee, from across the House, were of the highest standard, and opened up issues which needed to be resolved. People listened to each other and responded as the debate progressed. The discussion extended to the other place. It is very good to see Sir Jeremy Wright here; he has played a considerable role in resolving the final points.
It will not work for all Bills, but if the politics can be ignored, or at least put aside, it seems to make it easier to get at the issues that need to be debated in the round. In suggesting this approach, I think we may have found a way of getting the best out of our House—something that does not always occur. I hope that lesson can be listened to by all groups and parties.
For myself, participating in this Bill and the pre-legislative scrutiny committee which preceded it has been a terrific experience. Sadly, a lot of people who contributed to our discussions over that period cannot be here today, but I hope they read this speech in Hansard, because I want to end by thanking them, and those here today, for being part of this whole process. We support the amendments before the House today and wish good luck to the noble Lord, Lord Grade.
My Lords, I am very conscious that this is not the end of the road. As noble Lords have rightly pointed out in wishing the Bill well, attention now moves very swiftly to Ofcom, under the able chairmanship of the noble Lord, Lord Grade of Yarmouth, who has participated, albeit silently, in our proceedings before, and to the team of officials who stand ready to implement this swiftly. The Bill benefited from pre-legislative scrutiny. A number of noble Lords who have spoken throughout our deliberations took part in the Joint Committee of both Houses which did that. It will also benefit from post-legislative scrutiny, through the Secretary of State’s review, which will take place between two and five years after Royal Assent. I know that the noble Lords who have worked so hard on this Bill for many years will be watching it closely as it becomes an Act of Parliament, to ensure that it delivers what we all want it to.
The noble Lord, Lord Stevenson, reminded us of the challenge he set us at Second Reading: to minimise the votes in dissent and to deliver this Bill without pushing anything to ping-pong. I think I was not the only one in the Chamber who was sceptical about our ability to do so, but it is thanks to the collaborative approach and the tone that he has set that we have been able to do that. That is a credit to everybody involved.
That this House do not insist on its Amendment 20, to which the Commons have disagreed for their Reason 20A.
That this House do not insist on its Amendment 22, to which the Commons have disagreed for their Reason 22A.
That this House do not insist on its Amendment 81 and do agree with the Commons in their Amendments 81A, 81B and 81C in lieu.
That this House do not insist on its Amendment 148 and do agree with the Commons in their Amendment 148A in lieu.
That this House do agree with the Commons in their Amendment 182A.
That this House do agree with the Commons in their Amendments 349A and 349B.
That this House do agree with the Commons in their Amendments 391A and 391B.