Online Safety Bill

(Limited Text - Ministerial Extracts only)

Tuesday 29th November 2022


Written Statements
The Secretary of State for Digital, Culture, Media and Sport (Michelle Donelan):

The Online Safety Bill is a vital, world-leading piece of legislation, designed to ensure that tech companies take more responsibility for the safety of their users, particularly children. It is also vital that people can continue to express themselves freely and engage in pluralistic debate online. For that reason, I am today committing to make a number of changes to the Online Safety Bill to strengthen its provisions relating to children, and to ensure the Bill’s protections for adults strike the right balance with its protections for free speech.

Since taking up the role of Secretary of State for Digital, Culture, Media and Sport, I have engaged extensively with colleagues to hear views on this legislation. We have heard concerns from many parliamentarians, stakeholders and members of the public on a number of issues, including a desire to go further on child protection, calls for better protections for legal speech, and a concern that too much power over what we see and engage with online rests with the tech giants themselves. In my view, progress on one of these important concerns did not need to come at the expense of the others. I therefore set out a clear approach with three main aims:

Strengthen the protections for children in the Bill

Ensure that adults’ right to legal free speech is protected

Create a genuine system of transparency, accountability and control to give the British public more choice and power over their own accounts and experience.

We can say with confidence that all three aims have been achieved with the amendments the Government are putting forward. We will go further to strengthen the elements of the Bill that specifically protect children online. At the same time, we will remove the clauses pertaining to “legal but harmful” content for adults and replace them with a “triple shield” that empowers users and ensures that control over the online experience rests with individuals rather than anonymous committees in Silicon Valley.

Protections for Children

The Bill’s key objective, above everything else, is the safety of young people online. Not only will we preserve the existing protections, but I will also table a number of amendments that go further to strengthen the protections for children in the Bill to:

make clearer the existing expectations on platforms to understand the age of their users and, where platforms specify a minimum age for users, require them to explain clearly in their terms of service the measures they use to enforce this; if they fail to adhere to these measures, Ofcom will be able to act. I will table these amendments in the Commons;

require the largest platforms to publish summaries of their risk assessments for illegal content and material that is harmful to children, to allow users and empower parents to clearly understand the risks presented by these services and the approach platforms are taking to children’s safety;

name the Children’s Commissioner as a statutory consultee for Ofcom in its development of the codes of practice, to ensure that the measures relating to children are robust and reflect the concerns of parents.

The Government will table the remaining amendments in the Lords.

Legal Free Speech

A large number of colleagues, stakeholders and members of the public have been particularly concerned about provisions that would result in the over-removal of legitimate legal content by creating a new category of “legal but harmful” speech. However admirable the goal, I do not believe that it is morally right to censor speech online that is legal to say in person.

I will therefore table a number of amendments in the Commons to remove “legal but harmful” from the Bill in relation to adults, and replace it with a fairer, simpler and, we believe, more effective mechanism called the triple shield, which will focus on user choice, consumer rights and accountability while protecting freedom of expression. We are taking the same approach when assessing the proposed new harmful communications offence, which could have criminalised legitimate discussion of some topics. I have therefore tabled amendments for the second day of Report stage to remove the harmful communications offence from the Bill.

To retain protections for victims of abusive communications, including victims of domestic abuse, we will continue progressing new offences for false and threatening communications. Furthermore, the Bill will no longer repeal the Malicious Communications Act 1988 and relevant sections of the Communications Act 2003. To avoid duplication in legislation, the Government will remove elements of the offences in these Acts which criminalise false and threatening communications.

Protection for Adults: The Triple Shield

It is unquestionable that speech that is illegal in the street should also be illegal online, and that major platforms should remove illegal content from their sites. While most platforms, including social media sites, have robust terms of service detailing the types of content they do or do not allow, anyone who uses these platforms regularly will know that companies widely fail to enforce their own terms of service, and that platforms can often treat some sections of society differently. Lastly, I believe that rather than censoring adults, the Government should stand up for free speech and choice by empowering people.

Together, these three common sense principles form the basis of the triple shield, a comprehensive set of tools to protect and empower adults. Under this system, three important rules apply:

Illegal: Content that is illegal should be removed. The Bill includes a number of priority offences, and companies must proactively prevent users from encountering this content. The Bill includes the relevant offences for England and Wales, Scotland, and Northern Ireland. Companies will also have to remove other relevant illegal content, when they become aware of it.

Terms of service: Legal content that a platform prohibits in its own terms of service should be removed, and legal content that a platform allows in its terms of service should not be removed.

User empowerment: Rather than tech giants’ algorithms alone deciding what users engage with, users themselves should have the option to decide. Adults should be empowered to choose whether or not to engage with legal forms of abuse and hatred if the platform they are using allows such content. So the “third shield” puts a duty on platforms to provide their users with the functionality to control their exposure to unsolicited content that falls into this category. These functions will, under no circumstances, limit discussion, robust debate or support groups’ ability to speak about any of these issues freely.

The user empowerment tools will allow adults to reduce the likelihood that they will see certain categories of content if they so choose. The duty will specify legal content related to suicide, content promoting self-harm and eating disorders, and content that is abusive or incites hate on the basis of race, ethnicity, religion, disability, sex, gender reassignment, or sexual orientation. This is a targeted approach that reflects areas where we know adult users, in particular vulnerable users, would benefit from having greater choice over how they interact with these kinds of content. For the first time, tech giants will be required to give individual adults genuine control over their own accounts and online experience. I will table amendments relating to these provisions in the Commons.

This will be done while upholding users’ rights to free expression and ensuring that legitimate debate online will not be affected by these stronger duties. There are high thresholds for inclusion in these content categories, so that discussion of these broad topics, even where it is controversial or challenging, will not be captured unless it becomes abusive. Nothing in this duty will require companies to remove or take down legal content. This will also be made clear through the Bill’s explanatory notes.

Category 1 services will still need to give users the option to verify themselves and choose not to interact with unverified users. This duty will remain unchanged, and again reinforces this Government’s commitment to ensuring users have genuine choice over their online experience.

These changes will ensure the Bill protects free speech while holding social media companies to account for their promises to users, guaranteeing that users will be able to make informed choices about the services they use and the interactions they have on those sites.

Accountability and further measures

Publication of enforcement notices: The regulator, Ofcom, will hold companies to account if they fail to comply with the requirements in the Bill by issuing fines or notifications requiring them to take steps to remedy compliance failures. To further strengthen transparency for users, we will give Ofcom the power to require services to publish the details of any enforcement notifications, including notices requiring them to remedy breaches, that they receive. I have now tabled these amendments in the Commons.

Self-harm: I am aware of particular concerns around content online which encourages vulnerable people to self-harm. While the child safety duties in the Bill will protect children, vulnerable adults may remain at risk of exposure to this abhorrent content. I am therefore committing to making the encouragement of self-harm illegal. The Government will bring forward in this Bill proposals to create an offence of sending a communication that encourages serious self-harm via an amendment in the House of Lords. This new offence will ensure that trolls sending such messages to a person, regardless of the recipient’s age, face the consequences for their vile actions.

Tackling violence against women and girls: It is unacceptable that women and girls suffer disproportionately from abuse online and it is right that we address this through the Online Safety Bill. Therefore, extensive work has been undertaken, including with Home Office colleagues, to understand how we can further protect women and girls through the Online Safety Bill, including to:

List controlling or coercive behaviour as a priority offence. This is an offence that disproportionately impacts women and girls; listing it as a priority offence means companies will have to take proactive measures to tackle this content, strengthening the protections for women and girls under the Bill.

Name the Victims Commissioner and the Domestic Abuse Commissioner as Statutory Consultees for the codes of practice, to ensure that they are consulted by Ofcom ahead of drafting and amending the codes of practice.

These changes will be made to the Bill in the House of Lords.

As announced last week by the Deputy Prime Minister, we are also going to take forward reforms to the criminal law on the abuse of intimate images. Building on the campaign of my right hon. Friend the Member for Basingstoke (Dame Maria Miller), as well as recommendations from the Law Commission, we will criminalise the sharing of people’s intimate images without their consent. This, in combination with the measures already in the Bill to make cyberflashing a criminal offence, will significantly strengthen protections for women in particular as they are disproportionately affected by these activities. The Government will table these amendments in the Lords. Separate to the Online Safety Bill, the Government will also bring forward a package of additional laws to tackle a range of abusive behaviour including the installation of equipment, such as hidden cameras, to take or record images of someone without their consent.

Epilepsy Trolling: I have tabled amendments for the second day of Report stage to legislate for a new flashing images offence. I would like to pay tribute to the passionate campaigning on this issue by the Epilepsy Society and by parliamentarians from across both Houses, which has helped the Government ensure that this appalling behaviour is tackled and that we fulfil our previous commitment to legislate to protect victims from epilepsy trolling. We have also made a number of other technical changes to clarify existing policy positions; further details can be found in the amendment paper.

To ensure the proposed changes go through proper scrutiny, we intend to return a number of clauses to a Public Bill Committee for consideration. These are issues that are of fundamental importance to the regime and to Members of this House, such as freedom of expression, user empowerment and age assurance, and it would not be right to proceed with these changes without detailed scrutiny in the House of Commons. We intend to make further changes, as set out above, in the House of Lords; however, the timing of these amendments will depend on parliamentary scheduling.

[HCWS397]