General Committees

It may help the Committee if I clarify from the Chair what we are debating. The motion in the name of the Secretary of State for Science, Innovation and Technology is listed in the “Future Business” section of the Order Paper, and the House will be asked to pass the motion without debate after the text has been agreed by this Committee.
I beg to move,
That this House authorises the Secretary of State to undertake to pay, and to pay by way of financial assistance under section 8 of the Industrial Development Act 1982, a grant or grants exceeding £30 million and up to a total of £129 million to BioNTech UK Limited to support their planned expansion of research and development and artificial intelligence activities in the UK over the next 10 years.
It is an honour to serve under your chairmanship, Ms Jardine.
This investment comes at an important time for the UK’s thriving life sciences sector, which forms a key pillar of two of the Government’s missions: to kick-start economic growth; and to build an NHS fit for the future. The sector is responsible for over £100 billion of turnover in the UK, and it supports over 304,000 jobs in 6,850 businesses. In addition to supporting our economy, the sector also delivers for patients by providing the medicines and technologies that people need to live longer, healthier lives.
As we will set out in the life sciences sector plan, we must build on our world-leading R&D ecosystem and double down on rebuilding an internationally competitive business environment so that innovative companies can start, scale and stay here in the UK. To deliver that plan, we will continue to work in partnership with industry, our life sciences ecosystem and the NHS to seize opportunities that will foster innovation across the UK. To that end, through this proposed grant, we have an opportunity for the UK to secure international investment in innovative, cutting-edge R&D in the face of increasing global competition.
As the right hon. and hon. Members present know, BioNTech is an international leader in the biotechnology industry, and the developer of the first licensed mRNA covid-19 vaccine. Building on the vaccine’s success and global impact, BioNTech has applied for a Government grant of £129 million to support its transformation and UK expansion, which will see it invest circa £1 billion over 10 years. Supported by the grant, BioNTech’s research activities will focus on structural biology, regenerative medicine, oncology and AI-driven drug discovery, spanning three locations and creating about 460 new, directly employed, highly skilled jobs.
In Cambridge, BioNTech will set up a new centre of excellence to focus on drug discovery and development of new treatments for cancer and other serious diseases. That directly supports the Government’s ambition to boost the Oxford-Cambridge growth corridor. In London, BioNTech intends to establish a major hub, including a centre of AI expertise to leverage this game-changing technology and to enhance our understanding of diseases, their causes and drug targeting. At a third site—to be announced shortly—BioNTech plans to undertake R&D into vaccines, including for diseases with high pandemic potential.
BioNTech’s decision to invest in the UK and to expand its R&D activities builds on the Government’s existing strategic partnership with the company. That includes BioNTech’s work to provide up to 10,000 NHS patients with personalised immunotherapies by 2030, which is already transforming health outcomes by enabling UK patients to be among the first in the world to benefit from cancer vaccines. That support for BioNTech is further evidence of the Government’s backing of a world-leading life sciences sector. Working together, we are driving growth, creating jobs and fostering innovation that will translate into positive outcomes for patients. Supporting BioNTech’s investment is another signal of our commitment to this crucial sector ahead of launching our ambitious life sciences sector plan in the spring.
I commend the motion to the Committee.
I thank the Opposition and Liberal Democrat spokespeople. The funding we have discussed today will unlock around £1 billion to further boost the UK’s life sciences sector and, in turn, support the Government’s missions to kick-start economic growth and build an NHS fit for the future. It will also build on our significant progress and commitments to date, including the life sciences innovative manufacturing fund of up to £520 million announced by the Chancellor in October 2024, and our landmark partnerships with Oxford Nanopore and Eli Lilly.
The hon. Member for Runnymede and Weybridge (Dr Spencer) commented on the investment environment. I am sure he did not miss the fact that this Government attracted £63 billion-worth of investment at the last international investment summit. We have done the hard work to make that investment a reality. He may be interested to hear that, according to the latest CEO survey by PricewaterhouseCoopers, the UK is the second best country in the world in which to invest. However, we are not complacent, and we are fully committed to making the UK the best place to invest. The life sciences are an area of huge UK expertise, and they are key to that commitment. Securing this investment will send a clear message to innovative companies that the UK is open for business.
The hon. Member for Harpenden and Berkhamsted (Victoria Collins) asked about monitoring. The financial assistance will be monitored through the normal procedures used for any investment made by the Government. I am happy to send her details of that process and the timeline for this investment.
Working together with industry, this Government are delivering better patient outcomes and driving economic growth. I look forward to continuing that work, and to building on that momentum through the publication and rapid delivery of the life sciences sector plan and industrial strategy in the spring.
I commend the motion to the Committee.
Question put and agreed to.
Written Statements

I am repeating the following written ministerial statement made today in the other place by the Minister for the Future Digital Economy and Online Safety, my noble Friend Baroness Jones of Whitchurch.
In 2023, the previous Government appointed Baroness Bertin as the independent lead reviewer to explore issues surrounding the regulation, legislation and enforcement of online pornography. Throughout the review, she considered evidence submitted by the public, academics and civil society, as well as stakeholders in law enforcement, the pornography sector and health service providers. The final report provided to the Government is insightful and timely.
The report has been laid before Parliament today and it will also be available on gov.uk.
Baroness Bertin’s report highlights some of the harms caused by unregulated access to some online pornography. The review finds that online pornography can impact people’s health and mental wellbeing, and is potentially fuelling violence against women and girls offline.
Baroness Bertin’s review makes a case for bringing the regulation of pornography online into parity with offline regulation. In the time she has had to do the review, she has considered the existing evidence on the topic, but she has also highlighted where some issues are still poorly understood and more research is needed to understand the potential harms from pornographic content and how to mitigate those.
The review acknowledges the important protections that the Online Safety Act 2023 will put in place to protect young people from seeing harmful content online, including pornographic content. It also notes that the Act has made it a priority for in-scope services to proactively tackle the most harmful illegal content, which includes intimate image abuse, extreme pornography and child sexual abuse material.
This review has revealed shocking detail about the prevalence of violent and misogynistic pornography online, and the extent to which it is influencing dangerous offline behaviours, including in young relationships. Graphic strangulation pornography is illegal but is not always being treated as such and instead remains widely accessible on mainstream pornography platforms. There is increasing evidence that “choking” is becoming a common part of real-life sexual encounters, despite the significant medical dangers associated with it. The Government will take urgent action to ensure that pornography platforms, law enforcement and prosecutors are taking all necessary steps to tackle this increasingly prevalent harm.
Additionally, the review’s findings have noted that as technologies such as artificial intelligence continue to evolve and become increasingly sophisticated and accessible, they are reshaping the online pornography landscape. Individuals can now create sexual content, consensually and non-consensually, with nudification applications and other forms of software. Baroness Bertin has found that more needs to be done to protect those online from being victimised by non-consensual sexual content.
The Government are delivering our manifesto commitment to ban sexually explicit deepfakes: the Data (Use and Access) Bill introduces a new offence that will criminalise the creation of a purported intimate image, or deepfake, of an adult without their consent. It will also criminalise asking someone to create a purported intimate image, or deepfake, for you, regardless of where that person is based or whether the image is created.
We are introducing a package of offences in the Crime and Policing Bill to tackle the taking of intimate images without consent and the installation of equipment with intent to enable the taking of intimate images without consent. Through the offences at section 66B of the Sexual Offences Act 2003, the law already captures situations where intimate images, including deepfakes, are shared without consent.
Together these measures will ensure that law enforcement can effectively tackle this abusive behaviour. This demeaning and disgusting form of chauvinism must not become normalised, and as part of our plan for change we are bearing down on violence against women, whatever form it takes. We are putting offenders on notice: they will face the full force of the law.
The review has also made several recommendations related to the education system. This Government consider healthy relationships a key part of RSHE—relationships, sex and health education—and relationships education will support our mission to halve violence against women and girls in the next decade. This Government will support schools to tackle misogyny and promote healthy relationships and positive masculinity.
The relationships, sex and health education statutory guidance is currently being reviewed following a public consultation last year. As part of this, we are working with stakeholders and teachers to ensure that the curriculum covers all content that pupils need to keep themselves and others safe and to be respectful in their relationships.
This Government are equipping teachers with the information, resources and training to teach young people about healthy relationships and behaviour, which plays a significant role in preventing harmful sexual behaviours. We have recently published a new guide for teachers on incel culture on the Department’s Education Against Hate website. Teacher training is underpinned by the teachers’ standards, including high expectations of behaviour, and we are working with schools on what more we can do to support them to root out misogyny and ensure that young people treat each other with respect.
This Government have set out an unprecedented mission to halve violence against women and girls within a decade, and this will require a renewed focus on prevention—including ensuring that online content is not encouraging offline violence and abuse. We will therefore take forward the findings of Baroness Bertin’s review, which will help to inform the cross-Government violence against women and girls strategy to be published in the next few months.
I thank Baroness Bertin for her efforts in bringing this report together and shedding light on a complex yet deeply important topic. The Government will provide a further update on how they are tackling the issues raised in the review as part of their mission to tackle VAWG in due course.
[HCWS479]
Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
It is a pleasure to serve under your chairmanship, Mr Stringer. I thank the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) for securing this debate on the implementation of the Online Safety Act. I know that he has been following the Bill throughout its passage and has been a critic of every Minister, even his Government’s Ministers, whenever the Bill was watered down or delayed, so I expect him to hold all of us to account. I am grateful to him and all the hon. Members who have spoken this morning. The Government share their commitment to keeping users safe online. It is crucial that we continue to have conversations about how best to achieve that goal.
The Online Safety Act lays the foundations for strong protections against illegal content and harmful material online. It addresses the complex nature of online harm, recognising that harm is not limited to explicit content and extending to the design and functionality of online services. We know that the legislation is not perfect. I hear that at every such debate, but we are committed to supporting Ofcom to ensure that the Act is implemented quickly, as this is the fastest way to protect people online.

2025 is the year of action for online safety, and the Government have already taken a number of steps to build on Ofcom’s implementation of the Act. In November last year, the Secretary of State published the draft “Statement Of Strategic Priorities for online safety”. That statement is designed to deliver a comprehensive, forward-looking set of online safety priorities for the full term of this Government. It will give Ofcom backing to be bold in specific areas, such as embedding safety by design, through considering all aspects of a service’s business model, including functionalities and algorithms.
We are also working to build further on the evidence base to inform our next steps on online safety, and I know that this issue was debated earlier this week. In December, we announced a feasibility study to understand the impact of smartphones and social media on children, and in the Data (Use and Access) Bill, we have included provisions to allow the Secretary of State to create a new researcher access regime for online safety data. That regime will address a systemic issue that has historically prevented researchers from understanding how platforms operate, and it will help to identify and mitigate new and preventable harms. We have also made updates to the framework, such as strengthening measures to tackle intimate image abuse under the Online Safety Act, and we are following up on our manifesto commitment to hold perpetrators to account for the creation of explicit, non-consensual deepfake images through amendments to the Data Bill.
We are also building on the measures in the Online Safety Act that allow Ofcom to take information on behalf of coroners. Through the Data Bill, we are bringing in additional powers to allow coroners to request Ofcom to issue a notice requiring platforms to preserve children’s data, which can be crucial for investigations into a child’s tragic death. My hon. Friend the Member for Darlington (Lola McEvoy) raised Jools’ law, of which I am very aware, and I believe that she is meeting Ministers this week to discuss it further.
Finally, we recently announced that, in the upcoming Crime and Policing Bill, we are introducing multiple offences to tackle AI sexual abuse, including a new offence for possessing, creating or supplying AI tools designed to generate child sexual abuse material.
Members have raised the issue of the Act’s implementation being too slow. We are aware of the frustrations over the amount of time that it has taken to implement the Online Safety Act, not least because of the importance of the issues at hand. We are committed to working with Ofcom to ensure that the Online Safety Act is implemented as quickly and effectively as possible.
On implementation, would the Minister give clarity about the watermark for re-consultation and the point of delay of implementing the children’s codes under the Act? Amendments could be made to the children’s codes and I do not think they would trigger an automatic re-consultation with platforms. Could the Minister elaborate on where the delay would come from and how much scope Parliament has to amend those codes, which will be published in April?
Ofcom has had to spend a long time consulting on the codes to ensure that they are as proofed against judicial review as possible. Any re-consultation or review of the codes will result in a delay, and the best way to ensure that we can protect children is to implement the Act as soon as possible. My hon. Friend referred to the fact that both Ofcom and the Secretary of State have said that this is not a done deal; it is an iterative process, so of course we expect those codes to be reviewed.
As I said, Ofcom is moving forward with implementation of the Act. In a matter of weeks we will start to see, for the first time, safety duties making a material difference to online experiences for adults and children. Platforms are already duty-bound to assess the risk of illegal content and, with a deadline of 16 March, to complete risk assessments. Once the illegal harms codes come into effect from 17 March, Ofcom will be able to enforce the illegal content safety duties. Shortly following that in April, Ofcom will publish the child safety codes and associated guidance, starting the clock for services to assess the risk of content harmful to children on their platforms. The child safety duties should be fully in effect by the summer.
My hon. Friend the Member for Darlington also raised the issue of dynamic risk assessment. I understand that she is in conversation with Ofcom and Ministers on that. I will await the outcome of those discussions. The implementation of the Act will bring in long overdue measures, such as preventing children from accessing pornography and legal content encouraging suicide, self-harm or eating disorders.
I have heard concerns raised by hon. Members regarding Ofcom’s approach, particularly to harmful functionalities and safety by design. We understand there is still a lot of work to be done, which is why the Secretary of State’s statement of strategic priorities places a high importance on safety by design. However, it is important not to lose sight of the positive steps we expect to see this year under the Act. For instance, Ofcom’s draft child codes already include specific measures to address harmful algorithms, among other safety recommendations. We expect Ofcom will continue to build on those important measures in the codes.
Questions were asked about whether the Government have plans to water down the Act. I can categorically state that there are no plans to water down the measures. The Secretary of State has made it very clear that any social media company that wants to operate in our society will have to comply with the law of the land. Whatever changes are made in other jurisdictions, the law of the land will remain.
The Minister might be about to come to the point I want to raise with her, which is about proportionality. Will she say something about that? I am keen to understand whether the Government accept Ofcom’s understanding of the term—that proportional measures are those measures that can be evidenced as effective. I gave reasons why I am concerned about that. I want to understand whether the Government believe that that is the correct interpretation of proportionality.
I was about to come to the point that the right hon. and learned Member raised about the digital regulation Committee. I have had a brief conversation with him about that, and agree about the importance of parliamentary scrutiny of the implementation of the Online Safety Act. I welcome the expertise that Members of both Houses bring. Select Committees are a matter for the House, as he is aware.
We will continue to work with the House of Lords Communications and Digital Committee and the House of Commons Science, Innovation and Technology Committee to support their ongoing scrutiny, as well as other parliamentary Committees that may have an interest in the Act. The Act requires the Secretary of State to review the effectiveness of the regime, two to five years after the legislation comes into force. We will ensure that Parliament is central to that process. I encourage the right hon. and learned Member to continue to raise the matter with the right people.
Most hon. Members raised the issue of apps. Ofcom will have a duty to publish a report on the role of app stores and children’s accessing harmful content on the apps of regulated services. The report is due between January ’26 and January ’27. Once it is published, the Secretary of State may, if appropriate, make regulations to bring app stores into the scope of the Act. The timing will ensure that Ofcom can prioritise the implementation of child safety duties. I will write to the right hon. and learned Member for Kenilworth and Southam on the issue of proportionality, as I want to ensure that I give him the full details about how that is being interpreted by Ofcom.
We fully share the concerns of hon. Members over small platforms that host incredibly harmful content, such as hate forums. These dark corners of the internet are often deliberately sought out by individuals who are at risk of being radicalised.
If the Government fully support our concerns about small but harmful sites, will the statutory instrument be reworked to bring them back into category 1, as the Act states?
The Government are confident that the duties to tackle illegal content and, where relevant, protect children from harmful content will have a meaningful impact on the small but risky services to which the hon. Gentleman refers. Ofcom has created a dedicated supervision taskforce for small but high-risk services, recognising the need for a bespoke approach to securing compliance. The team will focus on high-priority risks, such as CSAM, suicide and hate offences directed at women and girls. Where services do not engage with Ofcom and where there is evidence of non-compliance, Ofcom will move quickly to enforcement action, starting with illegal harm duties from 17 March, so work is being done on that.
The comprehensive illegal content safety duties will be applied to all user-to-user services, and child safety duties will be applied to all user-to-user services likely to be accessed by children, including the small but high-risk sites. These duties will have the most impact in holding the services to account. Because of the deep concerns about these forums, Ofcom has, as I said, created the small but risky supervision taskforce. For example, Ofcom will be asking an initial set of firms that pose a particular risk, including smaller sites, to disclose their illegal content risk assessment by 31 March.
The Government have been clear that we will act where there is evidence that harm is not being adequately addressed despite the duties being in effect, and we have been clear to Ofcom that it has the Government’s and Parliament’s backing to be bold in the implementation of the Online Safety Act. We are in clear agreement that the Act is not the end of the road, and Ofcom has already committed to iterating on the codes of practice, with the first consultation on further measures being launched this spring. The Government remain open minded as to how we ensure that users are kept safe online, and where we need to act, we will. To do so, we must ensure that the actions we take are carefully considered and rooted in evidence.
Will the consultation this spring for the next iterations of the codes include consultation with parliamentarians, or is it solely with platforms?
I expect any consultation will have to go through the Secretary of State, and I am sure it will be debated and will come to the House for discussion, but I will happily provide my hon. Friend with more detail on that.
I am grateful to all Members for their contributions to the debate. I look forward to working with the right hon. and learned Member for Kenilworth and Southam, and hopefully he can secure the Committee that he has raised.
Can the Minister explain what she meant when she said that Ofcom had to ensure that the codes were as judicial review-proofed as possible? Surely Ofcom’s approach should be to ensure that the codes protect vulnerable users, rather than be judicial review-proofed.
The point I was trying to make was that Ofcom is spending time ensuring that it gets the codes right and can implement them as soon as possible, without being delayed by any potential challenge. To avoid any challenge, it must ensure that it gets the codes right.
General Committees

I beg to move,
That the Committee has considered the draft Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025.
Thank you for coming to save the day, Sir Christopher; it is an honour to serve under your chairmanship. These regulations were laid before Parliament on 16 December 2024. As the Online Safety Act 2023 sets out, the Secretary of State must set thresholds for three categories of service: category 1, category 2A and category 2B. The services that fall into each of those categories will be required to comply with additional duties, with category 1 services having the most duties placed on them. The duties are in addition to the core duties that apply to all user-to-user and search services in scope.
The 2023 Act requires that specific factors must be taken into account by the Secretary of State when deciding thresholds for each category. The threshold conditions for user-to-user services must be set by reference to the number of users and functionalities, as well as any other characteristics or factors relating to the user-to-user part of the service that the Secretary of State deems relevant.
For category 1, the key consideration is the likely impact of the number of users of the user-to-user part of the service and its functionalities, on how quickly, easily and widely regulated user-generated content is disseminated by means of the service. For category 2A, the key consideration is the likely impact of the number of users of the search engine on the level of risk of harm to individuals from search content that is illegal or harmful to children. For category 2B, the key consideration is the likely impact of the number of users of the user-to-user part of the service and its functionalities on the level of risk of harm to individuals from illegal content or content that is harmful to children disseminated by means of the service.
Those conditions form the basis of Ofcom’s independent research and advice, as published in March 2024, which the Secretary of State was required to consider when setting threshold conditions. In laying these regulations before Parliament, the Secretary of State has considered the research carried out and the advice from Ofcom and agreed to its recommendations.
I understand that this decision will not please everyone. In particular, I recognise that the thresholds are unlikely to capture so-called “small but risky services”, as per Baroness Morgan’s successful amendment, which made it possible to create a threshold condition by reference only to functionalities and any other factors or characteristics. However, it is important to note that all regulated user-to-user and search services, no matter their size, will be subject to existing illegal content duties and, where relevant, child safety duties. The categories do not change that fact.
If the codes on illegal content duties currently laid before Parliament pass without objection, the duties will be in effect by this spring. They will force services to put in place systems and processes to tackle illegal content. If a service is likely to be accessed by children, the child safety duties will require services to conduct a child safety risk assessment and provide safety measures for child users. We expect that those will come into effect this summer, on the basis that the codes for the duties will have passed by then.
Together, the illegal content and child safety duties will mark the biggest material change in online safety for UK citizens since the internet era began. We expect the Online Safety Act to cover more than 100,000 services of various sizes, showing that the legislation goes far and wide to ensure important protections for users, particularly children, online.
The instrument before us will enable additional duties for categorised services. All categorised services must comply with transparency reporting duties. They must also have terms on the ability of parents to access information about children’s use of a service in the event of a child’s death. Category 1 services will have the most additional requirements. They will have to give adults more choice about the content they see and the people they interact with, and they must protect journalistic and news publisher content and content of democratic importance. The duties will also ensure that we can hold these companies to account over their terms of service, ensuring that they keep the promises they make to their users.
Once in force, the regulations will enable Ofcom to establish a public register of categorised services, which it expects to publish this summer. Ofcom will then consult on the draft codes of practice and guidance where relevant for additional duties. Ofcom will also do additional work to tackle small but risky services.
Ofcom’s work to tackle egregious content and enhance accountability does not stop with this instrument, which takes me back to the small but risky services that I mentioned. The horrifying stories I have heard about these sites during a number of debates recently are truly heartbreaking; we must do everything in our power to prevent vulnerable people from falling victim to such circumstances. I was pleased to see Ofcom set out in September 2024 its targeted approach to tackling small but risky services, which includes a dedicated supervision taskforce and a commitment to move to rapid enforcement action where necessary. That followed a letter from the Secretary of State to Ofcom inquiring about those services.
I am confident that the regulatory framework, combined with the bespoke taskforce, will work to keep all UK citizens safe online, but I must stress that the Secretary of State will hold the thresholds under review going forward. If there is evidence that the categories have become outdated or that they inadequately protect users, he will not shy away from updating them or reviewing the legislation, as he has made clear recently.
Finally, the online world that we are looking to govern is complex and ever-changing. The Act will not solve every problem, but it will bring real benefit to children and adults who have had to contend with an unsafe online world for far too long. We should see the instruments we are debating as a step in that process and a first iteration, not as something fixed or set in stone, because there is much more to do. Our foremost priority is the timely implementation of the Act to enforce the additional duties as soon as possible. Years of delay and indecision have already come at a heartbreaking cost for vulnerable children and adults. Now it is time to deliver, but that relies on Parliament approving the categorisation thresholds without delay.
I thank all Members for their very powerful contributions to the debate. This instrument will bring us one step closer to a safer online world for our citizens. It is clearer than ever that it is desperately needed: transparency, accountability and user empowerment matter now more than ever.
The Opposition spokesperson, the hon. Member for Huntingdon, asked whether we agree on the need for companies not to wait for the duties in the Act to be implemented, but to ensure that safety is baked in from the start. I absolutely agree, and he will be aware that the Secretary of State has made that point on many occasions. He also raised the issue of proportionality. I confirm that many of the duties on categorised services are subject to the principle of proportionality, which requires Ofcom to consider measures that are technically feasible for providers of a certain size or capacity, and in some cases duties are based on the assessment of risk of harm presented by the service.
For example, in determining what is proportionate for the user empowerment duties on content for category 1 services, the findings of the most recent user empowerment assessments are relevant. They include the incidence of relevant content on the service in addition to the size and capacity of the provider. Where a code of practice is relevant to a duty, Ofcom must have regard to the principles on proportionality, and what is proportionate for one kind of service might not be for another.
The hon. Member for Huntingdon is absolutely right that the pornography review has been completed. The Government are reviewing that at the moment and will publish it in due course.
In response to the hon. Members for Newton Abbot and for Aberdeen North (Kirsty Blackman) and to the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright), when the Online Safety Act was introduced, category 1 thresholds were due to be assessed based on the level of risk and harm for adults—as the Members read out very clearly. That was removed during the passage of the Bill by the previous Government.
As things stand, although Baroness Morgan’s successful amendment made it possible for threshold conditions to be based solely on functionalities, it did not change the basis of Ofcom’s research, which for category 1 is easy, quick and wide dissemination of content. The Secretary of State had to consider that. I will repeat that for all Members to hear again: the Secretary of State has to act within the powers given to him in schedule 11 when setting out the threshold and conditions. The powers do not allow for thresholds to be determined by another body, as per the amendment.
Although the hon. Member for Aberdeen North very powerfully read out the Act, it very clearly sets out that it does not actually do what she is asking for it to do. We absolutely agree that small but risky sites need to be covered, but as it stands, the Secretary of State does not have the powers to include them.
Sorry, I have lots of points to cover. If I have not covered the hon. Member's concerns in my response, she is more than welcome to intervene later.
These small but risky services are of significant concern to the Government, and they will still have to protect against illegal content and, where relevant, content that is harmful to children. Ofcom also has a dedicated taskforce to go after them. I hope that answers the hon. Member’s question.
The hon. Member for Newton Abbot also raised the review of Ofcom’s approach. The regulator has already trialled an approach of targeting small but risky services through its regulation of video-sharing platforms. Indeed, a number of those services improved their policies and content moderation in response. All the adult platforms under the VSP regime, large and small, have implemented age verification through this route to ensure that under-18s cannot access pornography on their services. In instances where services fail to make necessary changes, they will face formal enforcement action from Ofcom. Ofcom has a proven track record and the Government have every faith in its ability to take action against non-compliant services.
The hon. Member also raised issues around how Ofcom will enforce action against small but risky services. Ofcom will have robust enforcement powers available to use against companies that fail to fulfil their duties and it will be able to issue enforcement decisions. Action can include fines of up to £18 million or 10% of qualifying worldwide revenue in the relevant year, whichever is higher, and Ofcom can direct companies to take specific steps to comply with its regulation.
The Minister raised the issue of age verification, which is good. However, she did not say how “harmful to adults”, “harmful to vulnerable minorities” and “harmful to women” are categorised. Children are protected in this case, but those other groups are not.
Also, in response to the answer that the Minister just gave, the difficulty is not the Ofcom powers; it is the obligation on the provider. If we have not put a provider into category 1, it does not have the same level of obligation as category 1 companies do. No matter what powers Ofcom has and no matter what fines it imposes, it cannot get such companies to give those commitments to a category 1 level if they are not in that category.
Removing the section is not giving Ofcom the tools it needs. The Minister was absolutely right earlier when she said that there is much more to do. Why drop this ability to put other sites in category 1?
I think the hon. Member missed it when I said that, as things stand, the Secretary of State does not have the power to include them. It is not a matter of removing them; as things stand, the powers to include them simply do not exist.
I will conclude. In extreme cases, Ofcom, with the agreement of the courts, uses business disruption measures, which are court orders that mean third parties have to withdraw non-compliant services, or restrict or block access to non-compliant services in the UK.
The hon. Member for Newton Abbot also asked whether the Act will be reviewed to address the gaps in it. As I said at the start, our immediate focus is getting the Act implemented quickly and effectively. It was designed to tackle illegal content and protect children, and we want those protections in place as soon as possible. It is right that the Government continually assess the ability of the framework to keep us safe, especially given that technology develops so quickly. We will look, of course, at how effective these protections are and build on the Online Safety Act, based on evidence. However, our message to social media companies remains clear: there is no need to wait. As the Opposition spokesperson said, those companies can and should take immediate action to protect their users.
On the use of business disruption measures, the Act provides Ofcom with powers to apply to court for such measures, as I have said, including where there is continued failure and non-compliance. We expect Ofcom to use all available enforcement mechanisms.
The hon. Member for Huntingdon asked how Parliament can scrutinise the delivery of the legislation. Ongoing parliamentary scrutiny is absolutely crucial; indeed, the Online Safety Act requires Ofcom codes to be laid before Parliament for scrutiny. The Science, Innovation and Technology Committee and the Communications and Digital Committee of the House of Lords will play a vital role in scrutinising the regime. Ofcom’s codes of practice for illegal content duties were laid before Parliament in December. Subject to their passing without objection, we expect them to be in force by spring 2025, and the child safety codes are expected to be laid before Parliament in April, in order to be in effect by summer 2025. Under section 178 of the Act, the Secretary of State is required to review the effectiveness of its regulatory framework between two and five years after key provisions of the Act come into force. That will be published as a report and laid before Parliament.
Letters were sent in advance of laying these regulations to the House of Lords Communications and Digital Committee and the House of Commons Science, Innovation and Technology Committee. Hon. Members have asked about user numbers. Ofcom recommended the thresholds of 34 million or 7 million users for category 1. Services must exceed the user number thresholds. The Government are not in a position to confirm who will be categorised. That will be the statutory role of Ofcom once the regulations have passed.
I am going to make some progress. On livestreaming, Ofcom considered that functionality, but concluded that the key functionalities that spread content easily, quickly and widely are content recommender systems and forwarding or resharing user-generated content.
Services accessed by children must still be safe by design, regardless of whether they are categorised. Small but risky services will also still be required to comply with illegal content duties. The hon. Member for Aberdeen North should be well aware of that as she raised concerns on that issue.
On child safety, there were questions about how online safety protects children from harmful content. The Act requires all services in scope to proactively remove and prevent users from being exposed to priority illegal content, such as illegal suicide content and child sexual exploitation and abuse material. That is already within the remit.
In addition, companies that are likely to be accessed by children will need to take steps to protect children from harmful content and behaviour on their services, including content that is legal but none the less presents a risk of harm to children. The Act designates content that promotes suicide or self-harm as in the category of primary priority content that is harmful to children. Parents and children will also be able to report pro-suicide or pro-self-harm content to the platform and the reporting mechanism will need to be easy to navigate for child users. On 8 May, Ofcom published its draft children’s safety codes of conduct, in which it proposed measures that companies should employ to protect children from suicide and self-harm content, as well as other content.
Finally, on why category 1 is not based on risk, such as the risk of hate speech, when the Act was introduced, category 1 thresholds were due to be assessed on the level of risk of harm to adults from priority content disseminated by means of that service. As I said earlier, that was removed during the Act’s passage by the then Government and replaced with consideration of the likely functionalities and how easily, quickly and widely user-generated content is disseminated, which is a significant change. Although the Government understand that that approach has its critics, who argue that the risk of harm is the most significant factor, that is the position under the Act.
The Minister is making the case that the Secretary of State’s hands are tied by the Act —that it requires stuff in relation to the number of users. Can she tell us in which part of the Act it says that, because it does not say that? If she can tell us where it is in the Act, I am quite willing to sit down and shut up about this point, but it is not in the Act.
The legislation allows the Secretary of State to deviate from Ofcom’s advice and to publish a statement explaining why. However, the core consideration for category 1 under schedule 11 is—I repeat for the third time—how easily, quickly and widely regulated user-generated content is disseminated by means of a service. As a result, for category 1, Ofcom concluded that the content is disseminated with increased breadth as the number of users increases.
The decision to proceed with the threshold combination recommended by Ofcom, rather than discounting user-number thresholds, reflects that any threshold condition created by the Government should consider the factors as set out in the Act, including easy, quick and wide dissemination for category 1, and the evidence base. That is what the Act says. As a result, the Government decided not to proceed with an approach that deviated from Ofcom's recommendation, particularly considering the risk of unintended consequences.
I am more than happy to write to the hon. Member for Aberdeen North with the full details. I understand that she feels very passionately about this point, but the Act is the Act. Although I am grateful for her contribution, I have to follow what the Act says, based on the legal advice that I get.
I am extremely grateful to the Minister for giving way, and I have sympathy with her position, especially in relation to legal advice, having both received it and given it. I suggest that the Minister is talking about two different things, and they need to be separated. The first is the question of whether legal but harmful content was removed from the Bill, which it undoubtedly was. Measures in relation to content that is neither unlawful nor harmful to children were largely removed from the Bill—the Minister is right to say that.
What we are discussing, however, are the tools available to Ofcom to deal with those platforms that it is still concerned about in relation to the remaining content within the ambit of the Bill. The worry of those of us who have spoken in the debate is that the Government are about to remove one of the tools that Ofcom would have had to deal with smaller, high-harm platforms when the harm in question remains in ambit of the Bill—not that which was taken out during its passage. Would the Minister accept that?
I will again set out what the Secretary of State's powers are. The Government have considered the suggestion of Baroness Morgan and others to categorise small but risky services based on the coroner or Ofcom linking a service to a death. The Government were grateful for that suggestion. However, there were issues with that approach, including with what the Act allows the Secretary of State to consider when setting the categories. The Secretary of State is not allowed to consider anything other than the factors set out in the Act, which says that it has to include easy, quick and wide dissemination for category 1, and has to be evidence based.
I hope that the hon. Member for Aberdeen North will accept that I will write to her in great detail, and include a letter from Government lawyers setting out what I am saying in relation to the powers of the Secretary of State in setting the categories. I hope that she will be satisfied with that. I want to make it clear that we are not taking anything out; the Secretary of State is proceeding with the powers that he has been given.
I am going to proceed. I think I have covered the main points raised by hon. Members. I hope that the Committee agrees with me on the importance of enacting these thresholds and implementing the Online Safety Act as swiftly as possible. I made it clear that Ofcom has set up a taskforce that will review the small but risky sites, in response to the Secretary of State’s letter to it in September.
It is an honour to serve under your chairmanship, Sir Christopher. My right hon. and learned Friend the Member for Kenilworth and Southam was Attorney General for four years. It is just possible that his interpretation of the Act is correct, and that of the Minister's officials is incorrect. I do not have detailed knowledge of this legislation, but I wonder whether the Minister and her Whip want to take some further time and pause before putting these regulations to a vote—that would be perfectly acceptable to us. We will not oppose the regulations, but we are conscious that if the Minister wants more time, she is welcome to take it.
Although I thank the hon. Member for his contribution, I am sure that he will appreciate that this issue has been looked into and discussed in debates and with officials. With that, I commend these regulations to the Committee.
The comments made by the hon. Member for Aberdeen North are absolutely outrageous, but I would not expect anything less from the SNP. I have made it very clear that I will share legal advice with Members. I also made it clear that the small but risky sites that Members have been talking about were raised by the Secretary of State in a letter to Ofcom in September, and Ofcom has set up a taskforce to look at those services.
The key thing for the Government is to get on with implementing the Online Safety Act. I know that the hon. Lady would like us to spend lots of time delaying, but we are interested in getting on with implementing the Act so that we can keep children safe online. With that, I commend the regulations to the House.
For the benefit of people watching, only Committee members can cast votes in a Division.
Question put.
(2 months, 1 week ago)
Westminster Hall
It is a pleasure to serve under your chairmanship, Sir Desmond. I start by paying tribute to Ellen Roome both for launching this petition and for all the campaigning she has done in this area. Let us take a moment to remember her son, Jools. As a parent, I know that we do everything to keep our children safe. We teach them how to cross a road and why it matters not to talk to strangers—we do all we can, but it can still be terrifying to think about what our children are exposed to, even in the safety of our own homes. I can only imagine how it would feel for a parent not to know how or why their child lost their life. I know that parents across the country feel the same way.
As we have heard, Ellen’s petition received over 120,000 signatures between 10 May and the dissolution of Parliament on 30 May. That shows the strength of feeling on this issue, and I am grateful to the brave parents, including Ellen, Ian and others who campaigned on this issue during the passage of the Online Safety Act, who continue to shine a light on it. The Secretary of State has met them a number of times, and their views are absolutely crucial to the work we are doing in this area. Finally, I thank my hon. Friend the Member for Sunderland Central (Lewis Atkinson) for securing a debate on this e-petition on behalf of the Petitions Committee, along with other hon. and right hon. Members for their powerful contributions.
I know how long it has taken to get the Online Safety Act across the line. It is not a perfect piece of legislation, and the delay in delivering it has come at a heartbreaking human cost. As the Secretary of State has set out numerous times, we are working to implement the Act as quickly as we possibly can so that the protections it puts in place can begin to change the online world that our children experience.
The Act has two provisions relevant to this debate. First, section 101 seeks to address problems faced when there is uncertainty over the circumstances leading to the death of a child. The provision supports coroners and procurators fiscal in their investigations by giving Ofcom the power to require information about a child’s online activity following a request from the investigating coroner. It is already in force, and the coroners have begun to make use of the powers available to them.
Secondly, section 75 imposes additional duties on categorised services to be transparent with parents regarding a company’s data disclosure processes following the death of a child. We have been clear that we plan to build on the Online Safety Act where it does not go far enough, and the Secretary of State only yesterday set out how the Online Safety Act is uneven and, in some cases, unsatisfactory. He also set out the need for Parliament to learn to legislate much faster—we cannot wait another 10 years to make changes to the legislation.
At the end of last year, the Secretary of State decided to use his powers to issue a statement of strategic priorities to Ofcom, asking them to ensure that safety is embedded in our online world from the very start. That is why the Government will also seek to establish a data preservation process through clause 122 of the Data (Use and Access) Bill. The proposed clause will require Ofcom to issue a data preservation notice to specified companies at the request of the coroner or, in Scotland, the procurator fiscal. That will require these companies to preserve information relating to the use of their services by the child who has died. This proposal fulfils a manifesto commitment to further strengthen powers, and will help coroners understand the tragic circumstances surrounding a child’s death.
Let me turn to the matter of coroners sharing information with families. Interested persons, including bereaved families, have the right to receive evidence from coroners, subject to their judicial discretion. The chief coroner has provided detailed guidance on this. Coroners have a statutory duty to issue a prevention of future deaths report if their investigation reveals that future deaths could be prevented by one or more measures. Evidence accessed via Ofcom powers will help to inform a decision on whether a report should be issued.
I know from parents and children just how complex this issue is. The Secretary of State recently visited the NSPCC, where he met a group of young people to understand more about their lives online. The NSPCC was concerned that giving parents complete access to their children’s social media accounts could raise complex issues around children’s rights to privacy and, in extreme cases—as we have heard today—safeguarding. For example, as raised earlier, if a child is exploring their sexuality online, they may not want their parents to know and they would be right to expect that privacy.
All Members raised the retrospective application of section 101 of the Act. Ofcom’s powers to require information from companies on behalf of coroners can still be used where a second coroner’s inquest is ordered. Ofcom can use these powers on the instruction of a coroner. Ofcom will also be able to use data preservation notices in the event that a second coroner’s inquest is ordered. Any personal data that is captured by the data preservation notice, and held by the online service at the time of issue, will still be in scope and must be retained upon receipt of notice. However, I have heard very powerfully from all Members today about the lengths parents have to go to request a second inquest and about the associated costs. As I have said, the legislation is not perfect and there is room for improvement, and I would like to meet Members and parents to explore this matter further. We need to continue to review the legislation.
When it comes to age limits, a smartphone and social media ban for under-16s has been raised. We are aware of the ongoing debate as to what age children should have smartphones or access to social media. As the Secretary of State for Science, Innovation and Technology has previously said, there are no current plans to implement a smartphone or social media ban for children. We will continue to do what is necessary to keep our children safe online.
On that note, we have heard from several Members today about their concerns for children's mental health, when their expectations are often measured against heavily doctored images they see online. Will the Minister commit to use and/or amend legislation that commits hosts—as is common with regulated news outlets—to clearly identify doctored imagery, and the accounts and pages that spread them?
I will come to that point.
On the issue of a ban on smartphones and social media for under-16s, we are focused on building the evidence base to inform any future action. We have launched a research project looking at the links between social media and children’s wellbeing. I heard from the hon. Member for Esher and Walton (Monica Harding) that that needs to come forward and I will pass that on to my colleagues in the Department.
My hon. Friend the Member for Lowestoft (Jess Asato) mentioned the private Member’s Bill in the name of my hon. Friend the Member for Whitehaven and Workington (Josh MacAlister). We are aware of his Bill and share his commitment to keeping children safe online. We are aware of the ongoing discussion around children’s social media and smartphone use, and it is important that we allocate sufficient time to properly debate the issue. We are focused on implementing the Online Safety Act and building the evidence base to inform any future action. Of course, we look forward to seeing the detail of my hon. Friend’s proposal and the Government will set out their position on that in line with the parliamentary process.
My hon. Friend the Member for Darlington (Lola McEvoy) raised the issue of Ofcom’s ambitions. Ofcom has said that its codes will be iterative, and the Secretary of State’s statement will outline clear objectives for it to require services to improve safety for their users.
The hon. Member for Twickenham (Munira Wilson) and my hon. Friend the Member for Bournemouth West (Jessica Toale) mentioned engagement with children, and we know how important that is. Ofcom engaged with thousands of children when developing its codes, and the Children’s Commissioner is a statutory consultee on those codes, but of course we must do more.
The hon. Member for Huntingdon (Ben Obese-Jecty) raised the matter of mental health services and our commitment in that regard. He is right that the Government’s manifesto commits to rolling out Young Futures hubs. That national network is expected to bring local services together to deliver support for not only teenagers at risk of being drawn into crime, but those facing mental health challenges, and, where appropriate, to deliver universal youth provision. As he rightly said, that is within the health portfolio, but I am happy to write to him with more detail on where the programme is.
We want to empower parents to keep their children safe online. We must also protect children’s right to express themselves freely, and safeguard their dignity and autonomy online.
The Minister spoke earlier about age limits. I was not sure if she had finished responding to Members’ comments and questions, and whether she would be able to comment on not only what the various age thresholds should be, but what they mean. In particular, if the GDPR age is 13, does that mean that parental controls can effectively be switched off by somebody of age 13, 14 or 15?
I am sure the right hon. Gentleman’s party would have discussed the issue of the age limit and why it was 13 during the passage of the Online Safety Act.
I am more than happy to write to him in detail on why the age limit has been set at 13. As I said, there is currently a live discussion about raising the age and evidence is being collated.
The challenge of keeping our children safe in a fast-moving world is one that we all—Government, social media platforms, parents and society at large—share. As we try to find the solutions, we are committed to working together and continuing conversations around access to data in the event of the tragic death of a child.
I will finish by again thanking Ellen for her tireless campaigning. I also thank all the speakers for their thoughtful contributions. I know that Ellen has waited a long time for change and we still have a long way to go. Working with Ellen, the Bereaved Families for Online Safety group, other parents and civil society organisations, we will build a better online world for our children.
(2 months, 2 weeks ago)
Commons Chamber
In case the House has not heard, this Government are driving innovation, with a record £20.4 billion of research and development investment for 2025-26, powering an innovation-led economy across the UK. In Staffordshire, UK Research and Innovation is backing more than £29 million for 70 cutting-edge research and innovation projects. A stand-out example is Innovate UK's support for the Staffordshire net zero skills for growth project, which is equipping the country to seize opportunities in the net zero transition.
Towns such as those in my constituency are key to the economy, but can face unique challenges in accessing innovation opportunities. Please could the Minister tell me how she plans to ensure that towns such as Stafford and Eccleshall are able to access new jobs, skills, investment and growth opportunities?
The Department has a clear vision to ensure that the UK remains at the forefront of global innovation—a place where cutting-edge businesses of all sizes can start and grow, and where local people have high-quality jobs, building on local strengths. I am delighted to hear about the new multimillion-pound facility being built at Newcastle and Stafford colleges’ Stafford campus in my hon. Friend’s constituency, supported by £15 million of Government investment. It will welcome learners from September and will help to provide the technical skills that businesses need, both now and in the future, to support regional and national productivity.
DSIT is leading the charge by establishing the digital centre of Government to harness technology and transform our public services. We are committed to improving digital inclusion and accessibility to ensure high-quality online services that are available to everyone. In the coming months, the Department will outline its plans and priorities for a digital centre and to advance digital inclusion.
Sunderland was recently named the UK’s smartest city by The Times. It was a pleasure to welcome the Secretary of State when he visited recently. More than 5,000 homes in our city now have assistive technology installed, supporting the independence of older and disabled people and improving their access to care. How do the Government plan to build on the example of Sunderland to improve access to public services across the UK?
The Government recognise the potential for digital technology to support people to live independently. We will set new national standards for care technologies and develop trusted guidance so that people who draw on care, their families and care providers can confidently buy what works and get the safest, most effective tech into their homes or services. In addition, we will take forward a range of initiatives in 2025-26, including funding more home adaptations and promoting the better use of care technology.
What steps is her Department taking to help older people who do not feel comfortable utilising technology to access public services?
The hon. Gentleman will be happy to hear that the Public Sector Bodies (Websites and Mobile Applications) (No. 2) Accessibility Regulations 2018 require most public sector organisations to ensure their services are accessible to disabled and older people by meeting the requirements of the web content accessibility guidelines and by publishing an accessibility statement in the prescribed format. The Government Digital Service’s accessibility monitoring team reviews public sector websites to ensure compliance with the accessibility regulations and supports Departments to improve their services.
I welcome the Minister’s approach to improving access through technology. However, the majority of the concerns that colleagues and I receive are from those who cannot use technology. Rather than improving access, for some, technology can act as a barrier. What is her assessment of the impact of digital exclusion in the UK? Will the digital inclusion strategy that she has announced include digital exclusion at all levels of Government?
Digital inclusion is a priority for this Government. We have set up the digital inclusion and skills unit to ensure that everyone has the access, skills, support and confidence to participate in modern digital society, whatever their circumstances. Work is ongoing to develop our approach to digital inclusion and co-ordinate across Departments, and we hope to announce more on that soon. We will work closely with the third sector, the industry, devolved Governments and local authorities to ensure that future interventions are targeted and based on individuals’ needs.
(3 months, 1 week ago)
Commons Chamber
I thank the hon. Member for Leeds East (Richard Burgon) for opening the debate and all other colleagues who have contributed. I know that this issue will be close to the hearts of many of us, because it is about protecting the safety of everyone, including our children and young people.
This evening I want to talk about why this issue matters and what the Online Safety Act will do about it. First, I would like to share my deepest sympathies with the family and friends of Joe Nihill—a 23-year-old man who ended his life after finding suicide-related content online. Unfortunately, stories such as Joe’s are not uncommon—we have heard about Tom, a 22-year-old young man, who also died from suicide. As part of our work on online safety we speak to groups that have campaigned for years for a safer internet, often led by bereaved families. I thank Joe’s mother Catherine, his sister-in-law Melanie and all the bereaved families for their tireless work. We continue to listen to their expertise in this conversation.
People who are thinking about ending their lives or hurting themselves might turn to the internet as a place of refuge. All too often, what they find instead is content encouraging them not to seek help. That deluge of content has a real-world impact. Suicide-related internet use is a factor in around a quarter of deaths by suicide among people aged 10 to 19 in the UK—at least 43 deaths a year. Lots of research in this area focuses on children, but it is important to recognise that suicide-related internet use can be a factor in suicide in all age groups. These harms are real, and tackling them must be a collective effort.
On the hon. Member’s first point, we welcome efforts by all companies, including internet service providers, to tackle illegal content so that no more lives are tragically lost to suicide. Online safety forms a key pillar of the Government’s suicide prevention strategy. However, we are clear that the principal responsibility sits squarely with those who post such hateful content, and the sites where it is allowed to fester—sites that, until now, have not been made to face the consequences. The Online Safety Act has been a long time coming. A decade of delay has come at a tragic human cost, but change is on its way. On Monday, Ofcom published its draft illegal harms codes under the Online Safety Act, which are a step change.
On the hon. Member’s second point, I can confirm that from next spring, for the first time, social media platforms and search engines will have to look proactively for and take down illegal content. These codes will apply to sites big and small. If services do not comply they could be hit by massive fines, or Ofcom could, with the agreement of the courts, use business disruption measures—court orders that mean that third parties have to withdraw their services or restrict or block access to non-compliant services in the UK. We have made intentionally encouraging or assisting suicide a priority offence under the Act. That means that all providers, no matter their size, will have to show that they are taking steps to stop their sites being used for such content.
The strongest protections in the Act’s framework are for children, so on the hon. Member’s third point, I assure him that under the draft child safety codes, any site that allows content that promotes self-harm, eating disorders or suicide will now have to use highly effective age limits to stop children from accessing such content. Some sites will face extra duties. We have laid the draft regulations setting out the threshold conditions for category 1, 2A and 2B services under the Act. Category 1 sites are those that have the ability to spread content easily, quickly and widely. They will have to take down content if it goes against their terms of service, such as posts that could encourage self-harm or eating disorders. They will also have to give adult users the tools to make it less likely they will see content that they do not want to see, or that will alert them to the nature of potentially harmful content.
A suicide forum will be unlikely to have terms of service that restrict legal suicide content, and users of these sites are unlikely to want to use tools that make it less likely they will see such content. However, that absolutely does not mean that such forums—what people call “small but risky” sites—can go unnoticed.
Every site, whether it has five users or 500 million users, will have to proactively remove illegal content, such as content where there is proven intent of encouraging someone to end their life. Ofcom has also set up a “small but risky” supervision taskforce to ensure that smaller forums comply with new measures, and it is ready to take enforcement action if they do not do so. The Government understand that just one person seeing this kind of content could mean one body harmed, one life ended, and one family left grieving.
The problem is that the sites that the hon. Member for Leeds East (Richard Burgon) referred to—and there are many others like them—do not necessarily fall into the illegal category, although they still have extremely dangerous and harmful content. Despite a cross-party vote in Parliament to include in the Online Safety Act these very small and very dangerous sites in category 1, there has been a proactive decision to leave them out of the illegal harms codes, which were published yesterday. Can the Minister put on record exactly why that is? Why can these sites not be included in that category? There is all sorts of content glamourising suicide, self-harm, eating disorders and other hate speech that is being promoted by these small sites. They should be regulated to a high level.
Based on research regarding the likely impact of user numbers and functionalities, category 1 is about easy, quick and wide dissemination of regulated user-generated content. As Melanie Dawes set out in her letter to the Secretary of State in September, Ofcom has established a “small but risky” supervision taskforce, as I mentioned, to manage and enforce compliance among smaller services. It has the power to impose significant penalties and, as I say, to take remedial action against non-compliant services. As the hon. Member for Leeds East mentioned earlier, the Online Safety Act is one of the biggest steps that Government have taken on online safety, but it is imperfect. It is an iterative process, and it will be kept under review.
I thank the hon. Gentleman for raising this matter, and for bringing to our memory Joe Nihill and those like him, who turned to the internet for help and were met with harm. On his final point, on the effective implementation of the Online Safety Act, we will continue to engage with all providers in this space. I am confident that these measures are a big step in making tech companies play their part in wiping out those harms and making the internet a safer place for us all. The hon. Gentleman raised the matter of an outstanding question. I do not know whether he has gone to the wrong Department, but I will commit to looking up that question and ensuring that he receives a response to it.
With that, I thank you, Madam Deputy Speaker, and wish you and the whole House a very happy Christmas.
Question put and agreed to.
(3 months, 3 weeks ago)
Westminster Hall
Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
This information is provided by Parallel Parliament and does not comprise part of the official record
It is a pleasure to serve under your chairmanship, Mr Dowd. I congratulate my hon. Friend the Member for Darlington (Lola McEvoy) on securing this debate. As hon. Members can see, debates in Westminster Hall take a whole different form from debates in the House; they are a lot more informative and collegiate, and Westminster Hall is a much nicer place to debate. I welcome the parents in the Public Gallery and thank them for their commitment and the work they continue to do to make sure that this issue stays on our agenda and continues to be debated. I know they have met my colleagues, and I look forward to meeting them as well.
I am grateful to all hon. Members for the incredibly powerful and informative contributions to today’s debate. As the mother of two young children, I always have online safety on my mind. Every time I am on my phone in front of my children or want to keep them distracted by putting on a YouTube video, it scares me, and the issue is always at the back of my mind. It is important that we as parents always have the safety of our children in mind. My hon. Friend the Member for Rother Valley (Jake Richards) talked about being a parent to really young children while being an MP or candidate. As a mother who had two children in part during the last term, I can assure him that it does get easier. I am happy to exchange some tips.
The growth in the use of phones and social media has been a huge societal change, and one that we as parents and citizens are grappling with. I am grateful to all hon. Members here who are engaging in this debate. The Government are committed to keeping children safe online, and it is crucial that we continue to have conversations about how best to achieve that goal. We live in a digital age, and we know that being online can benefit children of all ages, giving them access to better connections, education, information and entertainment. However, we know that it can also accentuate vulnerabilities and expose children to harmful and age-inappropriate content. We believe that our children should be well-equipped to make the most of the digital opportunities of the future, but we must strike the right balance so that children can access the benefits of being online while we continue to put their safety first.
Last week, the Secretary of State visited NSPCC headquarters to speak to its Voice of Online Youth group. That is just the latest meeting in a programme of engagement undertaken by the Secretary of State and my colleague in the other place, Baroness Maggie Jones. Getting this right has been and will continue to be a long process. Many hon. Members here will remember the battle to get the Online Safety Act passed. Despite the opposition—some Members in this place sought to weaken it—there was cross-party consensus and a lot of support, and so it was passed.
On a number of occasions during the passage of the Online Safety Bill in this House, I raised the story of my constituent Joe Nihill from Leeds, who sadly took his own life after accessing very dangerous suicide-related content. I want to bring to the Minister’s attention that before Ofcom’s new powers are put into practice at some point next year, there is a window where there is a particular onus on internet service providers to take action. The website that my constituent accessed, which encouraged suicide, deterred people from seeking mental health support and livestreamed suicide, has been blocked for people of all ages by Sky and Three. Will the Minister congratulate those two companies for doing that at this stage and encourage all internet service providers to do the same before Ofcom’s new powers are implemented next year?
I thank the hon. Member for making that point and I absolutely welcome that intervention by internet providers. As I will go on to say, internet providers do not have to wait for the Act to be enacted; they can start making such changes now. I absolutely agree with him.
Many colleagues have raised the issue of the adequacy of the Online Safety Act. It is a landmark Act, but it is also imperfect. Ofcom’s need to consult means a long lead-in time; although it is important to get these matters right, that can often feel frustrating. None the less, we are clear that the Government’s priority is Ofcom’s effective implementation of the Act, so that those who use social media, especially children, can benefit from the Act’s wider reach and protections as soon as possible. To that end, the Secretary of State for Science, Innovation and Technology became the first Secretary of State to set out a draft statement of strategic priorities to ensure that safety cannot be an afterthought but must be baked in from the start.
The hon. Member for Strangford (Jim Shannon) raised the issue of suicide and self-harm. Ofcom is in the process of bringing the Online Safety Act’s provisions into effect. Earlier this year, it conducted a consultation on the draft illegal content codes, which cover some of the most harmful types of content, including content about suicide. Child safety codes of practice were also consulted on. We expect the draft illegal content codes to be in effect by spring 2025, with child safety codes following in the summer.
Under the Act, user-to-user and search services will need to assess the risk that they might facilitate illegal content and must put in place measures to manage and mitigate any such risk. In addition, in-scope services likely to be accessed by children will need to protect children from content that is legal but none the less harmful to children, including pornography, bullying and violent content. The Act is clear that user-to-user services that allow the most harmful types of content must use highly effective age-assurance technology to prevent children from accessing it.
Ofcom will be able to use robust enforcement powers against companies that fail to fulfil their duties. Ofcom’s draft codes set out what steps services can take to meet those duties. The proposals mean that user-to-user services that do not ban harmful content should introduce highly effective age checks to prevent children from accessing the entire site or app, or age-restrict those parts of the service that host harmful content. The codes also tackle algorithms that amplify harm and feed harmful material to children, which have been discussed today. Under Ofcom’s proposal, services will have to configure their algorithms to filter out the most harmful types of content from children’s feeds, and reduce the visibility and prominence of other harmful content.
The hon. Member for Aberdeen North (Kirsty Blackman), the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) and others discussed strengthening the codes. Ofcom has been very clear that it will look to strengthen the codes in future iterations. The Government will encourage it to do so as harmful online technology and the evidence base about such technology evolves.
I am short of time, so I will have to proceed.
For example, Ofcom recently announced plans to launch a further consultation on the illegal content duties once the first iteration of those duties is set out in spring next year. That iterative approach enables Ofcom to prioritise getting its initial codes in place as soon as possible while it builds on the foundations set out in that first set of codes.
My hon. Friends the Members for Slough (Mr Dhesi) and for Lowestoft (Jess Asato) and the hon. Member for Aberdeen North raised the issue of violence against girls and women. In line with our safer streets mission, platforms will have new duties to create safer spaces for women and girls. It is a priority of the Online Safety Act for platforms proactively to tackle the most harmful illegal content, which includes offences such as harassment, sexual exploitation, extreme pornography, intimate image abuse, stalking and controlling or coercive behaviour, much of which disproportionately affects women and girls. All services in scope of the Act need to understand the risks facing women and girls from illegal content online and take action to mitigate that.
My hon. Friend the Member for Carlisle (Ms Minns) set out powerfully the issues around child sexual exploitation and abuse. Child sexual abuse is a vile crime that inflicts long-lasting trauma on victims. UK law is crystal clear: the creation, possession and distribution of child sexual abuse images is illegal. The strongest protections in the Online Safety Act are against child sexual abuse and exploitation. Ofcom will have strong powers to direct online platforms and messaging and search services to combat that kind of abuse. It will be able to require platforms to use accredited, proactive technology to tackle CSEA and will have powers to hold senior managers criminally liable if they fail to protect children.
I am running short of time, so I shall make some final remarks. While we remain resolute in our commitment to implementing the Online Safety Act as quickly and effectively as possible, we recognise the importance of these ongoing conversations, and I am grateful to everyone who has contributed to today’s debate. I am grateful to the brave parents who continue to fight for protections for children online and shine a light on these important issues. The Opposition spokesperson, the hon. Member for Runnymede and Weybridge (Dr Spencer), asked a host of questions. I will respond to him in writing, because I do not have time to do so today, and I will place a copy in the Library.
I call Lola McEvoy to briefly respond to the debate.
(4 months ago)
Commons Chamber
We want to boost investment in innovation and enable people in all regions of the UK to benefit from an innovation-led economy. That is why the spending review supports the UK’s research and development ambition, with total Government investment in R&D rising to a record £20.4 billion in 2025-26. That allows us to extend innovation accelerators for another year, which will continue to bolster the west midlands’ high-potential innovation clusters, fund the Midlands Industrial Ceramics Group through the Strength in Places fund, and support the region’s investment zone.
Last month, alongside Richard Parker, the Mayor of the West Midlands, I was lucky enough to join Halesowen college as it opened its new digital and media campus at Trinity Point. Does the Minister agree that excellent institutions such as this are fundamental to supporting innovation across our region, and would she be so kind as to visit us at some point in the near future?
I agree with my hon. Friend that educational institutions are crucial to innovation. Halesowen college is one of five colleges across the region using the further education and innovation fund to support innovation and technical excellence within the local community. Such facilities and expertise will help businesses to develop a skilled workforce and to take advantage of it. I would be delighted to visit Trinity Point if the opportunity arises.
Increasing levels of innovation across the UK are crucial to unlocking growth and solving some of our biggest problems. That is why I was worried to read about the Secretary of State saying that we have to apply “a sense of statecraft” to working with multinational tech companies. Does the Minister agree that what we should be doing is working with such companies as companies, not states, focusing on increasing healthy competition and supporting innovative UK businesses so that they are not left with the choice of being bought up or leaving the UK?
As I have said, increasing productivity right across the UK is fundamental to our mission to kick-start economic growth. Through our industrial strategy and the development of local growth plans, we will build on local strengths to ensure that public and private research and development businesses right across the UK help local places to reach their potential. We are strengthening the relationships with businesses to deliver for British people.
(4 months ago)
General Committees
I beg to move,
That the Committee has considered the draft Communications Act 2003 (Disclosure of Information) Order 2024.
It is a pleasure to serve under your chairmanship, Mr Twigg. I start by welcoming the shadow Minister, the hon. Member for Runnymede and Weybridge, to his place. I look forward to many encounters with him—we will have another tomorrow.
The Online Safety Act 2023 lays the foundations for strong protections for children and adults online, and I thank colleagues for their continued interest in the Act and its implementation. It is critical that the Act is made fully operational as quickly as possible, and the Government are committed to ensuring that its protections are delivered as soon as possible.
The draft order will further support the implementation of the 2023 Act by Ofcom. It concerns Ofcom’s ability, under section 393 of the Communications Act 2003, to share business information with Ministers for the purpose of fulfilling functions under the 2023 Act. It corrects an oversight in the 2023 Act that was identified following its passage.
Section 393 of the Communications Act 2003 contains a general restriction on Ofcom’s disclosing information about particular businesses without their consent. It includes exemptions, including where such a disclosure would facilitate Ofcom’s carrying out its regulatory functions or where it would facilitate other specified persons in carrying out specified functions. However, the section currently does not enable Ofcom to share information with Ministers for the purpose of their fulfilling functions under the Online Safety Act, although the 2003 Act does contain similar information-sharing powers in respect of the Enterprise Act 2002 and the Enterprise and Regulatory Reform Act 2013. That means that were Ofcom to disclose information about businesses to the Secretary of State, it may be in breach of the law.
It is important that a gateway exists for sharing information for these purposes, so that the Secretary of State can carry out key functions of the Online Safety Act, such as setting the fee threshold for the online safety regime in 2025 or carrying out the post-implementation review of the Act required under section 178. The draft order will therefore amend the 2003 Act to allow Ofcom to share information with the Secretary of State and other Ministers strictly for the purpose of fulfilling functions under the Online Safety Act.
There are strong legislative safeguards and limitations on the disclosure of this information, and Ofcom is experienced in handling confidential and sensitive information obtained from the services it regulates. Ofcom must comply with UK data protection law and would need to show that the processing of any personal data was necessary for a lawful purpose. As a public body, Ofcom is also required to act compatibly with the article 8 right to privacy in the European convention on human rights. We will continue to review the Online Safety Act so that Ofcom is able to support the delivery of functions under the Act where appropriate.
I thank the shadow Minister and the Liberal Democrat spokesperson for their comments. The Government are committed to the effective implementation of the Online Safety Act. It is crucial that we remove any barriers to that, as we are doing with this draft order, which will ensure that Ofcom can co-operate and share online safety information with the Secretary of State where it is appropriate to do so, as intended during the Act’s development.
The shadow Minister asked about proportionality. There are safeguards around the sharing of business information. Section 393 of the 2003 Act prohibits the disclosure of information about a particular business that was obtained using powers under certain Acts listed in the section, except with consent or where an exception applies. Ofcom is therefore restricted in disclosing information obtained using its powers to require information and other powers under the Online Safety Act, except where an exception applies. Ofcom is an experienced regulator and understands the importance of maintaining confidentiality. It is also a criminal offence for a person to disclose information in contravention of section 393 of the 2003 Act, including to the Secretary of State.
The Liberal Democrat spokesperson asked about the Online Safety Act’s implementation. On 17 October, Ofcom published an updated road map setting out its implementation plans. Firms will need to start risk-assessing for illegal content by the end of the year, once Ofcom finalises its guidance. The illegal content duties will be fully in effect by spring 2025, and Ofcom can start enforcing against the regime. Firms will have to start risk-assessing for harms to children in spring 2025, and the child safety regime will be fully in effect by summer 2025.
I hope that the Committee agrees with me about the importance of implementing the Online Safety Act and ensuring that it can become fully operational as quickly as possible. I commend the draft order to the Committee.
Question put and agreed to.