UK Democracy: Impact of Digital Platforms Debate
Feryal Clark (Labour, Enfield North)
Commons Chamber

I thank the hon. Member for Lagan Valley (Sorcha Eastwood) for securing the debate. I join her in honouring the memory of our dear colleagues Sir David Amess and Jo Cox. I am grateful to her and to all the other speakers for their incredibly powerful and insightful contributions to the debate.
The Government share the hon. Member’s concerns about the impact that online harassment, intimidation, abuse, misinformation and disinformation have on our democracy. Existing and emerging technologies have led to changes in the information environment and will continue to shape our future, but it is, and will always be, an absolute priority for the UK Government to protect our democracy, and we remain well-prepared to do so with robust systems in place. I was grateful to the hon. Member for sharing her experiences. The House should hear about the online abuse and hate that she has faced. There is no place for that, and I thank her for sharing it.
The Government are committed to combating violence against women and girls. The Online Safety Act requires Ofcom to develop and enforce guidance for tech companies, which aims to ensure that platforms implement measures to reduce harm to women and girls online. The Act imposes legal responsibility on online platforms, including social media platforms, gaming platforms, dating apps and search engines, to protect users from illegal content and material that is harmful to children, and to address issues that disproportionately affect women and girls. Those measures reflect the Government's commitment to creating a safer online environment, acknowledging the unique challenges faced by women and girls in the digital space. In putting that guidance together, Ofcom consulted the Domestic Abuse Commissioner, the Victims' Commissioner and experts in the field.
The effectiveness of those measures depends on their robust implementation and enforcement, which we will monitor closely. As the hon. Member knows, implementation of the Online Safety Act began only in spring this year. While it is a landmark Act, it is not perfect, so the Government will keep it under review, and we will not shy away from strengthening it where required. As I said, the Act is already being implemented. It introduces protections for people from illegal content, such as child sexual abuse and terrorist material, as well as protections for children from harmful material. I make it clear to the House and to all Members who raised this issue that that is not up for negotiation.
The hon. Member also raised the issue of banning smartphones for under-16s. The Government will consider all options in pursuit of children's online safety. However, it is important that the Government take evidence-based action, recognising the need to balance safety with allowing children to use technology positively. I am sure she is also aware that in November last year, the Department announced a study to understand the impact of smartphones and social media on children. The study began in December last year and will run for six months, until May 2025, and I am sure we will report to the House on its findings.
I come back to my right hon. Friend the Member for Oxford East (Anneliese Dodds). I take this opportunity to thank her for all the support she gave me and many of my colleagues when she served on the Front Bench in opposition and when we came into government. I look forward to seeing her on the Front Bench again soon; I hope she does not spend too long on the Back Benches.
My right hon. Friend raised the issue of the unrest last year. During that unrest, the Department worked with major platforms to tackle content contributing to the disorder, which included proactively referring content to platforms, which assessed and acted on it in line with their terms of service. Throughout our engagement, we have been very clear that social media platforms should not wait for the Online Safety Act to come into force: they should be actively removing harmful content.
My right hon. Friend also raised the issue of broader international collaboration on online safety, with which I absolutely agree. International collaboration is absolutely crucial in tackling the global threat of online harms, and we must build consensus around approaches that uphold our democratic values and promote a free, open and secure internet.
As the hon. Member for Runnymede and Weybridge (Dr Spencer) said, since 2022, the Elections Act has protected candidates, campaigners and elected office holders from intimidation, both online and in person. It is an election offence for a person to make or publish, before or during an election, a false statement of fact about a candidate’s personal character or conduct, for the purpose of affecting the return for that candidate at the election, if the person does not believe it to be true. This provides a reasonable check and balance against malicious smear campaigns.
We also have the defending democracy taskforce, which has a mandate to drive forward a whole-Government response to the full range of threats to our democracy. That taskforce reports to the National Security Council and comprises Ministers and senior officials, as well as representatives of law enforcement, the UK intelligence community, the parliamentary authorities and the Electoral Commission. In April 2023, the taskforce set up the joint election security and preparedness unit, or JESP for short, as a permanent function dedicated to protecting UK elections and referendums. It monitors and mitigates risks related to the security of elections, including those posed by artificial intelligence, misinformation and disinformation. JESP stood up an election cell for the 2024 elections, which co-ordinated a wide range of teams across Government to respond to issues as they emerged, including issues to do with protective security, cyber-threats, and misinformation and disinformation.
An election cell has been stood up for the upcoming local elections, and firm steps are being taken to ensure the security of candidates and campaigners, as they were during last year's general election. Candidates were issued with security advice, and guidance was made available on gov.uk about the risks they face, including from AI and disinformation. That guidance brought together expertise from across the security community, including from the police and the National Cyber Security Centre, to help candidates implement quick and effective personal protective measures. I have only recently looked at that guidance, and I recommend that all candidates do the same. There was also an investment of £31 million over the 2024-25 financial year to strengthen protective security measures for MPs, locally elected representatives and candidates.
As reported by the Electoral Commission, last year's UK general election was delivered safely and securely. Certain novel risks, such as AI-generated deepfakes influencing the outcome, did not materialise. However, in that election, there was unacceptable harassment and intimidation directed at candidates—particularly female candidates—and campaigners, especially online. It is clearly vital that everyone, regardless of their gender or race, feels able to participate in public life. The Home Office is reviewing this activity through the defending democracy taskforce.
We need to better understand the trends, motivations and drivers that cause people to harass and intimidate their elected representatives. That includes identifying gaps and vulnerabilities and developing recommendations to strengthen legislative responses, as well as a clear delineation of online versus in-person activity and its impact. That work will be reported to the taskforce, and my Department has contributed to these efforts to tackle online harms and improve online environments. While the primary responsibility for harmful social media content rests with the individuals and groups who create and post it, social media platforms have a responsibility to keep users safe.
I call Sorcha Eastwood to quickly wind up.