Westminster Hall
Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
This information is provided by Parallel Parliament and does not comprise part of the official record
Lewis Atkinson
I agree that there is significant work to be done to effectively implement the OSA. I will touch on that, and the Minister may wish to do so in his response.
Crucially, the report by the Children’s Commissioner found that children were most likely to see pornography by accident—a key point that some of the criticism of the Act fails to grasp. The horrifying statistics, showing the scale of online harm to children that the OSA is working to reduce, make it obvious why in a recent survey 69% of the public backed the introduction of age verification checks on platforms, and why children’s charities and children’s rights organisations overwhelmingly back the OSA and—to my hon. Friend’s point—want it implemented more rapidly and robustly.
I have heard that some petition signatories are particularly concerned about age verification on platforms, such as X, Reddit or Discord, beyond those specifically designed as pornography sites. However, the report by the Children’s Commissioner shows that eight out of 10 of the main sources where children saw pornography were not porn sites; they were social media or networking sites. Those platforms that choose to allow their users to upload pornographic content—some do not—should be subject to the same age-verification requirements as porn sites in order to keep our children safe.
Following the implementation of those provisions of the Online Safety Act, it was reported that UK traffic to the most popular pornographic websites was notably down. Yes, it was initially reported that there had been a spike in the number of virtual private networks, or VPNs, being downloaded for access to those sites, but research increasingly suggests it is likely that that trend was being driven by adults worried about their anonymity, rather than by children seeking to circumvent the age limitations.
The Online Safety Act addresses harms beyond those done by porn. Content that is especially harmful to children and that children should not have access to includes very violent content and content encouraging limited eating or suicide.
Amanda Hack (North West Leicestershire) (Lab)
Looking at those algorithms is a really important part of the Online Safety Act. When I was a county councillor looking at public health, I did a piece of work on disordered eating, and I was bombarded with content. I am not a vulnerable young person or a vulnerable adult, but my real fear is that that information is seen by people who are not as capable of managing that content. Does my hon. Friend agree that algorithm work is a key part of the Online Safety Act?
Lewis Atkinson
My hon. Friend is right. The proactive duty that the Act places on providers in relation to the nature of their algorithms and their content is crucial because of the type of content to which she refers. It is right that the largest providers, and those most frequently used by kids, have to take active responsibility for keeping children safe. The implementation of the OSA means that algorithms serving harmful content to kids are now being regulated for the first time. There is a long way to go, and I am sure that other Members will say more than I can in this introduction, but I want to be clear to my constituents that I support the action that the OSA is prompting to improve children’s safety and welfare online.
Various surveys set out the impact of the Online Safety Act; Ofcom is publishing its research and a formal Government review will follow in due course. However, most impactful for me was seeing a teenage boy say in a recent news piece that, now,
“when I’m scrolling TikTok, I’m free from violence.”
That changed for him in the months following the implementation of the Online Safety Act, so it is no wonder that organisations such as the Online Safety Act Network, which I spoke to in preparation for this debate, fully support the Act’s principles. The network points to early evidence that the Act is actively reducing harm to children and emphasised that Ofcom must move beyond content filters to ensure safety by design, which would, for example, include addressing features that incentivise pile-ons, in which an individual is targeted with abuse and harassment.
New Ofcom research shows that 58% of parents now believe that measures in the code of practice are beginning to improve the safety of children online. My belief is that we should be considering not whether to repeal the Act, but how we can continue to enforce it in a robust, effective and proportionate manner.
The way in which the Online Safety Act addresses online hate has perhaps not had as much focus as it might have. As well as being a member of the Petitions Committee, I am privileged to be a member of the Home Affairs Committee, which is conducting an inquiry into combating new forms of extremism. It is very clear from the public evidence that we have received so far that, left unregulated and unchallenged, online spaces and services can be used to amplify hate, thus risking a rise in extremist action, including violence.
Analysis by the Antisemitism Policy Trust highlights that there are patterns of co-ordinated and persistent misogynistic, anti-immigrant, anti-Government and antisemitic discourse on social media, with bot accounts being repeatedly used to amplify misleading or harmful narratives that fuel hate and may increase the risk of violence. Such content often breaches platforms’ own terms of service, but under the Online Safety Act, I understand that Ofcom category 1 services will now be mandated to proactively offer users optional tools to help them to reduce the likelihood that they will encounter legal but harmful content such as that.
There is much to be done to implement those provisions in an appropriate manner. However, I invite anyone calling for full repeal of the Act to consider how we as a society deal with the rise of extremism, in a context where the internet can be used as a sort of free-for-all fuelled by hate-filled algorithms that thrive on and incentivise division and hatred, rather than consensus and civic peace.
I am aware that there are large parts of the Online Safety Act that I have not been able to touch on today; I hope that others will do so during the debate. There are questions about end-to-end encryption, cyber-flashing, the creation of abusive deepfakes, AI moderation and chatbots.
Commons Chamber
Amanda Hack (North West Leicestershire) (Lab)
My constituent, James Hares, was one of the brave pilots in the same unit. Despite the unit having a death rate of about 48%—one of the worst of the war—he survived, only to sadly pass away on the journey home. We have heard from so many colleagues already how those stories are largely untold and how many people did crucial work in helping to win the war—
Order. If Members are going to get in during this debate, interventions need to be interventions and not mini-speeches.