Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
Tom Collins (Worcester) (Lab)
It is a pleasure to serve under your chairship, Mr Pritchard.
At its birth, the internet was envisaged as a great advancement in a free society: decentralised, crowdsourced and open, it would share knowledge across humanity. As it grew, every one of us would own a platform and our voice. Of course, since then bandwidth has increased massively, which means that we now experience a rich variety of media. Storage and compute have increased by many orders of magnitude, which has created the power of big data, and generative capabilities have emerged quite recently, creating a whole new virtual world. Services no longer simply route us to what we were searching for but offer us personalised menus of rich media, some from human sources and some generated to entertain or meet demands.
We are now just starting to recognise the alarming trends that we are discussing today. Such rich media and content have become increasingly harmful. That compute, storage and big data power is being used to collect, predict and influence our most private values, preferences and behaviours. Generative AI is immersing us in a world of reconstituted news, custom facts and bots posing as people. It increasingly feels like a platform now owns every one of us and our voice.
Harms are dangerously impacting our young people. Research from the Centre for Countering Digital Hate illustrates some of the problems. On YouTube, the “Next Video” algorithm was found to be recommending eating disorder content to the account of a UK-based 13-year-old female. In just a few minutes, the account was exposed to material promoting anorexia and weight loss, and more than half the other recommended videos were for content on eating disorders or weight loss.
On TikTok, new teen accounts were found to have been recommended self-harm and eating disorder content within minutes of scrolling the “For You” feed. Suicide content appeared within two and a half minutes, and eating disorder content within eight. Accounts created with phrases such as “lose weight” received three times as many of these videos as standard teen accounts, and 12 times as many self-harm videos. Those are not isolated incidents, and they show the scale and speed at which harmful material can spiral into exponential immersion in worlds of danger for young people.
On X, formerly known as Twitter—a trigger warning for anybody who has been affected by the absolutely appalling Bondi Beach Hanukkah attack—following the Manchester synagogue attack, violent antisemitic messages celebrating the attack and calling for further violence were posted and left live for at least a week. ChatGPT has been shown to produce dangerous advice within minutes of account creation, including guidance on self-harm, restrictive diets and substance misuse.
I am grateful to hon. Friends for raising the topic of pornography. I had the immense privilege of being at an event with a room full of men who spoke openly and vulnerably about their experiences with pornography: how it affected their sex lives, their intimacy with their partners or wives, their dynamics of power and respect, and how it infused all their relationships in daily life. They said things such as, “We want to see it, but we don’t want to want to see it.” If adult men—it seems from this experience, at least, perhaps the majority of adult men—are finding it that hard to deal with, how can we begin to comprehend the impact it is having on our children who come across it accidentally?
This can all feel too big to deal with—too big to tackle. It feels immense and almost impossible to comprehend and address. Yet, to some, the Online Safety Act feels like a sledgehammer cracking a nut. I would say it is a sledgehammer cracking a deeply poisonous pill in a veritable chemistry lab of other psychoactive substances that the sledgehammer completely misses and will always be too slow and inaccurate to hit. We must keep it, but we must do better.
As an engineer, I am very aware that since the industrial revolution, when physical machines suddenly became immensely more powerful and complex, a whole world of not just regulations but technical standards has been built. It infuses our daily lives, and we can barely touch an object in this room that has not been built and verified to some sort of standard—a British, European or global ISO standard—for safety. We should be ready to reflect that model in the digital world. A product can be safe or unsafe. We can validate it to be safe, design it to be safe, and set criteria that let us prove it—we have shown that in our physical world since the industrial revolution. So how do we now begin to put away the big, blunt instrument of regulation when the problem seems so big and insurmountable?
John Slinger (Rugby) (Lab)
Ofcom officials came before the Speaker’s Conference, of which I am a member, so I declare that interest. They spoke about section 100 of the Act, which gives Ofcom the power to request certain types of information on how, for example, the companies’ recommender algorithms work. Unfortunately, they said that could be “complicated and challenging to do”, but one thing they spoke about very convincingly was that they want to require—in fact, they can require—those companies to put information, particularly about the algorithms, in the public domain to help researchers. That could really help with the point my hon. Friend is making about creating regulations that improve safety for our population.