Westminster Hall
Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
Tom Collins
I thank my hon. Friend for his remark. He is entirely right. In my own experience of engineering safety-critical products, it was incumbent upon us to be fully open about everything we had done with those regulating and certifying our products for approval. We had numerous patents on our technology, which was new and emerging and had immense potential and value, yet we were utterly open with those notified bodies to ensure that our products were safe.
Similarly, I was fortunate enough to be able to convene industry to share the key safety insights that we were discovering early on to make sure that no mistake was ever repeated, and that the whole industry was able to innovate and develop in a safe way. I thank my hon. Friend the Member for Rugby (John Slinger) for his comments, and I strongly agree that there is no excuse for a lack of openness when it comes to safety.
How do we move forward? The first step is to start breaking down the problem. I have found it helpful to describe it in four broad categories. First, there are hazards that apply to the individual simply through exposure: content such as pornography, violence and images of or about abuse. Then there are hazards that apply to the individual by virtue of interaction, such as addictive user interfaces or personified GPTs. We cannot begin to comprehend the potential psychological harms that could come to human beings when we start to promote attachment with machines. There is no way we can have evidence to inform how safe or harmful that would be, but I suggest that all the knowledge that exists in the psychology and psychiatric communities would probably point to it being extremely risky and dangerous.
We have discussed recommendation algorithms at length. There are also societal harms that affect us collectively by exposure, such as misinformation and echo chambers. The echo chambers of opinion have now expanded to become echo chambers of reality, in which people’s worldviews are increasingly informed by what they see in those spaces, which are highly customised to their existing biases.
Tom Hayes (Bournemouth East) (Lab)
I have met constituents to understand their concerns and ambitions in relation to online safety legislation. There is a clear need to balance the protection of vulnerable users against serious online harms with the need to protect lawful speech as we pragmatically review and implement the Act.
My hon. Friend talks about equipping our younger people, in particular, with the skills to scrutinise what is real or fake. Does he agree that, although we have online safety within the national curriculum, we need to support our teachers to provide consistent teaching in schools across our country so that our children have the skills to think critically about online safety, in the same way as they do about road safety, relationships or consent? [Interruption.]
Before we continue, could I ask that everybody has their phone on silent, please?