Public Bill Committees
Andrew Cooper
Q
Stuart McKean: It is an interesting cultural challenge. You want people to be open and to report incidents that are having an impact, but at the same time, if they report those incidents they might get fined, which could be economically challenging, particularly for a small business. Yes, we want people to be open and to report incidents, but—and this is where the detail comes in—what is the level of detail that needs to be reported, and what is the impact of reporting it? When you report an incident to the regulators, what are they going to do with it? How will they share it, and how will it benefit everybody else? The devil is definitely in the detail, and it is a cultural change that is required.
Sarah Russell (Congleton) (Lab)
Q
Jill Broom: We can assume that it will, because if you are in the supply chain or come within scope, you will have certain responsibilities and you will have to invest, not just in technology but in the skills space as well. How easy it is to do that is probably overestimated a bit; it is quite difficult to find the right skilled people, and that applies across regulators as well as business.
Generally speaking, yes, I think it will be costly, but there are things that could probably help smaller organisations: techUK has called for things such as financial incentives, or potentially tax credits, to help SMEs. That could be applied on a priority basis, with those working within the critical national infrastructure supply chain looked at first.
Dr Sanjana Mehta: If I may expand on that, we have been consulting our members and the wider community, and 58% of our respondents in the UK say that they still have critical and significant skills needs in their organisations. Nearly half of the respondents—47%—say that skills shortages are going to be one of the greatest hurdles in regulatory compliance. That is corroborated by evidence, even in the impact assessment that was done on the previous regulatory regime, where I think nearly half of the operators of essential services said that they do not have the in-house skills to support the regulatory requirements. Sustained investment in skills development is definitely going to require funding. Taking a step back, we need first of all to understand what sort of skills and expertise we have to develop to ensure that implementation of the Bill is successful.
Alison Griffiths
Q
Stuart McKean: I am not an expert on the detail, but I would say that there is currently very little detail in the Bill regarding IT and OT.
Public Bill Committees
The Chair
We have only five minutes left for this session, so if we can have concise questions and answers we might get everyone in.
Sarah Russell (Congleton) (Lab)
Q
Stuart Okin: Essentially, we would not go all the way down the supply chain. First, the operators of essential services are defined very much by the thresholds. Ultimately, they are the first point of responsibility. As for the critical third-party suppliers that have been brought in by the Bill, there will be a small number of those that, for energy, are systemically important for the entire UK, not the smaller entities. So we will hold those to account. On the enforcement side of things, if and when it comes to that, they will be in the same situation as the current operators of essential services are today. We welcome the simplification in the Bill, which brings those suppliers under the same sectoral powers and the same types of fines that we see today. It will not go down to that level of detail. Again, the secondary legislation gives you the ability to define that.
Natalie Black: To keep it brief, we welcome the supply chain being brought into scope, because we are all well aware that the most high-profile recent incidents often emanated from the supply chain. That said, we should be very honest about the complexity of entering this space, for exactly the reasons you have alluded to in terms of volume and scale. We are already using this time to work through what our methodology will be. Engaging with the operators of essential services, who are ultimately the customers of these suppliers, has to be a starting point in terms of who they are most worried about in their supply chain. As Stuart says, you will see some commonality across all our sectors, so the numbers might not be as big as we might at first think, but this is what we need to work through over the coming months.
Ian Hulme: From an ICO perspective, one of the big tasks that we are going to have in understanding the MSP market is what their supply chains look like. We are perhaps a little behind colleagues in other regulators because of the difference in the regulatory regime, but that is one of the tasks that we will have to get to grips with.
Q
Professor John Child: My specialism is in criminal law, so this is a bit of a side-step from a number of the pieces of evidence you have heard so far. Indeed, when it comes to the Bill, I will focus on—and the group I work for focuses on—the potential in complementary pieces of legislation, and particularly the Computer Misuse Act 1990, for criminalisation and the role of criminalisation in this field.
I think that speaks directly to the first question, on effective collaboration. It is important to recognise in this field, where you have hostile actors and threats, that you have a process of potential criminalisation, which is obviously designed to be effective as a barrier. But the reality is that, where you have threats that are difficult to identify and mostly originating overseas, the actual potential for criminalisation and criminal prosecution is slight, and that is borne out in the statistics. The best way of protecting against threats is therefore very much through the use of our cyber-security expertise within the jurisdiction.
When we think about pure numbers—the 70,000-odd private sector cyber-security experts, compared with a matter of hundreds in the public sector, police and others—better collaboration is absolutely vital for effective resilience in the system. Yet what you have at the moment is a piece of legislation, the Computer Misuse Act, that—perfectly sensibly for 1990—took an across-the-board protective criminalisation approach, whereby any unauthorised access becomes a criminal offence, without mechanisms to recognise a role for the private sector, because essentially there was not a private sector doing this kind of work at the time.
When we think about potential collaboration, first and foremost for me—from a criminal law perspective—we should make sure we are not criminalising effective cyber-security. The reality is that, under the current system, where unauthorised access of any kind is a criminal offence, you are routinely criminalising engagement in legitimate cyber-security as a matter of course, across the board. If you are encouraging those cyber-security experts to step back from those kinds of practices—which may make good sense—you are also lessening that level of protection and/or outsourcing to other jurisdictions or other cyber-security firms, with which you do not necessarily have that effective co-operation, reporting and so on. That is my perspective. Yes, you are absolutely right, but we now have mechanisms in place that actively disincentivise that close collaboration and professionalisation.
Sarah Russell
Q
Professor John Child: Yes. It is not the easiest criminal law tale, if you like. If there were a problem of overcriminalisation in the sense of prosecutions, penalisation, high sentences and so on, the solution would be to look at a whole range of options, including prosecutorial discretion, sentencing or whatever it might be, to try to solve that problem. That is not the problem under the status quo. The current problem is purely the original point of criminalisation. Think of an industry carrying out potentially criminalised activity. Even if no one is going to be prosecuted, the chilling effect is that either the work is not done or it is done under the veil of potential criminalisation, which leads to pretty obvious problems in terms of insurance for that kind of industry, the professionalisation of the industry and making sure that reporting mechanisms are accurate.
We have sat through many meetings with the CPS and those within the cyber-security industry who say that the channels of communication—that back and forth of reporting—are vital. However, a necessary step before that communication can happen is the decriminalisation of basic practices. No industry can effectively be told on the one hand, “What you are doing is vital,” but on the other, “It is a criminal offence, and we would like you to document it and report it to us in an itemised fashion over a period of time.” It is just not a realistic relationship to engender.
The cyber-security industry has evolved in a fragmented way both nationally and internationally, and the only way to get those professionalisation and cyber-resilience pay-offs is by recognising that the criminal law is a barrier—not because it is prosecuting or sentencing, but because of its very existence. It does not allow individuals to say, “If, heaven forbid, I were prosecuted, I could explain that what I was doing was nationally important. That is the basis on which I should not be convicted—not because of the good will of a prosecutor.”
Dr Gardner
Q
Professor John Child: I think the Bill does a lot of things quite effectively. It modernises in a sensible way, and it allows for the recognition of changes in the type of threat. This goes back to my criminalisation point. Crucially, it also allows modernisation and flexibility to move through into secondary legislation, rather than relying purely on the maturation of primary legislation.
In terms of board-level responsibility, I cannot speak too authoritatively on the civil law aspects, but drawing on my criminal law background, there is something in that as well. At the moment, the potential for criminalisation applies very much to those gaining unauthorised access to another person’s system. That is the way the criminal law works. We also have potential for corporate liability that can lead all the way up to boardrooms, but only if you have a directing mind—only if a board member is directing that specific activity, which is unlikely, apart from in very small companies.
You can have a legal regime that says, whether through accreditation or simple public interest offences, that there are certain activities that involve unauthorised access to another person’s system, which may be legitimate or indeed necessary. However, we want a professional culture within that; we do not want that outsourced to individuals around the world. You can then build in sensible corporate liability based on consent or connivance, which goes to individuals in the boardroom, or a failure-to-prevent model of criminalisation, which is more popular when it comes to financial crimes. That is where you say, “If this exists in your sector, as an industry and as a company, you can be potentially liable as an entity if you do not make sure these powers are used responsibly, and if you essentially outsource to individuals in order to avoid personal liabilities”.
Commons Chamber
The hon. Lady has got this wrong, but her party wants to scrap the Online Safety Act 2023, and that says everything about Reform.
Sarah Russell (Congleton) (Lab)
The Children’s Commissioner spoke to a group of 15 and 16-year-olds in 2024 and found that three quarters of them had been sent a beheading video. It is possible that a great number of children are protecting us from what they see online, instead of us protecting them. Can I emphasise strongly the importance of speaking to a large range of children from different backgrounds about this? Sadly, they do not always feel able to make us aware of everything that they are exposed to online.
My hon. Friend raises a really important issue, which is making sure that young people trust us and feel confident in raising these matters. It is our job to make sure that nobody is frightened to say what is happening to them. We will not get this right unless we talk to people of all ages and from all backgrounds, in all parts of the country. Hon. Members know that they have a vital job to play in their constituency. As Secretary of State, I am responsible for the entire United Kingdom, so I urge hon. Members, for all the politics and show in this House, to engage locally, because then we will get this right.
Commons Chamber
Other countries have different legislative systems. I believe that our Online Safety Act 2023, along with the other measures that I have mentioned, is one of the most comprehensive ways of addressing this issue. The hon. Lady is right to speak of the need for speedy and swift action, and that point has been made time and again in the House, but the Government’s determination to tackle violence against women and girls comes from the top down and goes right across every Department.
I should have said earlier that the Minister for Digital Government and Data, who is a joint Minister in DSIT and in the Department for Culture, Media and Sport, is looking at the issue of advertising, including the monetisation of some of these behaviours. “Follow the money” is a really important issue, and we want to address it.
Sarah Russell (Congleton) (Lab)
The overwhelming majority of child sexual abuse imagery produced online is still, very sadly, produced by children themselves, who have been groomed by adults into doing so. What steps will the Government take to ensure that there are device-level protections to prevent children from taking and sharing nude images of themselves?
My hon. Friend has raised a really important issue, which I am happy to discuss with her further. What she says is exactly what is happening in this country.
I know that many Members have not had a chance to ask a question, but I will find a way to enable them to ask that question, and I will secure a response through the Department—including my parliamentary private secretaries—because I know how passionately all Members care about this issue, and I want to continue the debate.
Westminster Hall
Sarah Russell (Congleton) (Lab)
It is a pleasure to serve with you in the Chair, Ms Butler. I thank the hon. Member for Dewsbury and Batley (Iqbal Mohamed) for securing this debate.
There are two problems—maybe three—with AI. The first is that we do not distinguish very well between what is and is not AI. Although AI and tech are obviously related, they are not the same thing. It is important that when we talk about AI we distinguish it from tech. There is a need to regulate a lot of tech much better than we currently do, but AI poses very specific problems. The first of those—I can see people from ControlAI in the Public Gallery—is the fact that we do not fully understand the models.
It worries any sensible-thinking person that we are unleashing technologies that appear to be able to self-replicate and do other things, and we are incorporating them into military hardware without a full understanding of how they work. We do not have to be a catastrophist or conspiracy theorist to be worried. I am generally a very optimistic person, but it is important to be optimistic on the basis of understanding the technology that we use and then regulating it appropriately. That does not mean stifling innovation, but it does mean making sure we know what we are doing.
When I look at AI, we have, as I said, two problems. One is rubbish in, rubbish out, and there is a lot of rubbish going into AI at the moment. We can see that in all sorts of terrible situations. We have a huge amount of in-built gender bias in our society. That means that, for instance, if we ask AI to generate a picture of a female solicitor, as I am, we will get a picture of a woman who is barely clothed but has a library of books behind her. That is not how the female solicitors I know go to work, but that is how AI thinks we are, and that has real-world impacts.
If we ask AI to suggest an hourly rate as a freelancer, it on average suggests significantly lower rates for women than for men. There are questions about algorithmic bias permeating the whole of the algorithm. Questions have been raised recently about LinkedIn. I and a lot of women I know are finding that we have significantly less interaction via LinkedIn than we used to. Various women have now changed the gender on their bios to male and suddenly find that their engagement levels go straight back up. LinkedIn appears to think that we are not interesting and that people will not want to read our content, so it would appear to have stopped showing women’s content at the same rate. I should caveat that I have not been able to speak to LinkedIn directly, but certainly a lot of women I know are reporting these problems.
We put in the bio stuff to start with, but huge amounts of the image training data are based on what is publicly available on the internet, and the image training data of women on the internet is largely pornographic, which influences what comes out the other end of these models. When we look at that in terms of children, we have real problems. Nudification apps are huge and need to be dealt with. I would like to get into how worried I am about that, and into health—we do not have good enough training data on the interaction between gender and health—and various other matters, but I will stop now. I thank everyone for their time today. I know colleagues will pick up important points.
Victoria Collins (Harpenden and Berkhamsted) (LD)
It is a pleasure to serve under your chairmanship, Ms Butler. I congratulate the hon. Member for Dewsbury and Batley (Iqbal Mohamed) on securing this incredible debate. That so many issues have been packed into 90 minutes shows clearly that we need more time to debate this subject, and I think it falls to the Government to recognise that an AI Bill, or at least further discussion, is clearly needed. The issue now pervades our lives, for the better but in many respects for the worse.
As the Liberal Democrat spokesperson on science, innovation and technology, I am very excited about the positive implications of AI. It can clearly help grow our economy, solve the big problems and help us improve our productivity. However, it is clear from the debate that it comes with many risks that have nothing to do with growing our economy—certainly not the kind of economy we want to grow—including the use of generative AI for child sexual abuse material, children’s growing emotional dependency on chatbots, and the provision of suicide advice.
I have said for a long time that the trust element is so important. It is two sides of the same coin: if we cannot trust this technology, we cannot develop as a society, but trust is also really important for business and our economy. I find it fascinating that so many more businesses are now talking about this and saying, “If we can’t trust this technology, we can’t use it, we can’t spend money on it and we can’t adopt it.” Trust is essential.
If the UK acts fast and gets this right, we have a unique opportunity to be the leader on this. From talking to industry, I know that we have incredible talent and are great at innovating, but we also have a fantastic system for building trust. We need to take that opportunity. It is the right thing to do, and I believe we are the only country in the world that can really do it, but we have to act now.
Sarah Russell
Does the hon. Lady agree that we should be looking hard at the EU’s regulation in this area, and considering alignment and whether there might be points on which we would like to go further?
Victoria Collins
Absolutely, and the point about global co-operation has been made clearly across the Chamber today. The hon. Member for Leicester South (Shockat Adam) talked about what is now the AI Security Institute—it was the AI Safety Institute—and that point about leading and trust is really important. Indeed, I want to talk a little more about safety, because security and safety are slightly different. I see safety as consumer facing, but security is so important. Renaming the AI Safety Institute as the AI Security Institute, as the hon. Member mentioned, undermines the importance of both.
The first point is about AI psychosis and chatbots—this has been covered a lot today, and it is incredibly worrying. My understanding is that the problem of emotional dependency on AI chatbots is not covered by the Online Safety Act. Yes, elements of chatbots are covered—search functionality and user to user, for example—but Ofcom itself has said that there are certain harms from AI chatbots, which we can talk about, that are not covered. We have heard that 1.2 million users a week are talking to ChatGPT about suicide—we heard the example of Adam, who took his own life in the US after talking to a chatbot—and two thirds of 23 to 34-year-olds are turning to chatbots for their mental health. These are real harms.
Of course, the misinformation coming through chatbots also has to be looked at seriously. The hon. Member for York Outer (Mr Charters) mentioned the facts and the advice coming through. We can achieve powerful outcomes, but we need to make sure that chatbots are built in a way that ensures that advisory element, perhaps by linking with the NHS or other proper sources of advice.
The hon. Member for Milton Keynes Central (Emily Darlington), who has been very passionate about this issue, mentioned the Molly Rose Foundation, which is doing incredible work to expose the harms coming through this black hole. Many do not see those harms, which have an impact on children that parents do not understand, as well as on adults.
The harm of deepfakes, including horrific CSAM and sexual material involving people of all ages, has also been mentioned, and it is also impacting our economy. Just recently, a deepfake was unfortunately made of the hon. Member for Mid Norfolk (George Freeman). The Sky journalist Yalda Hakim was also the victim of a deepfake. She mentioned her worry that it was shared thousands of times, but also picked up by media in the subcontinent. These things are coming through, and no one who watches them can tell the difference. It is extremely worrying.
As the hon. Member for Congleton (Sarah Russell) said, “Rubbish in, rubbish out.” What is worrying is that, as the Internet Watch Foundation has said, because a lot of the rubbish going in is online sexual content that has been scraped, that is what is coming out.
Then there is AI slop, as the right hon. Member for Oxford East (Anneliese Dodds) mentioned. Some of that is extreme content, but what worries me is that, as many may know, our internet is now full of AI slop—images, stories and videos—where users just cannot tell the difference. I do not know about others, but I often look at something and think, “Ah, that’s really cute. Oh no—that is not real.” What is really insidious is that this is breaking down trust. We cannot tell any more what is real and what is not, and that affects trust in our institutions, our news and our democracy. What we say here today can be changed. Small changes are breaking down trust, and it is really important that that stops. What is the Minister doing about AI labelling and watermarking, to make sure we can trust what we see? That is just one small part of it.
The other thing, which my hon. Friend the Member for Newton Abbot (Martin Wrigley) mentioned, is that AI often magnifies what is already a threat, whether that is online fraud or a security threat. I believe that AI scams cost Brits £1 billion in just the first three months of this year, and one third of UK businesses said that they had been victims of AI fraud in the first quarter. And I have not got on to what the hon. Member for Dewsbury and Batley said about moving towards AI in security and defence, and superintelligence. Which of the supposedly “exaggerated” threats will actually become extremely threatening? What are the Government doing to clamp down on these threats, and what are they doing on AI fraud and online safety?
Another issue is global working. One of the Liberal Democrats’ calls is for an AI safety agency, which could be headquartered in the UK; we could take the lead on it. I think that is in line with what the hon. Member for Dewsbury and Batley was talking about. We have this opportunity; we need to take it seriously, and we could be a leader on that.
I will close by reiterating the incredible work that AI could do. We all know that it could solve the biggest problems of tomorrow, and it could improve our wellbeing and productivity, but the threats and risks are there. We have to manage them now, and make sure that trust is built on both sides.
The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Kanishka Narayan)
It is a pleasure to serve with you in the Chair, Ms Butler, for my first Westminster Hall debate. It is a particular pleasure not only to have you bring your technological expertise to the Chair, but for the hon. Member for Strangford (Jim Shannon) to be reliably present in my first debate, as well as the UK’s—perhaps the world’s—first AI MP, my hon. Friend the Member for Leeds South West and Morley (Mark Sewards). It is a distinct pleasure to serve with everyone present and the expertise they bring. I thank the hon. Member for Dewsbury and Batley (Iqbal Mohamed) for securing this debate on AI safety. I am grateful to him and to all Members for their very thoughtful contributions to the debate.
It is no exaggeration to say that the future of our country and our prosperity will be led by science, technology and AI. That is exactly why, in response to the question on growth posed by the hon. Member for Runnymede and Weybridge (Dr Spencer), we recently announced a package of new reforms and investments to use AI to power national renewal. We will drive growth through developing new AI growth zones across north and south Wales, Oxfordshire and the north-east, creating opportunities for innovation by expanding access to compute for British researchers and scientists.
We are investing in AI to drive breakthroughs in developing new drugs, cures and treatments. But we cannot harness those opportunities without ensuring that AI is safe for the British public and businesses, nor without agency over its development. I was grateful for the points made by my hon. Friend the Member for Milton Keynes Central (Emily Darlington) on the importance of standards and the hon. Member for Harpenden and Berkhamsted (Victoria Collins) about the importance of trust.
That is why the Government are determined to make the UK one of the best places to start a business, to scale up and to stay on our shores, especially for the UK AI assurance and standards market. Our trusted third-party AI assurance roadmap and AI assurance innovation fund are focused on supporting the growth of UK businesses and organisations providing innovative AI products that are proven to be safe for sale and use. We must ensure that the AI transformation happens not to the UK but with and through the UK.
Consistent with the points raised by my hon. Friend the Member for Milton Keynes Central, that is why we are backing the sovereign AI unit, with almost £500 million in investment, to help build and scale AI capabilities on British shores that will reflect our country’s needs, values and laws. Our approach to those AI laws seeks to ensure that we balance growth and safety, and that we remain adaptable in the face of inevitable AI change.
On growth, I am glad to hear the points made by my hon. Friend the Member for Leeds South West and Morley about a space for businesses to experiment. We have announced proposals for an AI growth lab that will support responsible AI innovation by making targeted regulatory modifications under robust safeguards. That will help drive trust by providing a tightly defined safe space for experimentation and the trialling of innovative products and services. Regulators will monitor that very closely.
On safety, we understand that AI is a general-purpose technology with a wide range of applications. In recognition of the contribution from the hon. Member for Newton Abbot (Martin Wrigley), I reaffirm some of the points he made about being thoughtful in regulatory approaches and distinguishing between the technology and its specific use cases. That is why we believe that the vast majority of AI should be regulated at the point of use, where the risk arises and tractable action is most feasible.
A range of existing rules already applies to those AI systems in application contexts. Data protection and equality legislation protect the UK public’s data rights. They prevent AI-driven discrimination where the systems decide, for example, who is offered a job or credit. Competition law helps shield markets from AI uses that could distort them, including algorithmic collusion to set unfair prices.
Sarah Russell
As a specialist equality lawyer, I am not currently aware of any cases in the UK around the kind of algorithmic bias that I am talking about. I would be delighted to see some, and delighted to see the Minister encouraging that, but I am not sure that the regulatory framework would achieve that at present.
Westminster Hall
Mrs Sarah Russell (Congleton) (Lab)
It is an honour to serve under your chairmanship, Mr Stringer. I did a lot of research in preparation for my speech today and, as a parent of three primary-age children, what I found really alarmed me. The National Society for the Prevention of Cruelty to Children reports that there were more than 7,000 offences of sexual communication with children last year, a significant increase on the year before. It says that in those offences, the perpetrators typically start to talk to children on fairly mainstream web services, and then encourage them to communicate instead on more private messaging services such as Snapchat, WhatsApp and Instagram. I was pretty shocked. I did not appreciate that this was such a widespread problem. We all know that if 7,000 offences were reported to the police, a considerably larger number will have happened. I also discovered the prevalence of dating app use among children. Children experience terrible offences when they go to meet people who turn out, in fact, to be adults preying on them.
Fundamentally, we need to understand that when we talk about social media, children are a product. If anything that we use on the internet does not cost any money, the gain for the provider is access to our thoughts, feelings and communications—in this case, our children’s thoughts, feelings and communications with their friends. We have a generation now for whose entire lifespan those thoughts, feelings and communications with friends can be monetised and tracked across multiple different websites or social media apps. The complex picture that those companies have of our children is incredibly sophisticated, and their ability to target content at them is like nothing we have ever even imagined.
There is also a problem with parents inadvertently facilitating some of this, and I would count myself within that description to some extent, so this is certainly not a judgmental point. When a parent naively decides that a 13-year-old can access something they would broadly consider uncontroversial—such as WhatsApp, so the child can chat to their friends—that creates an ageing-up risk throughout the lifespan of that app use. As was mentioned previously, children subsequently appear to be 16 or 18 before they actually are, and therefore obtain access to services that are unsafe for them much younger than they otherwise would have done. The parents do not appreciate the ageing-up risk that they are creating, potentially several years down the line.
The NSPCC says that we have a fundamental problem. We now have the Online Safety Act, introduced by the Conservatives, and we are working hard as a Government to bring it into force. Ofcom has been given a significant role in scrutinising child risk assessments by online providers. We all know that if those providers had children’s best interests at heart, they would already have done a lot of the things that Ofcom requires. The fact that Ofcom is having to investigate OnlyFans and its ability, or willingness, to prevent under-age children from seeing sexualised content does not sit comfortably—that is the minimum I will say about it.
[Martin Vickers in the Chair]
If I am honest, I am not quite sure what the right solution is to those problems. If we do not get societal consensus on the right solution, we will, for instance, carry on seeing parents helping children to circumvent age restrictions, and children using VPNs to circumvent them themselves. Plenty of teenagers are sophisticated enough to do that. I am not sure what the right answer is. I am not sure that preventing under-16s from accessing such content will solve it. There is a risk that it will create a false sense of security and enable providers of the facilities and apps to say, “Well, under-16s can’t use it. We don’t have to put any safety features in because children are not allowed it anyway.” They will completely abdicate responsibility.
It is important that we keep talking about these issues, and that we move forward on a cross-party basis. These are sophisticated problems, and I am not sure that we yet have a sufficiently sophisticated response to them. The Online Safety Act provides us with a lot of tools, and its potential fines of 10% of global revenue are quite high. That has the potential to drive some behaviour change, provided the companies involved really see that the tools have teeth. I hope that we will monitor closely how Ofcom gets on with the new legislation; I am sure that Members of all parties will be interested in that.
My hon. and learned Friend the Member for Folkestone and Hythe (Tony Vaughan) said that he spoke to his children before the debate to tell them that he was going to raise these issues. I did so with my children over breakfast this morning, and one of them berated me for not having been in her online safety assembly. We have to be realistic about the capacity of both parents and schools to manage these issues without making it a blame game between different organisations—parents versus schools versus major corporations. These corporations have a huge vested interest in exploiting our children, and we have to figure out how better to protect them.