(6 months ago)
Commons Chamber
I completely agree with my right hon. Friend. We recognise the risks and opportunities that AI presents. That is why we have tried to balance safety and innovation. I refer him to the Online Safety Act 2023, which is a technology-agnostic piece of legislation. AI is covered by a range of spheres where the Act looks at illegal harms, so to speak. He is right to say that this is about helping humanity to move forward. It is absolutely right that we should be conscious of the risks, but I am also keen to support our start-ups, our innovative companies and our exciting tech economy to do what they do best and move society forward. That is why we have taken this pro-safety, pro-innovation approach; I repeat that safety in this field is an enabler of growth.
I would like to thank Sir Roger Gale, who has just left the Chair. He has been excellent in the Chair and I have very much enjoyed his company as well as his chairing.
I thank the Government for advance sight of the statement. My constituents and people across these islands are concerned about the increasing use of AI, not least because of the lack of regulation in place around it. I have specific questions in relation to the declarations and what is potentially coming down the line with regulation.
Who will own the data that is gathered? Who has responsibility for ensuring its safety? What is the Minister doing to ensure that regard is given to copyright and that intellectual property is protected for those people who have spent their time and energy and massive talents in creating information, research and artwork? What are the impacts of the use of AI on climate change? For example, it has been made clear that using this technology has an impact on the climate because of the intensive amounts of electricity that it uses. Are the Government considering that?
Will the Minister ensure that in any regulations that come forward there is a specific mention of AI harms for women and girls, particularly when it comes to deepfakes, and that they and other groups protected by the Equality Act 2010 are explicitly mentioned in any regulations or laws that come forward around AI? Lastly, we waited 30 years for an Online Safety Act. It took a very long time for us to get to the point of having regulation for online safety. Can the Minister make a commitment today that we will not have to wait so long for regulations, rather than declarations, in relation to AI?
The hon. Lady makes some interesting points. The thing about AI is not just the large language models, but the speed and power of the computer systems and the processing power behind them. She talks about climate change and other significant issues we face as humanity; that power to compute will be hugely important in predicting how climate change evolves and weather systems change. I am confident that AI will play a huge part in that.
AI does not recognise borders. That is why the international collaboration and these summits are so important. In Bletchley we had 28 countries, plus the European Union, sign the declaration. We had really good attendance at the Seoul summit as well, with some really world-leading declarations that will absolutely be important.
I refer the hon. Lady to my earlier comments around copyright. I recognise that the issue is important because it is core to building trust in AI, and we will look at that. She will understand that I will not be making a commitment at the Dispatch Box today, for a number of reasons, but I am confident that we will get there. That is why our approach in the White Paper response has been well received by the tech and AI industries.
The hon. Lady started with a point about how constituents across the United Kingdom are worried about AI. That is why we all have to show leadership and reassure people that we are making advances on AI and doing it safely. That is why our AI Safety Institute was so important, and why the network of AI safety institutes that we have helped to advise on and worked with other countries on will be so important. In different countries there will be nuances regarding large language models and the different issues that they will be approaching—and sheer capability will be a huge factor.
(6 months, 1 week ago)
Westminster Hall
Thank you for chairing this debate, Sir George. I congratulate the hon. Member for Penistone and Stocksbridge (Miriam Cates) on securing it. I want to talk about a number of things: safety online by design, the safety of devices by design, and parental and child education. Just to confuse everyone, I will do that in reverse order, starting off with parental and child education.
Ofcom has a media literacy strategy consultation on the go just now, as well as the consultation on the strategy around protecting children. Both are incredibly important. We have a massive parental knowledge gap. In about 15 or 20 years, this will not be nearly so much of a problem, because parents then will have grown up online. I am from the first generation of parents who grew up online. My kids are 10 and 13. I got the internet at home when I was six or seven, although not in the way that my kids did. Hardly anybody in this House grew up on the internet, and hardly any of the parents of my children’s peers grew up online.
I know very well the dangers there are online, and I am able to talk to my children about them. I have the privilege, the ability and the time to ensure that I know everything about everything they are doing online—whether that means knowing the ins and outs of how Fortnite works, or how any of the games they are playing online work, I am lucky enough to be able to do that. Some parents have neither the time nor the energy nor the capacity to do that.
I commend the hon. Lady for her knowledge and dedication, but is it not the case that even parents as diligent as her find that teenagers can bypass these controls? Even if our children do not have access to a device, they can easily be shown the most harmful of material on the school bus. Is this not actually about child development, and whether a child has the brain development to be able to use these devices safely, rather than just about education?
I wanted to talk about education among a number of other things. Children can absolutely be shown things on the bus, and stuff like that; children and young people will do what they can to subvert their parents’ authority in all sorts of ways, not just when it comes to mobile phones. Part of the point I was making is that I have the privilege of being able to take those actions, while parents who are working two jobs and are really busy dealing with many other children, for example, may not have the time to do so. We cannot put it all on to parental education, but we cannot put it all on to the education of the children, either. We know that however much information we give children or young people—particularly teenagers—they are still going to make really stupid decisions a lot of the time. I know I made plenty of stupid decisions as a teenager, and I am fairly sure that my children will do exactly the same.
I grew up using message boards, which have now been taken over by Reddit, and MSN Messenger, while kids now use Facebook Messenger or WhatsApp. I grew up using internet relay chat—IRC—and Yahoo! Chat, which have now been taken over by Discord, and playing Counter-Strike, which has now been subsumed by Fortnite. I used Myspace and Bebo, while kids now use things like Instagram. These things have been around for a very long time. We have needed an online safety Act for more than 20 years. When I was using these things in the ’90s, I was subject to the same issues that my children and other children are seeing today. Just because it was not so widespread does not mean it was not happening, because it absolutely was.
The issue with the Online Safety Act is that it came far too late—I am glad that we have it, but it should have been here 20 years ago. It also does not go far enough; it does not cover all the things that we need it to cover. During the passage of the Act, we talked at length about things like livestreaming, and how children should not be allowed to livestream at all under any circumstance. We could have just banned children from livestreaming and said that all platforms should not allow children to livestream because of the massive increase in the amount of self-generated child sexual abuse images, but the Government chose not to do that.
We have to have safety by design in these apps. We have to ensure that Ofcom is given the powers—which, even with the Online Safety Act, it does not have—to stop platforms allowing these things to happen and effectively ban children from accessing them. Effective age assurance would address some of the problems that the hon. Member for Penistone and Stocksbridge raises. Of course, children will absolutely still try to go around these things, but having that age assurance and age gating, as far as we possibly can—for example, the stuff that Ofcom is doing around pornographic content—will mean that children will not be able to access that content. I do not see that there should be any way for any child to access pornographic content once the Online Safety Act fully comes in, and once Ofcom has the full powers and ability to do that.
The other issue with the Online Safety Act is that it is too slow. There are a lot of consultation procedures and lead-in times. It should have come in far quicker, and then we would have had this protection earlier for our children and young people.
We need to have the safety of devices by design. I am slightly concerned about the number of children who are not lucky enough to get a brand-new phone; the right hon. Member for Chelmsford (Vicky Ford) talked about passing on a phone to a child. Covering that is essential if we are to have safety of devices by design. Online app stores are not covered as effectively as they should be, particularly when it comes to age ratings. I spoke to representatives of an online dating app, who said that they want their app to be 18-plus, but that one of the stores has rated it as 16-plus; they keep asking the store to change it and the store keeps refusing. It is totally ridiculous that we are in that situation. The regulation of app stores is really important, especially when parents will use the app store’s age rating; they will assume that the rating put forward by the app store is roughly correct. We need to make changes in that respect and we need to come down on the app stores, because they are so incredibly powerful. The app store is a real moment at which parents, if they use parental controls, have the ability to make changes.
In relation to safety online by design, I have already spoken about livestreaming. When it comes to gaming, it is entirely possible for children to play online games without using chat functions. Lots of online games do not actually have any chat function at all. Children can play Minecraft without having any chat; they cannot play Roblox without effective access to chat. Parents need to understand the difference between Minecraft and Roblox—and not allow anyone to play Roblox, because it is awful.
There are decisions that need to be taken in relation to safety online by design. If people have effective age verification and an effective understanding of the audience for each of these apps and online settings, they can ensure that the rules are in place. I am not convinced yet that Ofcom has enough powers to say what is and what is not safe for children online. I am not convinced that even with the Online Safety Act, there is the flexibility for it to say, “Right—if you have done your child access assessment and you think that your app is likely to be used by children, you cannot have live streaming on the app.” I am not convinced that it has enough teeth to be able to take that action. It does when it comes to illegal content, but when it comes to things that are harmful for children but legal for adults, there is not quite enough strength for the regulator.
I will keep doing what I have been doing in this House, which is saying that the online world can be brilliant—it can be great. Kids can have a brilliant time playing online. They can speak to their friends; particularly if children are isolated or lonely, there are places where they can find fellowship and other people who are feeling the same way. That can be a positive thing. The hon. Member for Penistone and Stocksbridge has laid out where often the online world is negative, but it can be positive too. There are so many benefits in terms of schoolwork, clubs, accessing friends, and calendars. Cameras are great, too. My children sometimes use the bird app on their phones to work out which birds are singing. It is brilliant that they can do things like that online.
There are so many benefits, but we have a responsibility, just as we do when our children are playing in the park, to ensure that they are safe. We have a responsibility as legislators to ensure that the regulators have enough teeth to make sure that the online world is safe, so that children can get the benefits of the online world and of using smartphones but are not subject to the extremely negative outcomes. My hon. Friend the Member for Stirling (Alyn Smith) mentioned his constituent and the awful loss experienced by their family. Children should never, ever have to face that situation online, and we have a responsibility to regulate to ensure that they never have to.
(8 months, 3 weeks ago)
Westminster Hall
Thank you for your work chairing this debate, Mrs Harris. I congratulate the hon. Member for Ellesmere Port and Neston (Justin Madders) on bringing forward such a popular and important debate.
I will focus my comments on the skills required to access digital services. The access issues have been raised, and are incredibly important—I do not want to take away from that. However, on the issue of skills, by 2030, 5 million workers will be acutely under-skilled in basic digital skills. That is a significant number, and it must be a massive concern for the Government.
The skills that people require to access digital services must be considered. There is a generational issue: younger people are better at accessing these things. However, that is not true across the board. There is an intersectionality of issues. People are less likely to have digital skills if they are more vulnerable, older or in poverty, or if they do not have the capacity or time to develop them. Given the cost of living crisis, I am increasingly seeing constituents working multiple jobs who just do not have the time to work on their digital skills because they are too busy trying to make ends meet. That is a really big concern for me.
Covid and the roll-out of accessing services online were mentioned. During covid, the Scottish Government provided 72,000 devices and 14,000 internet connections to individuals, children and families that were at risk of being digitally excluded. That has massively increased—the number of devices was up to 280,000 in 2022. We are increasing that as we go, to ensure that young people are not digitally excluded and are able to spend time typing up documents in Microsoft Word, Google Sheets or whatever the school prefers them to use when they are at home. It is so important that digital skills are available to people and that the workforce of the future has them.
I recognise the good work that the Scottish Government, and indeed the English Government—the UK Government—did on getting devices out to people. However, UNESCO highlighted to us, among other things, the ongoing cost of devices: once they have gone out to people, they need to be maintained and their security needs to be kept up to date. One of the things we need to think about very carefully in all our Government budgets as we go forward is how to ensure that there is ongoing investment in the digital technologies that are needed, both for the people receiving them and for those distributing them.
I agree. On continual access to the internet, a universal credit social tariff is available for people. Every time I meet with my local jobcentre, I make clear how important it is to stress that the social tariff is available so that people can access that reduced-cost internet access. It is important that we have that and that people know that it exists so that they can take it up.
Within my constituency, I have spoken to Virgin Money, which provides access to internet services. There is also an organisation called Silver Surfers, which provides older people with access to the services and advice they need to access the internet. We have heard about some of the negatives of the internet and some of the positives of online life. It is important to be able to access services online, particularly for people in rural communities who are a long way away from those services. It is important for tackling loneliness to be able to access communities online.
I am really sorry but I will not; I am just going to finish.
As I was saying, it is really important that people can access those things, and like-minded individuals. When my son had Kawasaki disease, it was something that hardly anybody had ever heard of, but I was able to access other parents whose children had been through the same thing to find out how my son’s disease might progress and how things might change—so access to the internet is really important.
Lastly, on disenfranchisement: if someone wants to get a voter authority certificate, the main way they can do that is online. It is possible to get a certificate by post, but the process of proving their identity in order to access a certificate—a requirement that the UK Government have brought in—is mainly online. Therefore, people who are disenfranchised and unable to access those services are even more disenfranchised by the fact that the service is mainly online. I encourage the Government to ensure that things like voter authority certificates in particular are as available as possible to people, and that they are not just available online.
(1 year, 2 months ago)
Commons Chamber
The right hon. Lady raises some interesting points. We have conversed about harms, so I totally get her point about making sure that we tackle this issue in Parliament and are accountable in Parliament. As I have said, that will be done predominantly by monitoring the Bill through Ofcom’s reporting on the harms it is having to deal with. We have regular engagement with Ofcom, not only here and through the Select Committees, but through the Secretary of State.
On criminal liability, we conversed about that and made sure that we had a liability attached to something specific, rather than the general approach proposed at the beginning. That means we are not chilling innovation: people can understand, as they set up their approaches and systems, exactly what risk of criminal liability they are taking on.
The review mechanism strikes me as one of the places where the Bill falls down and is weakest, because there is not a dedicated review mechanism. We have needed this legislation for more than 30 years, and we have now got to the point of legislating. Does the Minister understand why I have no faith that future legislation will happen in a timely fashion, when it has taken us so long even to get to this point? Can he give us some reassurance that a proper review will take place, rather than just having Ofcom reports that may or may not be read?
I have talked about the fact that we have to keep this legislation under review, because the landscape is fast-moving. At every stage that I have been dealing with this Bill, I have said that inevitably we will have to come back. We can make the Bill as flexible, proportionate and tech-unspecific as we can, but things are moving quickly. With all our work on AI, for example, such as the AI summit, the work of the Global Partnership on Artificial Intelligence, the international response, the Hiroshima accord and all the other areas that my hon. Friend the Member for Weston-super-Mare (John Penrose) spoke about earlier, we will have to come back, review it and look at whether the legislation remains world-beating. It is not just about the findings of Ofcom as it reports back to us.
I need to make a bit of progress, because I hope to have time to sum up a little bit at the end. We have listened to concerns about ensuring that the Bill provides the most robust protections for children from pornography and on the use of age assurance mechanisms. We are now explicitly requiring relevant providers to use highly effective age verification or age estimation to protect children from pornography and other primary priority content that is harmful to children. The Bill will also ensure a clear privacy-preserving and future-proofed framework governing the use of age assurance, which will be overseen by Ofcom.
There has been coverage in the media about how the Bill relates to encryption, which has often not been accurate. I take the opportunity to set the record straight. Our stance on challenging sexual abuse online remains the same. Last week in the other place, my noble Friend Lord Parkinson, the Parliamentary Under-Secretary of State for Arts and Heritage, shared recent data from UK police forces that showed that 6,350 offences related to sexual communication with a child were recorded last year alone. Shockingly, 5,500 of those offences took place against primary school-age children. Those appalling statistics illustrate the urgent need for change. The Government are committed to taking action against the perpetrators and stamping out these horrific crimes. The information that social media companies currently give to UK law enforcement contributes to more than 800 arrests or voluntary attendances of suspected child sexual offenders on average every month. That results in an estimated 1,200 children being safeguarded from child sexual abuse.
There is no intention by the Government to weaken the encryption technology used by platforms. As a last resort, on a case-by-case basis, and only when stringent privacy safeguards have been met, Ofcom will have the power to direct companies to make best efforts to develop or source technology to identify and remove illegal child sexual abuse content. We know that this technology can be developed. Before it can be required by Ofcom, such technology must meet minimum standards of accuracy. If appropriate technology does not exist that meets these requirements, Ofcom cannot require its use. That is why the powers include the ability for Ofcom to require companies to make best endeavours to develop or source a new solution.
It is a pleasure to speak during what I hope are the final stages of the Bill. Given that nearly all the Bills on which I have spoken up to now have been money Bills, this business of “coming back from the Lords” and scrutinising Lords amendments has not been part of my experience, so if I get anything wrong, I apologise.
Like other Members, I want to begin by thanking a number of people and organisations, including the Mental Health Foundation, Carnegie UK, the Internet Watch Foundation, the National Society for the Prevention of Cruelty to Children and two researchers for the SNP, Aaron Lucas and Josh Simmonds-Upton, for all their work, advice, knowledge and wisdom. I also join the hon. Members for Pontypridd (Alex Davies-Jones) and for Gosport (Dame Caroline Dinenage) in thanking the families involved for the huge amount of time and energy—and the huge amount of themselves—that they have had to pour into the process in order to secure these changes. This is the beginning of the culmination of all their hard work. It will make a difference today, and it will make a difference when the Bill is enacted. Members in all parts of the House will do what we can to continue to scrutinise its operation to ensure that it works as intended, to ensure that children are kept as safe as possible online, and to ensure that Ofcom uses these powers to persuade platforms to provide the information that they will be required to provide following the death of a child about that child’s use of social media.
The Bill is about keeping people safe. It is a different Bill from the one that began its parliamentary journey, I think, more than two years ago. I have seen various Ministers leading from the Dispatch Box during that time, but the voices around the Chamber have been consistent, from the Conservative, Labour and SNP Benches. All the Members who have spoken have agreed that we want the internet to be a safer place. I am extremely glad that the Government have made so many concessions that the Opposition parties called for. I congratulate the hon. Member for Pontypridd on the inclusion of violence against women and girls in the Bill. She championed that in Committee, and I am glad that the Government have made the change.
Another change that the Government have made relates to small high-risk platforms. Back in May or June last year I tabled amendments 80, 81 and 82, which called for that categorisation to be changed so that it was not based just on the number of users. I think it was the hon. Member for Gosport who mentioned 4chan, and I have mentioned Kiwi Farms a number of times in the Chamber. Such organisations cannot be allowed to get away with horrific, vile content that encourages violence. They cannot be allowed a lower bar just because they have a smaller number of users.
The National Risk Register produced by the Cabinet Office—great bedtime reading which I thoroughly recommend—states that both the risk and the likelihood of harm and the number of people on whom it will have an impact should be taken into account before a decision is made. It is therefore entirely sensible for the Government to take into account both the number of users, when it is a significant number, and the extremely high risk of harm caused by some of these providers.
The hon. Lady is making an excellent speech, but it is critical to understand that this is not just about wickedness that would have taken place anyway but is now taking place on the internet; it is about the internet catalysing and exaggerating that wickedness, and spawning and encouraging all kinds of malevolence. We have a big responsibility in this place to regulate, control and indeed stop this, and the hon. Lady is right to emphasise that.
The right hon. Gentleman is entirely correct. Whether it involves a particularly right-wing cause or antisemitism—or, indeed, dieting content that drags people into something more radical in relation to eating disorders—the bubble mentality created by these algorithms massively increases the risk of radicalisation, and we therefore have an increased duty to protect people.
As I have said, I am pleased to see the positive changes that have been made as a result of Opposition pressure and the uncompromising efforts of those in the House of Lords, especially Baroness Kidron, who has been nothing short of tenacious. Throughout the time in which we have been discussing the Bill, I have spoken to Members of both Houses about it, and it has been very unusual to come across anyone who knows what they are talking about, and, in particular, has the incredible depth of knowledge, understanding and wisdom shown by Baroness Kidron. I was able to speak to her as someone who practically grew up on the internet—we had it at home when I was eight—but she knew far more about it than I did. I am extremely pleased that the Government have worked with her to improve the Bill, and have accepted that she has a huge breadth of knowledge. She managed to do what we did not quite manage to do in this House, although hopefully we laid the foundations.
I want to refer to a number of points that were mentioned by the Minister and are also mentioned in the letters that the Government provided relating to the Lords amendments. Algorithmic scrutiny is incredibly important, and I, along with other Members, have raised it a number of times—again, in connection with concern about radicalisation. Some organisations have been doing better things recently. For instance, someone who searches for something may begin to go down a rabbit hole. Some companies are now putting up a flag, for instance a video, suggesting that users are going down a dark hole and should look at something a bit lighter, and directing them away from the autoplaying of the more radical content. If all organisations, or at least a significant number—particularly those with high traffic—can be encouraged to take such action rather than allowing people to be driven to more extreme content, that will be a positive step.
I was pleased to hear about the upcoming researcher access report, and about the report on app stores. I asked a previous Minister about app stores a year or so ago, and the Minister said that they were not included, and that was the end of it. Given the risk that is posed by app stores, the fact that they were not categorised as user-to-user content concerned me greatly. Someone who wants to put something on an Apple app store has to jump through Apple’s hoops. The content is not owned by the app store, and the same applies to some of the material on the PlayStation store. It is owned by the person who created the content, and it is therefore user-to-user content. In some cases, it is created by one individual. There is no ongoing review of that. Age rating is another issue: app stores choose whatever age rating they happen to decide on. Some of the dating apps, such as match.com, have been active in that regard, and have made it clear that their platforms are not for under-16s or under-18s, while the app store has rated the content as suitable for a younger age than the apps themselves require. That is of concern, especially if the companies are trying to improve age rating.
On the subject of age rating, I am pleased to see more in the Bill about age assurance and the frameworks. I am particularly pleased to see what is going to happen in relation to trying to stop children being able to access pornography. That is incredibly important but it had been missing from the Bill. I understand that Baroness Floella Benjamin has done a huge amount of work on pushing this forward and ensuring that parliamentarians are briefed on it, and I thank her for the work that she has done. Human trafficking has also been included. Again, that was something that we pushed for, and I am glad to see that it has been put on the face of the Bill.
I want to talk briefly about the review mechanisms, then I will go on to talk about end-to-end encryption. I am still concerned that the review mechanisms are not strong enough. We have pushed to have a parliamentary Committee convened, for example, to review this legislation. This is the fastest moving area of life. Things are changing so dramatically. How many people in here had even heard of ChatGPT a year and a half ago? How many people had used a virtual reality headset? How many people had accessed Rec Room of any of the other VR systems? I understand that the Government have genuinely tried their best to make the Bill as future-proof as possible, but we have no parliamentary scrutiny mechanisms written in. I am not trying to undermine the work of the Committee on this—I think it is incredibly important—but Select Committees are busy and they have no legislative power in this regard. If the Government had written in a review, that would have been incredibly helpful.
The hon. Lady is making a very good speech. When I first came to this House, which was rather a long time ago now, there was a Companies Act every year, because company law was changing at the time, as was the nature of post-war capitalism. It seems to me that there is a strong argument for an annual Act on the handling and management of the internet. What she is saying is exactly right, and that is probably where we will end up.
I completely support the right hon. Member’s point—I would love to see this happening on an annual basis. I am sure that the Ministers who have shepherded the Bill through would be terrified of that, and that the Government team sitting over there are probably quaking in their boots at the suggestion, but given how fast this moves, I think that this would be incredibly important.
The Government’s record on post-implementation reviews of legislation is pretty shoddy. If you ask Government Departments what percentage of legislation they have put through a post-implementation review in the timeline they were supposed to, they will say that it is very small. Some Departments are a bit better than others, but given the number of reshuffles there have been, some do not even know which pieces of legislation they are supposed to be post-implementation reviewing. I am concerned that this legislation will get lost, and that there is no legislative back-up to any of the mechanisms for reviewing it. The Minister has said that it will be kept under review, but can we have some sort of governmental commitment that an actual review will take place, and that legislation will be forthcoming if necessary, to ensure that the implementation of this Bill is carried out as intended? We are not necessarily asking the Government to change it; we are just asking them to cover all the things that they intend it to cover.
On end-to-end encryption, on child sexual exploitation and abuse materials, and on the last resort provider—I have been consistent with every Minister I have spoken to across the Dispatch Box and every time I have spoken to hon. Members about this—when there is any use of child sexual exploitation material or child sexual abuse material, we should be able to require the provider to find it. That absolutely trumps privacy. The largest increase in child sexual abuse material is in self-generated content. That is horrific. We are seeing a massive increase in that number. We need providers to be able to search—using the hash numbers that they can categorise images with, or however they want to do it—for people who are sharing this material in order to allow the authorities to arrest them and put them behind bars so that they cannot cause any more harm to children. That is more important than any privacy concerns. Although Ministers have not put it in the Bill until this point, they have, to their credit, been clear that that is more important than any privacy concerns, and that protecting children trumps those concerns when it comes to abuse materials and exploitation. I am glad to see that that is now written into the Bill; it is important that it was not just stated at the Dispatch Box, even though it was mentioned by a number of Members.
It is very kind of you to call me to speak, Mr Deputy Speaker. I apologise to your good self, to the Minister and to the House for arriving rather tardily.
My daughter and her husband have been staying with me over the past few days. When I get up to make my wife and myself an early-morning cup of tea, I find my two grandchildren sitting in the kitchen with their iPads, which does not half bring home the dangers. I look at them and think, “Gosh, I hope there is security, because they are just little kids.” I worry about that kind of thing. As everyone has said, keeping children safe is ever more important.
The Bill’s progress shows some of the best aspects of this place and the other place working together to improve legislation. The shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), and the hon. Member for Aberdeen North (Kirsty Blackman) both mentioned that, and it has been encouraging to see how the Bill has come together. However, as others have said, it has taken a long time and there have been a lot of delays. Perhaps that was unavoidable, but it is regrettable. It has been difficult for the Government to get the Bill to where it is today, and the trouble is that the delays mean there will probably be more victims before the Bill is enacted. We see before us a much-changed Bill, and I thank the Lords for their 150 amendments. They have put in a lot of hard work, as others have said.
The Secretary of State’s powers worry my party and me, and I wonder whether the Bill still fails to tackle harmful activity effectively. Perhaps better things could be done, but we are where we are. I welcome the addition of new offences, such as encouraging self-harm and intimate image abuse. A future Bill might be needed to set out the thresholds for the prosecution of non-fatal self-harm. We may also need further work on the intent requirement for cyber-flashing, and on whether Ofcom can introduce such requirements. I am encouraged by what we have heard from the Minister.
We would also have liked to see more movement on risk assessment, as terms of service should be subject to a mandatory risk assessment. My party remains unconvinced that we have got to grips with the metaverse—this terrifying new thing that has come at us. I think there is work to be done on that, and we will see what happens in the future.
As others have said, education is crucial. I hope that my grandchildren, sitting there with their iPads, have been told as much as possible by their teachers, my daughter and my son-in-law about what to do and what not to do. That leads me on to the huge importance of the parent being able, where necessary, to intervene rapidly, because this has to be done damned quickly. If it looks like they are going down a black hole, we want to stop that right away. A kid could see something horrid that could damage them for life—it could be that bad.
Once a child sees something, they cannot unsee it. This is not just about parental controls; we hope that the requirement on the companies to do the risk assessments and on Ofcom to look at those will mean that those issues are stopped before they even get to the point of requiring parental controls. I hope that such an approach will make this safer by design when it begins to operate, rather than relying on having an active parent who is not working three jobs and therefore has time to moderate what their children are doing online.
The hon. Lady makes an excellent point. Let me just illustrate it by saying that each of us in our childhood, when we were little—when we were four, five or six—saw something that frightened us. Oddly enough, we never forget that throughout the rest of life, do we? That is what bad dreams are made of. We should remember that point, which is why those are wise words indeed.