Commons Chamber

I rise to talk broadly about new clause 2, which I am pleased that the Government are engaging on. My right hon. and hon. Friends have done incredible work to make that happen. I share their elation. As—I think—the only Member who was on the Joint Committee under the fantastic Chair, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), and on both Committees, I have seen the Bill’s passage over the past year or so and been happy with how the Government have engaged with it. That includes on Zach’s law, which will ensure that trolls cannot send flashing images to people with epilepsy. I shared my colleagues’ elation with my hon. Friend the Member for Stourbridge (Suzanne Webb) when we were successful in convincing the Government to make that happen.
May I reiterate the learnings from the Joint Committee and from the Committee earlier last year? When we took evidence from the tech giants—they are giants—it was clear that, as giants do, they could not see the damage underfoot and the harm that they were doing because they are so big. They were also blind to the damage they were doing because they chose not to see it. I remember challenging a witness from one of the big tech giants about whether they had followed the Committee on the harms that they were causing to vulnerable children and adults. I was fascinated by how the witnesses just did not care. Their responses were, “Well, we are doing enough already. We are already trying. We are putting billions of pounds into supporting people who are being harmed.” They did not see the reality on the ground of young people being damaged.
When I interviewed my namesake, Ian Russell, I was heartbroken because we had children of a similar age. I just could not imagine having the conversations he must have had with his family and friends throughout that terrible tragedy.
Is my hon. Friend aware that Ian Russell has pointed out that 26% of young people who present at hospital with self-harm and suicide attempts have accessed such predatory, irresponsible and wilful online content?
My hon. Friend is absolutely right. One of the real horrors is that, as I understand it, Facebook was not going to release—I do not want to break any rules here—the content that his daughter had been viewing, to help with the process of healing.
If I may, I want to touch on another point that has not been raised today, which is the role of a future Committee. I appreciate that is not part of the Bill, but I feel strongly that this House should have a separate new Committee for the Online Safety Bill. The internet and the world of social media are changing dramatically. The metaverse is approaching very rapidly, and we are seeing the rise of virtual reality and augmented reality. Artificial intelligence is even changing the way we believe what we see online, at a rate that we cannot imagine. I have a few predictions. I anticipate that in the next few years we will probably have the first No. 1 book and song written by AI. We can now hear online fake voices and impersonations of people by AI. We will have songs and so on created in ways that fool us and fool children even more. I have no doubt that in the coming months and years we will see the rise of children suing their parents for sharing content of them when they were younger without permission. We will see a changing dynamic in the way that young people engage with new content and what they anticipate from it.
My hon. Friend is making a valuable contribution to the debate, as I expected he would having discussed it with him from the very beginning. What he describes is not only the combination of heartlessness and carelessness on the part of the tech companies, but the curious marriage of an anarchic future coupled with the tyranny of their control of that future. He is absolutely right that if we are to do anything about that in this place, we need an ongoing role for a Committee of the kind he recommends.
I thank my right hon. Friend for those comments. I will wrap up shortly, Mr Deputy Speaker. On that point, I have said before that the use of algorithms on platforms is in my mind very similar to addictive drugs: they get people addicted and get them to change their behaviours. They get them to cut off from their friends and family, and then they direct them in ways that we would not allow if we could wrap our arms around them and stop it. But they are doing that in their own bedrooms, classrooms and playgrounds.
I applaud the work on the Bill. Yes, there are ways it could be improved and a committee that looks at ways to improve it as the dynamics of social media change will be essential. However, letting the Bill go to the other place will be a major shift forwards in protecting our young people both now and in the future.
Thank you for your patience, Siobhan Baillie.
A number of Members across the House have been pushing for Zach’s law, and I am really delighted that Zach’s family can see in Hansard that that campaigning has really made a direct change to the law.
I just want to echo the previous points. This has been a hard-fought decision, and I am so proud that the Government have done this, but may I echo the thanks to Zach for being a true hero? We talk about David and Goliath, the giant—the beast—who was taken down, but Zach has beaten the tech giants, and I think this is an incredible success.
I absolutely echo my hon. Friend’s remarks, and I again thank him for his work.
We are also taking steps to strengthen Ofcom’s enforcement powers, which is why we are giving Ofcom a discretionary power to require non-compliant services to publish or notify their users of enforcement action that it has taken against the service. Ofcom will be able to use this power to direct a service to publish details or notify its UK users about enforcement notices it receives from Ofcom. I thank the Antisemitism Policy Trust for bringing this proposal to our attention and for its helpful engagement on the issue. This new power will promote transparency by increasing awareness among users about breaches of the duty in the Bill. It will help users make much more informed decisions about the services they use, and act as an additional deterrent factor for service providers.
The hon. Member makes an important point. Baroness Kidron’s amendment has been referred to; I anticipate that future amendments in the House of Lords will also seek to address the issue, which our Joint Committee looked at carefully in our pre-legislative scrutiny.
It should be much easier than it has been for the Russell family and the coroner to gain access to such important information. However, depending on the nature of the case, there may well be times when it would be wrong for families to have access. I think there has to be an expedited and official process through which the information can be sought, rather than a general provision, because some cases are complicated. There should not be a general right in law, but it needs to be a lot easier than it is. Companies should make the information available much more readily than they have done. The Molly Russell inquest had to be delayed for four months because of the late release of thousands of pages of information from Meta to the coroner. That is clearly not acceptable either.
My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) has tabled an amendment relating to small and risky platforms. The categorisation of platforms on the basis of size was linked to duties under the “legal but harmful” provisions, which we expect now to change. The priority illegal harms apply to platforms of all sizes. Surely when illegal activity is taking place on any platform of any size—I hope that the Minister will clarify this later—Ofcom must have the right to intervene and start asking questions. I think that, in practice, that is how we should expect the system to work.
Like other Members who served on the Joint Committee—I am thinking particularly of my hon. Friends the Members for Watford (Dean Russell) and for Stourbridge (Suzanne Webb), both of whom spoke so passionately about this subject, and the hon. Member for Ochil and South Perthshire (John Nicolson) raised it as well—I was delighted to see that the Government had tabled amendments to cover Zach’s law. The fact that someone can deliberately seek out a person with epilepsy and target that person with flashing images with the intention of causing a seizure is a terrible example of the way in which systems can be abused. It is wrong for the platforms to be neutral and have no obligation to identify and stop that action, but the action is wrong in practice as well, and it demonstrates the need for us to ensure that the law keeps pace with the nature of new offences. I was very proud to meet Zach and his mother in October. I said to them then that their work had changed the law, and I am glad that the Government have tabled those amendments.
May I pay tribute to my hon. Friend for his chairmanship of the Joint Committee last year? We covered a wide range of challenging ethical, moral and technical decisions, with work across both Houses, and I think that the decisions contained in our report informed many of the Government amendments, but it was my hon. Friend’s chairmanship that helped to guide us through that period.
I am grateful to my hon. Friend for what he has said, and for his significant work on the Committee.
There is a great deal that we could say about this Bill, but let me end by touching on an important topic that I think my hon. Friend the Member for Dover (Mrs Elphicke) will speak about later: the way in which social media platforms are used by people trafficking gangs to recruit those who can help them with bringing people into the country in small boats. It was right that the Government included immigration offences in the list of priority legal harms in schedule 7. It was also right that, following a recommendation from the Joint Committee, they included fraud and scam ads in the scope of the Bill.
We have already accepted, in principle, that advertising can be within the Bill’s scope in certain circumstances, and that priority legal harms can be written into the Bill and identified as such. As I understand it, my hon. Friend’s amendment seeks to bring advertising services—not just organic posts on social media platforms—into the Bill’s scope as well. I know that the Government want to consider illegal activity in advertising as part of the online advertising review, but I hope that this could be an expedited process running in parallel with the Bill as it completes its stages. Illegal activity in advertising would not be allowed in the offline world. Newspaper editors are legally liable for what appears in their papers, and broadcasters can lose their licence if they allow illegal content to feature in advertising. We do not yet have the same enforcement mechanism through the advertising industry with the big online platforms, such as Google and Facebook, where the bulk of display advertising now goes. Their advertising market is bigger than the television advertising market. We are seeing serious examples of illegal activity, and it cannot be right that while such examples cannot be posted on a Facebook page, if money is put behind them and they are run as advertisements they can.
It is a pleasure to follow Zach’s MP, the hon. Member for Batley and Spen (Kim Leadbeater). I particularly want to pick up on her final comments about the difficulties of platforms—not just small platforms, but larger ones—hosting extremist content, be it incels, the alt right, the radical left or any other kind.
I will speak to my new clauses 34 and 35, which seek to deal with both disinformation and misinformation. They are important amendments, because although the Bill has taken huge steps forward—we are led to believe that it may take a couple more in due course when the revised version comes back if the recommittal is passed—there are still whole categories of harm that it does not yet address. In particular, it focuses, rightly and understandably, on individual harms to children and illegal activities as they relate to adults, but it does not yet deal with anything to do with collective harms to our society and our democracy, which matter too.
We have heard from former journalists in this debate. Journalists know it takes time and money to come up with a properly researched, authoritatively correct, accurate piece of journalism, but it takes a fraction of that time and cost to invent a lie. A lie will get halfway around the world before the truth has got its boots on, as the saying rightly goes. Incidentally, the hon. Member for Rotherham (Sarah Champion) said that it is wonderful that we are all learning so much. I share that sentiment; it is marvellous that we are all comparing and sharing our particular areas of expertise.
One person who seems to have all areas of expertise under his belt is my hon. Friend the Member for Folkestone and Hythe (Damian Collins), who chaired the Joint Committee. He rightly pointed out that this is a systems Bill, and it therefore deals with trying to prevent some things from happening—and yet it is completely silent on misinformation and disinformation, and their effect on us collectively, as a society and a democracy. New clauses 34 and 35 are an attempt to begin to address those collective harms alongside some individual harms we face. One of them deals with a duty of balance; the other deals with factual accuracy.
The duty of balance is an attempt to address the problem as it relates to filter bubbles, because this is a systems Bill and because each of us has a tailored filter bubble, by which each of the major platforms, and some of the minor ones, work out what we are interested in and feed us more of the same. That is fine for people who are interested in fishing tackle; that is super. But if someone is interested in incels and they get fed more and more incel stuff, or they are vaguely left wing and get taken down a rabbit hole into the increasingly radical left—or alternatively alt-right, religious extremism or whatever it may be—pretty soon they get into echo chambers, and from echo chambers they get into radicalisation, and from radicalisation they can pretty soon end up in some very murky, dark and deep waters.
There are existing rules for other old-world broadcasters; the BBC, ITV and all the other existing broadcasters have a duty of balance and undue prominence imposed on them by Ofcom. My argument is that we should consider ways to impose a similar duty of balance on the people who put together the programs that create our own individual filter bubbles, so that when someone is shown an awful lot of stuff about incels, or alt-right or radical left politics, somewhere in that filter bubble they will be sent something saying, “You do know that this is only part of the argument, don’t you? Do you know that there is another side to this? Here’s the alternative; here’s the balancing point.” We are not doing that at the moment, which is one of the reasons we have an increasingly divided societal and political debate, and that our public square as a society is becoming increasingly more fractious—and dangerous, in some cases. New clause 35 would fix that particular problem.
New clause 34 would deal with the other point—the fact that a lie will get halfway around the world before the truth has got its boots on. It tries to deal with factual accuracy. Factual accuracy is not quite the same thing as truth. Truth is an altogether larger and more philosophical concept to get one’s head around. It is how we string together accurate and correct facts to create a narrative or an explanation. Factual accuracy is an essential building block for truth. We must at least try to ensure that we can all see when someone has made something up or invented something, whether it is that bleach is a good way to cure covid or whatever. When somebody makes something up, we need to know and it needs to be clear. In many cases that is clear, but in many cases, if it is a plausible lie, a deepfake or whatever it may be, it is not clear. We need to be able to see that easily, quickly and immediately, and say, “I can discount this, because I know that the person producing it is a serial liar and tells huge great big porkies, and I shouldn’t be trusting what they are sending me, or I can see that the actual item itself is clearly made up.”
The duty of achieving balance already exists in rules and law in other parts of our society and is tried and tested—it has stood us very well and done a good job for us for 40 or 50 years, since TV and radio became ubiquitous—and the same is true, although not for quite such a long time, for factual accuracy. There are increasingly good methods of checking the factual accuracy of individual bits of content, and if necessary, in some cases of doing so in real time, too. For example, Adobe is leading a very large global grouping producing something called the Content Authenticity Initiative, which can tell if something is a deepfake, because it has an audit trail of where the image, the item or whatever it may be came from and how it has been updated, modified or changed during the course of its life.
On that point, I want to raise the work that my hon. Friend the Member for Bosworth (Dr Evans), who is not in the Chamber at the moment, has done on body image. When images are photoshopped and changed to give an idea of beauty that is very different from what is possible in the real world, that very much falls into the idea of truth. What are my hon. Friend’s thoughts on that point?
Addressing that is absolutely essential. That goes for any of the deepfake examples we have heard about, including from my right hon. Friend the Member for Basingstoke (Dame Maria Miller), because if we know that something has been changed—and the whole point about deepfake is that it is hard to tell—we can tell easily and say, “I know that is not right, I know that is not true, I know that is false, and I can aim away from it and treat it accordingly”.
Just to make sure that everybody understands, this is not some piece of new tech magic; it is already established. Adobe, as I have said, is doing it with the Content Authenticity Initiative, which is widely backed by other very serious tech firms. Others in the journalism world are doing the same thing, with the Journalism Trust Initiative. There is NewsGuard, which produces trust ratings; the Trust Project, which produces trust indicators; and we of course have our own press regulators in this country, the Independent Press Standards Organisation and IMPRESS.
I urge the Government and all here present not to be satisfied with where this Bill stands now. We have all heard how it can be improved. We have all heard that this is a new, groundbreaking and difficult area in which many other countries have not even got as far as we have, but we should not be in any way satisfied with where we are now. My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) said earlier that we need to approach this Bill in a spirit of being humble, and this is an area in which humility is absolutely essential. I hope all of us realise how much further we have to go, and I hope the Minister will say how he proposes to address these important and so far uncovered issues in due course.
The Minister might be of the same mind himself.
Through speaking in these debates, my office has seen an increase in correspondence from parents who are thankful that these difficult issues are being talked about. The world is changing and progressing, and if we are going to live in a world where we want to protect our children and our grandchildren—I have six grandchildren—and all other grandchildren who are involved in social media, the least we can do is make sure they are safe.
I commend the hon. Member for Batley and Spen (Kim Leadbeater) and others, including the hon. Member for Watford (Dean Russell), who have spoken about Zach’s law. We are all greatly impressed that we have that in the Bill through constructive lobbying. New clause 28, which the hon. Member for Rotherham (Sarah Champion) referred to, relates to advocacy for young people. That is an interesting idea, but I feel that advocacy should be for the parents first and not necessarily young people.
Ahead of the debate, I was in contact with the Royal College of Psychiatrists. It published a report entitled “Technology use and the mental health of children and young people”—new clause 16 is related to that—which was an overview of research into the use of screen time and social media by children and young teenagers. It concluded that excessive use of phones and social media by a young person is detrimental to their development and mental health—as we all know and as Members have spoken about—and furthermore that online abuse and bullying has become more prevalent because of that. The right hon. Member for Witham (Priti Patel) referred to those who are susceptible to online harm. We meet them every day, and parents tell me that our concerns are real.
A recent report by NHS Digital found that one in eight 11 to 16-year-olds reported that they had been bullied online. When parents contact me, they say that bullying online is a key issue for them, and the statistics come from those who choose to be honest and talk about it. Although the Government’s role is to create a Bill that enables protection for our children, there is also an incredible role for schools, which can address bullying. My hon. Friend the Member for Upper Bann (Carla Lockhart) and I talked about some of the young people we know at school who have been bullied online. Schools have stepped in and stopped that, encouraging and protecting children, and they can play that role as well.
We have all read the story of Molly Russell, who was only 14 years old when she took her life. Nobody in this House or outside it could fail to have been moved by her story. Her father stated that he strongly believed that the images, videos and information that she was able to access through Instagram played a crucial part in her life being cut short. The Bill must complete its passage and focus on strengthening protections online for children. Ultimately, the responsibility is on large social media companies to ensure that harmful information is removed, but the Bill puts the onus on us to hold social media firms to account and to ensure that they do so.
Harmful and dangerous content for children comes in many forms—namely, online abuse and exposure to self-harm and suicidal images. In addition, any inappropriate or sexual content has the potential to put children and young people at severe risk. The Bill is set to put provisions in place to protect victims in the sharing of nude or intimate photos. That is increasingly important for young people, who are potentially being groomed online and do not understand the full extent of what they are doing and the risks that come with that. Amendments have been tabled to ensure that, should such cases of photo sharing go to court, provisions are in place to ensure complete anonymity for the victims—for example, through video links in court, and so on.
I commend the right hon. Member for Basingstoke (Dame Maria Miller), who is not in her place, for her hard work in bringing forward new clause 48. Northern Ireland, along with England and Wales, will benefit from new clause 53, and I welcome the ability to hand down sentences of between six months and potentially five years.
Almost a quarter of girls who have taken a naked image have had their image sent to someone else online without their permission. Girls face very distinct and increased risks on social media, with more than four in five online grooming crimes targeting girls, and 97% of child abuse material featuring the sexual abuse of girls—wow, we really need to do something to protect our children and to give parents hope. There needs to be increased emphasis and focus on making children’s use of the internet safer by design. Once established, all platforms and services need to have the capacity and capability to respond to emerging patterns of sexual abuse, which often stem from photo sharing.
The Minister referred to terrorism and how terrorism can be promoted online. I intervened on him to mention the glorification of IRA terrorism and how that encourages further acts of terrorism and people who are susceptible to be involved. I am quite encouraged by the Minister’s response, and I think that we need to take a significant step. Some in Northern Ireland, for instance, try to rewrite history and use the glorification of terrorism for that purpose. We would like to see strengthening of measures to ensure that those involved in those acts across Northern Ireland are controlled.
In conclusion, there are many aspects of the Bill that I can speak in support of in relation to the benefits of securing digital protections for those on social media. This is, of course, about protecting not just children, but all of us from the dangers of social media. I have chosen to speak on these issues as they are often raised by constituents. There are serious matters regarding the glorification and encouragement of self-harm that the Bill needs to address. We have heard stories tonight that are difficult to listen to, because they are true stories from people we know, and we have heard horror stories about intimate photo sharing online. I hope that action on those issues, along with the many others that the Government are addressing, will be embedded in the Bill with the intent to finally ensure that we have regulations and protection for all people, especially our children—I think of my children and grandchildren, and like everybody else, my constituents.
I welcome the Minister to his place; I know that he will be excellent in this role, and it is incredible that he is so across the detail in such a short time.
I will primarily talk about new clause 53—that may not be that surprising, given how often it has been spoken about today—which is, ultimately, about Zach’s law. Zach is a truly heroic figure, as has been said. He is a young child with cerebral palsy, autism and epilepsy who was cruelly trolled by sick individuals who sent flashing images purposely to cause seizures and cause him damage. That was not unique to Zach, sadly; it happened to many people across the internet and social media. When somebody announced that they were looking for support, having been diagnosed with epilepsy, others would purposely identify that and target the person with flashing images to trigger seizures. That is absolutely despicable.
My hon. Friend the Member for Stourbridge (Suzanne Webb) has been my partner in crime—or in stopping the crime—over the past two years, and this has been a passion for us. Somebody said to me recently that we should perhaps do our victory lap in the Chamber today for the work that has been done to change the law, but Zach is the person who will get to go around and do that, as he did when he raised funds after he was first cruelly trolled.
My hon. Friend the Member for Folkestone and Hythe (Damian Collins) also deserves an awful lot of praise. My hon. Friend the Member for Stourbridge and I worked with him on the Joint Committee on the draft Online Safety Bill this time last year. It was incredible to work with Members of both Houses to look at how we can make the Bill better. I am pleased about the response to so many measures that we put forward, including the fact that we felt that the phrase “legal but harmful” created too many grey areas that would not catch the people who were doing these awful—what I often consider to be—crimes online to cause harm.
I want to highlight some of what has been done over the past two years to get Zach’s law to this point. If I ever write a memoir, I am sure that my diaries will not be as controversial as some in the bookshops today, but I would like to dedicate a chapter to Zach’s law, because it has shown the power of one individual, Zach, to change things through the democratic process in this House, to change the law for the entire country and to protect people who are vulnerable.
Not only was Zach’s case raised in the Joint Committee’s discussions, but afterwards my hon. Friend the Member for Stourbridge and I managed to get all the tech companies together on Zoom—most people will probably not be aware of this—to look at making technical changes to stop flashing images being sent to people. There were lots of warm words: lots of effort was supposedly put in so that we would not need a law to stop flashing images. We had Giphy, Facebook, Google, Twitter—all these billion-pound platforms that can do anything they want, yet they could not stop flashing images being sent to vulnerable people. I am sorry, but that is not the work of people who really want to make a difference. That is people who want to put profit over pain—people who want to ensure that they look after themselves before they look after the most vulnerable.
Talking of Christmas, would not the best Christmas present for lovely Zach be to enshrine new clause 53, that amazing amendment, as Zach’s law? Somehow we should formalise it as Zach’s law—that would be a brilliant Christmas present.
I wholeheartedly agree. Zach, if you are listening right now, you are an absolute hero—you have changed so much for so many people. Without your effort, this would not be happening today. In future, we can look back on this and say, “You know what? Democracy does work.”
I thank all hon. Members for their campaigning work to raise Zach’s law in the public consciousness. It even reached the US. I am sure many hon. Members dance along to Beyoncé of an evening or listen to her in the car when they are bopping home; a few months ago she changed one of her YouTube videos, which had flashing images in it, because the Epilepsy Society reached out to describe the dangers that it would cause. These campaigns work. They are about public awareness and about changing the law. We talk about the 15 minutes of shame that people face on social media, but ultimately the shame is on the platforms for forcing us to legislate to make them do the right thing.
I will end with one small point. The internet has evolved; the world wide web has evolved; social media is evolving; the metaverse, 3D virtual reality worlds and augmented reality are changing. I urge the Government or the House to look at creating a Committee specifically on the Bill. I know that there are lots of arguments that it should be a Sub-Committee of the Digital, Culture, Media and Sport Committee, but the truth is that the online world is changing dramatically. We cannot take snapshots every six months, every year or every two years and assume that they will pick up on all the changes happening in the world.
As the hon. Member for Pontypridd (Alex Davies-Jones) said, TikTok did not even exist when the Bill was first discussed. We now have an opportunity to ask what is coming next, keep pace with it and put ethics and morality at the heart of the Bill to ensure that it is fit for purpose for many decades to come. I thank the Minister for his fantastic work; my partner in crime, my hon. Friend the Member for Stourbridge, for her incredible work; and all Members across the House. Please, please, let us get this through tonight.
It is a privilege to follow my hon. Friend the Member for Watford (Dean Russell) and so many hon. Members who have made thoughtful contributions. I will confine my comments to the intersection of new clauses 28 and 45 to 50 with the impact of online pornography on children in this country.
There has been no other time in the history of humanity when we have exposed children to the violent, abusive, sexually explicit material that they currently encounter online. In 2008, only 14% of children under 13 had seen pornography; three years later, that figure had risen to 49%, correlating with the rise in children owning smartphones. Online pornography has a uniquely pernicious impact on children. For very young children, there is an impact just from seeing the content. For older teenagers, there is an impact on their behaviour.
We are seeing more and more evidence of boys exhibiting sexually aggressive behaviour, with actions such as strangulation, which we have dealt with separately in this House, and misogynistic attitudes. Young girls are being conditioned into thinking that their value depends on being submissive or objectified. That is leading children down a pathway that leads to serious sexual offending by children against children. Overwhelmingly, the victims are young girls.
Hon. Members need not take my word for it: after Everyone’s Invited began documenting the nature and extent of the sexual experiences happening in our schools, an Ofsted review revealed that the most prevalent victims of serious sexual assaults among the under-25s are girls aged 15 to 17. In a recent publication in anticipation of the Bill, the Children’s Commissioner cited the example of a teenage boy arrested for his part in the gang rape of a 14-year-old girl. In his witness statement to the police, the boy said that it felt just like a porn film.
Dr John Foubert, the former White House adviser on rape prevention, has said:
“It wasn’t until 10 years ago when I came to the realization that the secret ingredient in the recipe for rape was not secret at all…That ingredient…is today’s high speed Internet pornography.”
The same view has been expressed, in one form or another, by the chief medical officers for England and for Wales, the Independent Inquiry into Child Sexual Abuse, the Government Equalities Office, the Children’s Commissioner, Ofsted and successive Ministers.
New clause 28 requests an advocacy body to represent and protect the interests of child users. I welcome the principle behind the new clause. I anticipate that the Minister will say that he is already halfway there by making the Children’s Commissioner a statutory consultee to Ofcom, along with the Domestic Abuse Commissioner and others who have been named in this debate. However, whatever the Government make of the Opposition’s new clause, they must surely agree that it alights on one important point: the online terrain in respect of child protection is evolving very fast.
By the time the Bill reaches the statute book, new providers will have popped up again. With them will come unforeseen problems. When the Bill was first introduced, TikTok did not exist, as my hon. Friend the Member for Watford said a moment ago, and neither did OnlyFans. That is precisely the kind of user-generated site that is likely to try and dodge its obligations to keep children safe from harm, partly because it probably does not even accept that it exposes them to harm: it relies on the fallacy that the user is in control, and operates an exploitative business model predicated on that false premise.
I think it important for someone to represent the issue of child protection on a regular basis because of the issue of age verification, which we have canvassed, quite lightly, during the debate. Members on both sides of the House have pointed out that the current system, which allows children to self-certify their date of birth, is hopelessly out of date. I know that Ministers envisage something much more ambitious with the Bill’s age assurance and age verification requirements, including facial recognition technology, but I think it is worth our having a constant voice reporting on the adequacy of whatever age assurance steps internet providers may take, because we know how skilful children can be in navigating the internet. We know that there are those who have the technological skills to shroud their IP address or to use a VPN. I also think it important for there to be a voice to maintain the pressure on the Government—which is what I myself want to do tonight—for an official Government inquiry into pornography harms, akin to the one on gambling harms that was undertaken in 2019. That inquiry was extremely important in identifying all the harm that was caused by gambling. The conclusions of an equivalent inquiry into pornography would leave no wriggle room for user-generated services to deny the risk of harm.
My right hon. Friend the Member for Basingstoke (Dame Maria Miller) pointed out, very sensibly, that her new clauses 45 to 50 build on all the Law Commission’s recommendations. They align with so much work that has already been done in the House. We have produced, for instance, the Domestic Abuse Act 2021, which dealt with revenge porn, whether threatened or actual and whether genuine or fake, and with coercive control. Many Members recognise what was achieved by all our work a couple of years ago. However, given the indication from Ministers that they are minded to accept the new clauses in one form or another, I should like them to explain to the House how they think the Bill will capture the issue of sexting, if, indeed, it will capture that issue at all.
As the Minister will know, sexting means the exchanging of intimate images by, typically, children, sometimes on a nominally consensual basis. Everything I have read about it seems to say, “Yes, prima facie this is an unlawful act, but no, we do not seek to criminalise children, because we recognise that they make errors of judgment.” However, while I agree that it may be proportionate not to criminalise children for doing this, it remains the case that when an image is sent with the nominal consent of the child—it is nearly always a girl—it is often a product of duress, the image is often circulated far more widely than to the original recipient, and that often has devastating personal consequences for the young girl involved. All the main internet providers now have technology that can identify a nude image. It would be possible to require them to prevent nude images from being shared when, because of extended age-verification abilities, they know that the user is a child. If the Government are indeed minded to accept new clauses 45 to 50, I should like them to address that specific issue of sexting rather than letting it fall by the wayside as something separate, or outside the ambit of the Bill.
Does my hon. Friend agree that the work of charities such as Dignify in Watford, where Helen Roberts does incredible work in raising awareness of this issue, is essential to ensuring that people are aware of the harm that can be done?
I completely agree. Other charities, such as CEASE—the Centre to End All Sexual Exploitation—and Barnardo’s have been mentioned in the debate, and I think it so important to raise awareness. There are many harms on the internet, but pornography is an epidemic. It makes up a third of the material on the internet, and its impact on children cannot be overstated. Many boys who watch porn say that it gives them ideas about the kind of sex that they want to try. It is not surprising that a third of child sexual abuse is committed by other children. During puberty—that very important period of development—boys in particular are subject to an erotic imprint. The kind of sex that they see and the sexual ideas that they have during that time determine what they see as normal behaviour for the rest of their lives. It is crucial for children to be protected from harmful pornography that encourages the objectification and abuse of—almost always—women.
(2 years, 6 months ago)
Public Bill Committees
Good morning, Ms Rees; it is a pleasure to serve under your chairmanship again. The SNP spokesman and the shadow Minister have already explained what these provisions do, which is to provide a power for the Secretary of State to make directions to Ofcom in relation to modifying a code of conduct. I think it is important to make it clear that the measures being raised by the two Opposition parties are, as they said, envisaged to be used only in exceptional circumstances. Of course the Government accept that Ofcom, in common with other regulators, is rightly independent and there should be no interference in its day-to-day regulatory decisions. This clause does not seek to violate that principle.
However, we also recognise that although Ofcom has great expertise as a regulator, there may be situations in which a topic outside its area of expertise needs to be reflected in a code of practice, and in those situations, it may be appropriate for a direction to be given to modify a code of conduct. A recent and very real example would be in order to reflect the latest medical advice during a public health emergency. Obviously, we saw in the last couple of years, during covid, some quite dangerous medical disinformation being spread—concerning, for example, the safety of vaccines or the “prudence” of ingesting bleach as a remedy to covid. There was also the purported and entirely false connection between 5G phone masts and covid. There were issues on public policy grounds—in this case, medical grounds—and it might have been appropriate to make sure that a code of conduct was appropriately modified.
It was mentioned earlier that some of us were on previous Committees that made recommendations more broadly that would perhaps be in line with the amendment. Since that time, there has been lots of discussion around this topic, and I have raised it with the Minister and colleagues. I feel reassured that there is a great need to keep the clause as it is, because exceptional circumstances do arise. However, I would like reassurances that directions would be made only in exceptional circumstances and would not override the Ofcom policy or remit, as has just been discussed.
I can provide my hon. Friend with that reassurance on the exceptional circumstances point. The Joint Committee report was delivered in December, approximately six months ago. It was a very long report—I think it had more than 100 recommendations. Of course, members of the Committee are perfectly entitled, in relation to one or two of those recommendations, to have further discussions, listen further and adjust their views if they individually see fit.
During the Joint Committee we were concerned about future-proofing. Although I appreciate it is not specifically included in the Bill because it is a House matter, I urge the setting up of a separate Online Safety Act committee that runs over time, so that it can continue to be improved upon and expanded, which would add value. We do not know what the next metaverse will be in 10 years’ time. However, I feel confident that the metaverse was included and I am glad that the Minister has confirmed that.
I thank my hon. Friend for his service on the Joint Committee. I heard the representations of my right hon. Friend the Member for Basingstoke about a Joint Committee, and I have conveyed them to the higher authorities.
(2 years, 6 months ago)
Public Bill Committees
I refer Members to my entry in the Register of Members’ Financial Interests regarding work I did six months ago for a business called DMA.
We will now hear oral evidence from Kevin Bakhurst, group director of broadcasting and online content at Ofcom, and Richard Wronka, director of Ofcom’s online harms policy. Before calling the first Member to ask a question, I remind all Members that questions should be limited to matters within the scope of the Bill, and we must stick to the timings in the programme motion that the Committee has agreed. For this witness panel, we have until 10.05 am. Could the witnesses please introduce themselves for the record?
Kevin Bakhurst: Good morning. I am Kevin Bakhurst, group director at Ofcom for broadcasting and online content.
Richard Wronka: I am Richard Wronka, a director in Ofcom’s online safety policy team.
Q
Richard Wronka: I will cover the codes first. You are absolutely right that the Bill requires Ofcom to publish codes of practice, particularly on CSEA and on terror, as well as on fraudulent advertising and other areas. We are doing the work right now so that we are ready to progress with that process as soon as we get powers and duties, because it is really important that we are ready to move as quickly as possible. We will set out further detail on exactly how we plan to do that in a roadmap document that we are looking to publish before the summer break, so that will provide some of the detail.
A really important point here is that the Bill quite rightly covers a wide set of harms. We are mindful that a code that tries to cover every single harm, however tempting, could be counterproductive and confusing for platforms, even for those that want to comply and do the right thing. One of the balancing acts for us as we produce that code framework will be to get the right coverage for all the issues that everyone is rightly concerned about, but doing that in a way that is streamlined and efficient, so that services can apply the provisions of those codes.
Richard Wronka: Shall I pick up on the second bit very quickly? I think you are right; this is one of our central concerns about the definitions. As far as possible, this should be a matter for Parliament. It is really important that Parliament has a view on this. Ultimately, the regulator will take a view based on what Parliament says. We have some experience in this area, but as Richard said, we recognise the challenge—it is extremely complex. We can see the policy intent of doing it, quite rightly, and the importance of enshrining freedom of expression as far as possible, but Parliament can help to add clarity and, as you rightly say, be aware of some of the potential loopholes. At the moment, someone could describe themselves as a citizen journalist; where does that leave us? I am not quite sure. Parliament could help to clarify that, and we would be grateful.
Q
Richard Wronka: This picks up the point we discussed earlier, which is that I understand that the Government are considering proposals from the Law Commission to criminalise the sending of those kinds of images. It would not be covered by the illegal content duties as things stand, but if the Government conclude that it is right to criminalise those issues, it would automatically be picked up by the Bill.
Even so, the regime is not, on the whole, going to be able to pick up every instance of harm. It is about making sure that platforms have the right systems and processes. Where there is clear harm to individuals, we would expect those processes to be robust. We know there is work going on in the industry on that particular issue to try and drive forward those processes.
Q
Kevin Bakhurst: This is a really important point, which Richard just tried to make. The Bill gives us a great range of tools to try and prevent harm as far as possible; I just think we need to get expectations right here. Unfortunately, this Bill will not result in no harm of any type, just because of the nature of the internet and the task that we face. We are ambitious about driving constant improvement and stopping and addressing the main harms, but it is not going to stop every harm. We will absolutely focus on the ones that have a significant impact, but unfortunately that is the nature of the web.
Q
“psychological harm amounting to serious distress”?
Therefore, sending somebody a flashing image with the intention of inducing an epileptic fit would likely be caught under this new harmful communications offence in clause 150, even before a separate future offence that may be introduced.
Richard Wronka: I think we can certainly understand the argument. I think it is important that the Bill is as clear as possible. Ultimately, it is for the courts to decide whether that offence would pick up these kinds of issues that we are talking about around flashing imagery.
Q
Ben Bradley: Speaking for TikTok, we view ourselves as a second-generation platform. We launched in 2018, and at that time when you launched a product you had to make sure that safety was at the heart of it. I think the Secretary of State herself has said that the Bill process actually predates the launch of TikTok in the UK.
We view ourselves as an entertainment platform and to express yourself, enjoy yourself and be entertained you have to feel safe, so I do not think we would be seen as kicking and screaming under this regime. It is something that we have supported for a long time and we are regulated by Ofcom under the video-sharing platform, or VSP, regime. What the Bill will achieve is to raise the floor of industry standards, a bit like GDPR did for data, so that for all the companies in the future—to Alex’s point, this is about the next five and 10 years—there will be a baseline of standards that everyone must comply with and expectations that you will be regulated. Also, it takes a lot of these difficult decisions about the balance between safety and expression, privacy and security out of the hands of tech companies and into the hands of a regulator that, of course, will have democratic oversight.
Katy Minshall: I do not have very much more to add. We already engage positively with Ofcom. I remember appearing before a Select Committee back in 2018 or 2019 and at that point saying that we were absolutely supportive of Ofcom taking on this role and regulation potentially being a game changer. We are supportive of the systems and processes approach and look forward to engaging constructively in the regulation.
Q
Katy Minshall: I am glad you asked that question. The problem with the Bill is it depends on so many things that do not exist yet. We are looking at the Bill and thinking how we can prepare and start thinking about what is necessary, but in practice, content that is harmful to adults and harmful to children has not been set out yet. So much of the Bill depends on secondary legislation and codes of practice, and as I described earlier in the question from Alex Davies-Jones, there are such real workability questions around exemptions and ID verification that I worry there would be the risk of substantial delays at the other end, which I do not think anyone wants to see.
Ben Bradley: It is the same from our perspective. We have our community guidelines and we are committed to enforcing those at the moment. A lot of the detail of the Bill will be produced in Ofcom’s codes of practice but I think it is important we think about operationalising the process, what it looks like in practice and whether it is workable.
As Katy mentioned, how prescriptive the user empowerment duties would be and how they would work, not just for the platforms of today but for those of the future, is really important. For TikTok, to use a similar example on the user empowerment duties, the intent is to discover content from all over the world. When you open the app, you are recommended content from all sorts of users and there is no expectation that those would be verified. If you have opted into this proposed user empowerment duty, there is a concern that it could exacerbate the risk of filter bubbles, because you would only be receiving content from users within the UK who have verified themselves, and we work very hard to make sure there is a diverse range of recommendations in that. I think it is a fairly easy fix. Much like elsewhere in the Bill, where Ofcom has flexibility about whether to require specific recommendations, they could have that flexibility in this case as well, considering whether this type of power works for these types of platforms.
To use the example of the metaverse, how would it work once the metaverse is up and running? The whole purpose of the metaverse is a shared environment in which users interact, and because the Bill is so prescriptive at the minute about how this user empowerment duty needs to be achieved, it is not clear, if you were verified and I were unverified and you had opted not to see my content but I moved something in the shared environment, like this glass, whether that would move for everyone. It is a small point, but it just goes to the prescriptiveness of how it is currently drafted and the importance of giving Ofcom the flexibility that it has elsewhere in the Bill, but in this section as well.
Q
Katy Minshall: At present, we follow the industry standard of age self-declaration. How you manage and verify identity—whether using a real-name system or emerging technologies like blockchain or documentation—is at the heart of a range of industries, not just ours.
Technology will change and new products that we cannot even envisage today will come on to the market. In terms of what we would do in relation to the Bill, as I said, until we see the full extent of the definitions and requirements, we cannot really say what exact approach we would take.
(2 years, 6 months ago)
Public Bill Committees
Thank you very much. Don’t worry, ladies; I am sure other colleagues will have questions that they wish to pursue. Dean Russell, please.
Q
One of the reasons why we are bringing in this Bill is that platforms such as Facebook—Meta, sorry—just have not fulfilled their moral obligations to protect children from harm. What commitment are you making within your organisation to align yourself to deliver on the requirements of the Bill?
To be frank, the track record up until now is appalling, and all I hear in these witness sessions, including before Christmas on the Joint Committee, is that it is as though the big platforms think they are doing a good job—that they are all fine. They have spent billions of pounds and it is not going anywhere, so I want to know what practical measures you are going to be putting into place following this Bill coming into law.
Richard Earley: Of course, I do not accept that we have failed in our moral obligation to our users, particularly our younger users. That is the most important obligation that we have. I work with hundreds of people, and there are thousands of people at our company who spend every single day talking to individuals who have experienced abuse online, people who have lived experience of working with victims of abuse, and human rights defenders—including people in public life such as yourself—to understand the impact that the use of our platform can have, and work every day to make it better.
Q
Richard Earley: Again, we publish this transparency report every quarter, which is our attempt to show how we are doing at enforcing our rules. We publish how many of the posts that break our rules we take down ourselves, and also our estimates of how likely you are to find a piece of harmful content on the platform—as I mentioned, it is around three in every 10,000 for hate speech right now—but we fully recognise that you will not take our word for it. We expect confidence in that work to be earned, not just assumed.
That is why last year, we commissioned EY to carry out a fully independent audit of these systems. It published that report last week when we published our most recent transparency report and, again, I am very happy to share it with you here. The reason we have been calling for many years for pieces of legislation like this Bill to come into effect is that we think having Ofcom, the regulator—as my colleagues just said—able to look in more detail at the work we are doing, assess the work we are doing, and identify areas where we could do more is a really important part of what this Bill can do.
Q
Richard Earley: To start with, as I said, we are not waiting for the Bill. We are introducing new products and new changes all the time.
Q
Richard Earley: Well, I just spoke about some of the changes we made regarding young people, including defaulting them into private accounts. We have launched additional tools making it possible for people to put in lists of words they do not want to see. Many of those changes are aligned with the core objectives of the Bill, which are about assessing early the risks of any new tools that we launch and looking all the time at how the use of technology changes and what new risks that might bring. It is then about taking proactive steps to try to reduce the risk of those harms.
Q
Richard Earley: This is an issue we have discussed at length with DCMS, and we have consulted a number of people. It is, of course, one of the most sensitive, delicate and difficult issues we have to deal with, and we deal with those cases very regularly. In the process that exists at present, there are, of course, coronial powers. There is a process in the UK and other countries for coroners to request information.
When it comes to access for parents to individuals’ accounts, at present we have a system for legacy contacts on some of our services, where you can nominate somebody to have access to your account after you pass away. We are looking at how that can be expanded. Unfortunately, there are an awful lot of different obligations we have to consider, not least the obligations to a person who used our services and then passed away, because their privacy rights continue after they have passed away too.
Okay, so there is a compassion element. I am conscious of time, so I will stop there.
One moment, please. I am conscious of the fact that we are going to run out of time. I am not prepared to allow witnesses to leave without feeling they have had a chance to say anything. Ms Foreman, Ms O’Donovan, is there anything you want to comment on from what you have heard so far? If you are happy, that is fine, I just want to make sure you are not being short-changed.
Becky Foreman: No.
Katie O'Donovan: No, I look forward to the next question.
I have four Members plus the Minister to get in, so please be brief. I call Dean Russell.
Q
Lulu Freemont: On future-proofing, one of the real strengths of the Bill is the approach: it is striving to rely on systems and processes, to be flexible and to adapt to future technologies. If the Bill sticks to that approach, it will have the potential to be future-proof. Some points in the Bill raise a slight concern about how future-proof the regulation will be. There is a risk that mandating specific technologies—I know that is one of Ofcom’s powers under the Bill—would put a bit of a timestamp on the regulation, because those technologies will likely become outdated at some point. Ensuring that the regulation remains flexible enough to build on the levels of risk that individual companies have, and on the technologies that work for the development and innovation of those individual companies, will be a really important feature, so we do have some concerns around the mandating of specific technologies in the Bill.
On the point about setting up a committee, one of the things for which techUK has called for a really long time is an independent committee that could think about the current definitions of harm and keep them under review. As companies put in place systems and processes that might mitigate levels of risk of harm, will those levels of harm still be harmful? We need to constantly evolve the regime so that it is true to the harms and risks that are present today, and to evaluate it against human rights implications. Having some sort of democratically led body to think about those definitional points and evaluate them as times change and harm reduces through this regime would be very welcome.
Adam Hildreth: To add to that, are people starting to think differently? Yes, they definitely are. That ultimately, for me, is the purpose of the Bill. It is to get people to start thinking about putting safety as a core principle of what they do as an overall business—not just in the development of their products, but as the overall business. I think that will change things.
A lot of the innovation that comes means that safety is not there as the principal guiding aspect, so businesses do need some help. Once they understand how a particular feature can be exploited, or how it impacts certain demographics or particular age groups—children being one of them—they will look for solutions. A lot of the time, they have no idea before they create this amazing new metaverse, or this new metaverse game, that it could actually be a container for harmful content or new types of harm. I think this is about getting people to think. The risk assessment side is critical, for me—making sure they go through that process or can bring on experts to do that.
Ian Stevenson: I would split the future-proofing question into two parts. There is a part where this Bill will provide Ofcom with a set of powers, and the question will be: does Ofcom have the capacity and agility to keep up with the rate of change in the tech world? Assuming it does, it will be able to act fairly quickly. There is always a risk, however, that once a code of conduct gets issued, it becomes very difficult to update that code of conduct in a responsive way.
There is then a second piece, which is: are the organisations that are in scope of regulation, and the powers that Ofcom has, sufficient as things change? That is where the idea of a long-term committee to keep an eye on this is extremely helpful. That would be most successful if it did not compromise Ofcom’s independence by digging deeply into individual codes of conduct or recommendations, but rather focused on whether Ofcom has the powers and capacity that it needs to regulate as new types of company, platform and technology come along.
Q
Lulu Freemont: It is a great question. One of the biggest challenges is capacity. We hear quite a lot from the smaller tech businesses within our membership that they will have to divert their staff away from existing work to comply with the regime. They do not have compliance teams, and they probably do not have legal counsel. Even at this stage, to try to understand the Bill as it is currently drafted—there are lots of gaps—they are coming to us and saying, “What does this mean in practice?” They do not have the answers, or the capability to identify that. Attendant regulatory costs—thinking about the staff that you have and the cost, and making sure the regulation is proportionate to the need to divert away from business development or whatever work you might be doing in your business—are really fundamental.
Another real risk, and something in the Bill that smaller businesses are quite concerned about, is the potential proposal to extend the senior management liability provisions. We can understand them being in there to enable the regulators to do their job—information requests—but if there is any extension into individual pieces of content, coupled with a real lack of definitions, those businesses might find themselves in the position of restricting access to their services, removing too much content or feeling like they cannot comply with the regime in a proportionate way. That is obviously a very extreme case study. It will be Ofcom’s role to make sure that those businesses are being proportionate and understand the provisions, but the senior management liability does have a real, chilling impact on the smaller businesses within our membership.
Adam Hildreth: One of the challenges that we have seen over the last few years is that you can have a business that is small in revenue but has a huge global user base, with millions of users, so it is not really a small business; it just has not got to the point where it is getting advertisers and getting users to pay for it. I have a challenge on the definition of a small to medium-sized business. Absolutely, for start-ups with four people in a room—or perhaps even still just two—that do not have legal counsel or anything else, we need to make it simple for those types of businesses to ingest and understand what the principles are and what is expected of them. Hopefully they will be able to do quite a lot early on.
The real challenge comes when someone labels themselves as a small business but they have millions of users across the globe—and sometimes actually quite a lot of people working for them. Some of the biggest tech businesses in the world that we all use had tens of people working for them at one point in time, when they had millions of users. That is the challenge, because there is an expectation for the big-tier providers to be spending an awful lot of money, when the small companies are actually directly competing with them. There is a challenge to understanding the definition of a small business and whether that is revenue-focused, employee-focused or about how many users it has—there may be other metrics.
Ian Stevenson: One of the key questions is how much staffing this will actually take. Every business in the UK that processes data is subject to GDPR from day one. Few of them have a dedicated data protection officer from day one; it is a role or responsibility that gets taken on by somebody within the organisation, or maybe somebody on the board who has some knowledge. That is facilitated by the fact that there is a really clear set of requirements there, and there are a lot of services you can buy and consume that help you deliver compliance. If we can get to a point where we have codes of practice that make very clear recommendations, then even small organisations that perhaps do not have that many staff to divert should be able to achieve some of the basic requirements of online safety by buying in the services and expertise that they need. We have seen with GDPR that many of those services are affordable to small businesses.
If we can get the clarity of what is required right, then the staff burden does not have to be that great, but we should all remember that the purpose of the Bill is to stop some of the egregiously bad things that happen to people as a result of harmful content, harmful behaviours and harmful contact online. Those things have a cost in the same way that implementing data privacy has a cost. To come back to Lulu’s point, it has to be proportionate to the business.
(2 years, 8 months ago)
Commons Chamber
I had the great privilege of sitting on the Joint Committee on the draft Bill before Christmas and working with the Chair, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), fantastic Members from across both Houses and amazing witnesses.
We heard repeated stories of platforms profiting from pain and prejudice. One story that really affected me was that of Zach Eagling, a heroic young boy who has cerebral palsy and epilepsy and who was targeted with flashing images by cruel trolls to trigger seizures. Those seizures have been triggered for other people with epilepsy, affecting their lives and risking not just harm, but potentially death, depending on their situation. That is why I and my hon. Friend the Member for Stourbridge (Suzanne Webb)—and all members of the Joint Committee, actually, because this was in our report—backed Zach’s law.
Ten-year-old Zach is a child in my constituency who has, as the hon. Member said, cerebral palsy and epilepsy, and he has been subjected to horrendous online abuse. I hope that the Minister can provide clarity tonight and confirm that Zach’s law—which shows that not just psychological harm and distress, but physical harm can be created as a result of online abuse and trolling—will be covered in the Bill.
My understanding—hopefully this will be confirmed from the Dispatch Box—is that Zach’s law will be covered by clause 150 in part 10, on communications offences, but I urge the Ministry of Justice to firm that up further.
One thing that really came through for me was the role of algorithms. The only analogy that I can find in the real world for the danger of algorithms is narcotics. This is about organisations that focused on and targeted harmful content at people to get them to be more addicted to harm and to harmful content. By doing that, they numbed the senses of people who were using technology and social media, so that they engaged in practices that did them harm, turning them against not only others, but themselves. We heard awful stories about people doing such things as barcoding—about young girls cutting themselves—which was the most vile thing to hear, especially as a parent myself. There was also the idea that it was okay to be abusive to other people and the fact that it became normalised to hurt oneself, including in ways that cannot be undone in future.
That leads on to a point about numbing the senses. I am really pleased that in debating the Bill today we have talked about the metaverse, because the metaverse is not just some random technology that we might talk about; it is about numbing the senses. It is about people putting on virtual reality headsets and living in a world that is not reality, even if it is for a matter of minutes or hours. As we look at these technologies and at virtual reality, my concern is that children and young people will be encouraged to spend more time in worlds that are not real and that could include more harmful content. Such worlds are increasingly accurate in their reality, in the impact that they can have and in their capability for user-to-user engagement.
I therefore think that although at the moment the Bill includes Meta and the metaverse, we need to look at it almost as a tech platform in its own right. We will not get everything right at first; I fully support the Bill as it stands, but as we move forward we will need to continue to improve it, test it and adapt it as new technologies come out. That is why I very much support the idea of a continuing Joint Committee specifically on online safety, so that as time goes by the issues can be scrutinised and we can look at whether Ofcom is delivering in its role. Ultimately, we need to use the Bill as a starting point to prevent harm now and for decades to come.
(3 years, 11 months ago)
Commons Chamber
Of course I recognise that this is not the solution that we would have wanted, and it is not the solution that we fought really hard for. I point out to the hon. Lady that the Labour party voted for this deal in the full knowledge of what it involved, including the end to freedom of movement.
From The Horns to the Colosseum to the Palace Theatre, music literally beats at the heart of my constituency of Watford. That means that we have many amazing musicians. They are asking me whether my hon. Friend can confirm that it was not the UK that ended these visas, and what measures are going forward to support this amazing sector?
My hon. Friend is such a vibrant champion, and not just for the music industry in his constituency; we have also spoken about the film industry. I expect him at any moment to be descending from the ceiling on a wire in the next “Mission Impossible” movie. He does it all with great panache, and that is exactly what we want to do. The cultural recovery fund has been about supporting music venues so that musicians can get back to doing what they love. Arts Council England is there to support them. We will look at every opportunity we have to put in place more of that vital encouragement and support.
(3 years, 11 months ago)
Public Bill Committees
Q
Dr Sellars: You are quite right that 5G opens up a whole load of new benefits, predominantly high-speed access/lower latency. I think some of the security risks are around who is providing the infrastructure to support 5G. The concern that we have at the moment is that we need to have security of supply—both resilience of the supply chain for that infrastructure, and the cyber-security and encryption element of that infrastructure.
I think it is fair to say that 5G is likely to support a much broader selection of services. It is likely to have an impact on commercial, governmental and security transmission, just because of the widespread access and its very high-speed capability. It is also likely to support a very large number of internet of things devices—the sort of devices that UtterBerry develops. Some of those devices are another potential attack vector, if you like; they are another potential vulnerability. It is broadening the access into the network, which is potentially opening up new sorts of vulnerabilities that we need to take into consideration.
Dr Johnson: Let me start by saying that some aspects of security in 5G networks are actually much more secure than in previous generations. Looking over the lifetime of cellular, you will know that you could just listen into first generation analogue networks with a very high frequency radio. GSM—the global system for mobile communications—was secure, partly at least. The network and the phones would authenticate to each other, but only asymmetrically, so the phone could be captured by a surreptitious network. That sort of attack is still used.
3G is much more secure, with symmetric authentication. It is harder for devices to be captured by the wrong network, but it is still possible. It is also possible for the IMSI—that is to say, the international mobile subscriber identity—of an individual or group to be found from that network. The same is true of 4G. In 5G, that is much more difficult. In terms of the security of the user of the network, 5G has tightened up a lot of the loopholes in previous generations in a way that is very hard to unpick. That creates tactical problems for some law enforcement agencies, which rely on some of the insecurities of earlier generations to do their job.
From the network side of things, there are some issues. There is a new network model in terms of the way nodes are connected in the core network. No longer are there physical interfaces as in previous generations of network, where there would be an S1 connection from the base station to the core. There are still connections, but they are much more in a publish-subscribe-type model. I think those, conceivably at least, bring a little more opportunity for attackers to probe nodes within the core network to find weaknesses and vulnerabilities. That is my take on 5G.
Heba Bevan: We have three elements that the telecoms community could work on: the communication aspect, which is provided by companies such as BT; the hardware aspect, which is probably provided by companies such as UtterBerry; and the software element within the system. So there are three types of vulnerability that could be introduced in the path of these three elements. The only problem with these paths is this: who is responsible if there is an attack? Usually, the communication aspect is the most important part to get protected.
Currently with 5G, there is a huge opportunity to open up economic impact from the sector in terms of healthcare, education and the tech industries. These industries will need to move on, and having 5G is definitely an important element, but how can we make sure it is secure in providing an effective communications network with an end-to-end solution and security? That is where I think we need to concentrate: on the telecommunications, and how we can make sure that what we are getting from that communication is totally secure, and that the encryption within it passes certain thresholds.
We can follow a certain standard within the hardware and software, but if the network is weak and has not provided us with good reliability, that is where things could be broken.
Q
Is there a shelf-life of the older versions? I am surprised that we are still talking about 2G—that it has not been removed. Is there a shelf-life for those elements and will they be removed from what I term “the network”, which is of course the whole global telecommunications infrastructure of the UK? Nick, do you want to start on this question?
Dr Johnson: Yes. Let me start on that shelf-life question. GSM is a little bit like Radio Four longwave, right? I do not think that it is ever really going to die; there are just too many people who depend on it for one reason or another, whether that is for emergency calls, or just for coverage in remote locations or wherever. I think GSM will stay there forever, despite its security issues. They are well known and understood, and managed in due course.
The shelf-life of network components is an interesting aspect. Our experience of deploying into cellular networks is that there is always a security audit involved. When we take a piece of equipment into a new operator, there is always a hurdle to be overcome. They have their own audit procedures and those include a sort of paper audit, where they look at the particular software components that the software is built from, some of which we build ourselves, some of which is open source and some of which is commercial off-the-shelf software libraries and so on. They want to make sure that those are all up to date and properly patched, with all the latest security patches and so on. I think that will just continue on. To some extent, that is just the baseline hurdle.
I am not sure this is exactly what you are asking, but what has changed in my mind as we go forward is this idea that there can be software in the network that is not so much interested in security—as in, somebody hacking into it—but is more of a Trojan horse type of software, completely undetectable until some signal or some date comes by and it springs to life and does bad things. The example I have in mind is the SolarWinds example from December last year, where software had been inserted in the supply chain and had been sitting there quite happily for a while. That, to my mind, is very difficult to detect. Until it goes off, you do not know there is a bomb inside it, and that is an issue.
Coming back to the shelf-life question, keeping the software up to date is a major issue. It sounds easy, but practically speaking, I know it is an operational dialogue all the time within vendor businesses: they are striving for revenue from new customers, for new features to be added, and that is acting against updating the software libraries and so on to bring them up to date. There is a continual dialogue in every vendor company to ask, “Do we need these features to get more revenue, or do we need to update these libraries because we need to maintain secure software?” I guess to some extent, the whole reason for this Bill is to try and force that to the front of the conversation; to say, “Look, you can’t go on. That dialogue has to stop now. The software needs to be secure.” That has to be the baseline; it has to be a basic hygiene factor in selling software that it must be secure to a certain level, and the features need to come as value added. If you have some questions coming up on the code of practice, designated vendors and so on, we might talk about that, but those are my comments on shelf-life.
I think I missed your first question. I apologise.
Q
Mike Fake: I think the diversification strategy is important. It is great to see the national telecoms centre proposal and the £250 million for research and development. One concern is whether that will be enough. Listening to earlier parts of the hearing last week, BT said that it invests £500 million per annum, and Huawei has a revenue of probably $120 billion per year. What do they invest in research and development? Probably $2 billion a year. The opportunity I see is that we have a short-term focus for network equipment manufacturers to replace high-risk vendor equipment, but it will be difficult in that period for other new entrants to get their share.
The opportunity is to foster new entrants in technologies in the UK telecoms supply chain, and to leverage innovative solutions for manufacturing scale in the UK. Another issue is that there is a lot of focus on the radio access part of 5G, but that is only one small part of the network. There is optical fibre connectivity from the masts, and transport to the network’s core: that is critical to the network’s security and performance.
Helen Duncan: When I started my career, the industry was dominated by big names such as STC, Plessey, GEC and Racal. They all received funding from defence organisations such as the Royal Signals and Radar Establishment at Malvern. They used a lot of the spin-offs from that technology to develop their telecoms capability. That all ceased in the 1990s after the Berlin wall came down and cost-plus was abolished and so on. It is significant that independent industry research shrank in those times. We are now, at last, seeing a bit of stimulation going back into British industry thanks to the catapults, like Andy Sellars’, and this could be an opportunity, if not to return to those days, to put some investment in and to develop the talents we have in this country.
Dr Cleevely: The Bill is a great opportunity, as the other speakers have said. In technical jargon, it is a necessary but not sufficient condition. It does provide some great opportunities. I am an investor and have created a number of British companies of which, like you, I am very proud. We do, however, need to think carefully about how the market actually works. A number of speakers before us talked about the way in which the number of suppliers has come down in this business. We need to be careful in thinking about how we intervene to set the rules of the game and to encourage certain kinds of behaviour. I am very familiar with one example that relates not only to Government but also to large corporates: the notion that you go through a procurement department that is forcing you down on price, and it does not have the notion of innovation as one of its key performance indicators. The notion of innovation, on the other hand, is built into a lot of the systems that are employed in other countries, primarily the United States, as a way of evaluating whether a technology should be procured or not. We need to think rather more carefully about how we foster that development and growth of smaller companies into larger companies, particularly with this view about innovation.
For example, Ofcom is an economic regulator—one of 11 or so economic regulators in the UK. It has always, below the radar, treated innovation as one of the things it ought to be fostering. I would suggest, for example, that alongside the consideration of this Bill, we think about how we push innovation rather more firmly and put some money behind it in terms of procurement.
Q
Mike Fake: Obviously, we have got two things to do here. We need to replace the existing vendors’ equipment, but in parallel we can invest in the UK supply chain—we have a very healthy supply chain, in the sense that there are a lot of companies which provide optical components and subsystems to the equipment manufacturers. We need to do both things at once. We need to swap out the equipment, and also invest in the new companies coming up, so that in the future we can have a much more future-proof, innovative, secure and leading network.
Pushing the timescales forward, we have to recognise that in the short term we are going to be stuck with two alternative vendors that we need to swap out, but if we can invest in the up-and-coming, innovative, small SMEs and really foster those, as the previous speakers have said, I think we have got a real opportunity to change things and to have a world-leading British network with high UK content moving forward.
Thank you. Could I ask Helen the same question?
Helen Duncan: I think there are some real practical difficulties in swapping out the equipment. It sounds simple; you just take one radio out and put another one in, but I think you would find that cell sites would be down and consumers would be complaining.
There has been some research recently by a company—albeit funded by Huawei—called Assembly Research, which estimates that it would put the UK three years behind in its programme of 5G deployment. At a time when communications are key to our surviving the unusual circumstances of the pandemic, it seems counter-intuitive to think about putting even more strain on that by moving the deadline closer. I think perhaps it should be the installation engineers who work for the networks we should be putting this question to: how much disruption is it going to cause?
Thank you. David too?
Dr Cleevely: I would like to echo what Helen said, but in a rather different way. There is an engineering problem, which is what we have been dealing with, but there is also a human behavioural problem. Anybody who has worked in a large corporation or worked on these large projects will know that the way in which people approach the problem, and the way they think about it, the way they want to programme it and the urgency they feel, is driven as much by the psychological issues as it is by the technical. I would urge you to think through how you would encourage the behaviour that you want to see. Now, obviously Government can do that by simply issuing an edict and forcing a deadline, but there may be other ways that you can get more innovation and a more rapid shift than the 2027 deadline, simply by thinking through with the industry—going back to Helen’s point about the engineers on the ground—about what is required. A little bit more detailed thinking on that could yield some very positive result.
Q
Mike Fake: That is a difficult problem to solve, but I think it is important that innovation is a powerful force, and you can turn things around in this new world very quickly. Although you have old legacy systems, and you replace everything from overseas vendors with old legacy systems, you need to keep moving forward. In terms of optics, we probably have one of the world’s leading telecom fibre optics innovation capabilities, through the universities. We have a whole bunch of small and medium-sized enterprises out there, and they are struggling to make that step to some scale and to get that innovation deployed in the network. But I think innovation—
Q
Doug Brake: I think it is absolutely right that there is a real risk if we cut off supply to China, particularly in semiconductors. We have already seen aggressive action on their part to stand up an indigenous semiconductor industry. This is getting a little outside of my area of expertise; semiconductors is not an area that I know super well. However, I think that it is absolutely correct that there is a real risk that the extent to which we try to cut off Chinese companies will see them double down on efforts to create their own indigenous supply chain. So—absolutely.
I am hopeful that we see either a change to that or a much broader international coalition to double down on those efforts. I think that it is more likely that we will see a Biden Administration ease some of those restrictions, or work through the current legal means to allow for licences for companies to sell semiconductors to Huawei and others.
Q
Doug Brake: That is absolutely right. This is a long-term effort. I worry about some who tout ORAN as something of a silver bullet that we can make a quick transition to, that it is a flash cut for existing equipment providers to an open RAN sort of system—a more modular and diverse ecosystem. It is something that is going to take a number of years. I honestly worry that it is late for ORAN to be incorporated into 5G, at least on a broad scale. For greenfield networks, it is a different story and it might make sense to go with these open and modular systems from the get-go.
I worry that this is much more a conversation about putting in the tools, resources, testing facilities, the labs, R&D, et cetera, to put us on a path for years down the road so that this becomes the industry standard. I do think, absolutely, that this is the time to be looking at those early stage investments to be driving further and, frankly, looking down the road to 6G, to be able to put in place the policies and efforts to transition the industry to this more diverse future, and put those in place now for years to come.
Q
Doug Brake: I worry that sometimes 5G is conceptualised as a singular technology or a singular thing. It is not a monolith; there are a number of different component technologies and a number of different flavours. Depending on whether you are doing a fully 5G network, a stand-alone network or a non-stand-alone network, it is a very different sort of system. There are also a lot of differences between what spectrum is used to deploy the network—if you are using low-band, mid-band or high-band spectrum or a combination of all three. It is hard to answer that question in generalities.
A number of different component technologies and architectures will be rolled out over time. At a high level, the real advantage of 5G compared with 4G is in its flexibility. It is able to tailor its connectivity to a number of different applications’ needs. It can offer extremely high throughput and much faster speeds. It is very reliable, with very low latency. For example, if you want to stream a football match while travelling on a train, it can do that quite well, or quite a bit better than LTE and 4G today. At the same time, you can also change very obscure technical parameters to make for simple communications that require very little battery on the device side to be able to communicate. If you want to have massive deployments of sensors for smart agriculture, or something like that, that have battery life in the order of decades, it can do that. The hallmark is its flexibility.
Given that flexibility, it is anticipated that 5G is going to be much more deeply integrated within the economy and trade sectors, and will be a key tool to boost productivity. There is an important hope that we see a broad deployment, not just in urban areas but in rural areas. Again, I go back to that note on differences depending on the spectrum that is used to deploy—unless it is of interest, I do not want to get too bogged down in the details, but there are real differences in what we would expect to see deployed in urban versus rural areas. But, again, we would also expect to see very different use cases in those areas. Admittedly, there will likely be a performance difference between urban areas and more rural areas. But at the same time, like I said, the use cases look very different—you are not likely to have massive crowds of people all looking to share video from a stadium or something like that in rural areas. There will be a real difference in the roll-out, but I worry that sometimes the challenges with that have been overstated.
(3 years, 11 months ago)
Public Bill Committees
Q
Patrick Binchy: We know where all the equipment is for our main supplier, yes.
Derek McManus: On the question on the asset register, absolutely. As for whether networks are interconnected, Patrick gave a good answer. The O2 and Vodafone networks are somewhat different, in that we work together on a network share; the O2 team manages and maintains a network in a certain geography, and the Vodafone team manages and maintains a physical network in another geography. In that sense, the O2 and Vodafone networks are very interconnected.
Andrea Donà: It is vital that the secondary legislation that accompanies the Bill clarifies the assets in the telecoms network architecture that will be in scope of the security requirements, so that we can work knowing what we have audited, and knowing that the audits are always shared with the NCSC. We need a clear understanding between Ofcom and us as providers before the legislation is enforced, so that we understand exactly the boundaries and the scope, and we all work together, having done the audits, to close any vulnerabilities that we might have. That is a clear aspect of our working together: ensuring that the assets in the telecoms network infrastructure that are in scope are very well defined.
Q
Derek McManus: There are a number of different security threats. I will talk about the network from a physical point of view, though there are obviously also scams and threats through direct human contact. It is mostly penetration of the physical network, either from attack or from virus software. Attack is where foreign agencies or bodies look for vulnerabilities or holes in your defences. The role of the telecoms operator is to ensure that all its physical equipment and software are kept at the highest supported versions, to defend against attack. We see quite a high volume of attack, either DDoS or penetration, on a regular basis. As I said, we do cyber-security by design. It is built into the fundamental processes of expanding and adding to our network, to protect us from those very things.
Andrea Donà: To add to what Derek says, it is also important that Government play a role in securing the additional security needs across the whole ecosystem of the supply chain, including the vendors. With the ever-changing nature of the threats we are exposed to, as Derek explained in layman’s terms, we have to change the protocols and the rules by which we and our vendors implement our defence mechanisms.
It is important that the Government do not leave providers such as us alone to reinforce these additional minimum security standards; they should play an active role in ensuring that vendors adapt their technology road map, so that things are done in a much more future-ready, cyber-security-compliant manner, because we face an ever-changing picture and ever-changing scenarios.
Patrick Binchy: In terms of the threats and penetration, as Derek said, the key things are that they get into the networks, either to bring the networks down and create chaos for the UK economy, or to extract information from the networks. All our security, as both my colleagues have said, is built into design, right from the very start of the procurement process. How do we protect against, and build networks that are able to detect, avoid and block, any of those risks and threats? We do that through our knowledge, the knowledge of NCSC and the authorities, and the knowledge of the wider industry on what is going on beyond the UK and in the international regime. We are constantly reviewing and updating our capability to protect against any of those threats.
Gentlemen, we are right up against the clock. We have seven minutes left. Your answers are superb, but they need to be pithy, because we have three sets of questions coming and we need to get the answers in, and I am afraid that 12.30 pm is a hard cut-off; I am not allowed to extend beyond that.
Thank you. The running order is Dean Russell, Miriam Cates, Kevan Jones, Christian Matheson and Chi Onwurah.
Q
Alex Towers: I think we see long term that diversification of vendors would be good for the operators in the marketplace if we can get to that point. It is important to say, I suppose, as the other operators were doing earlier on, that we are not at that point right now, so we are having to manage a situation where with the market as it stands we have a small number of very large-scale, important vendors and suppliers and we are having to remove one of them, clearly, from the 5G marketplace. That creates a degree of complexity and engineering difficulty that we need to just work our way through; so there is a lot of work to do just to manage within the current market framework to replace Huawei and to bring Nokia and Ericsson to the point we want. While we are doing that, if we can at the same time create the prospects of, in the longer term, a more open marketplace with a wider range of vendors—with other-scale vendors that do not quite work at the minute in the UK market, and Howard could probably explain exactly why that is, as well as with the potential for open RAN and other types of technology and software-based models to be developed—that is good for the whole industry and could be good for UK jobs and potential UK companies and therefore also for the citizen.
Howard Watson: I certainly welcome the Government’s supply chain diversification initiative here. It is concerning that we are moving from, essentially, three suppliers in the mobile supply chain down to only two. Our network going forward will use both of those. So widening that choice over time, for all the operators in the UK, is I think a critical opportunity. Please bear in mind that most operators quite like to have a primary source and a second source. It is unlikely that we will all start deploying equipment from four or five different vendors, because the operational challenge of the person in the van maintaining that tends to limit you to a choice of two; but being able to choose two from six is a lot better than choosing two from two, of course.
We welcome the three initiatives, which I will summarise. The first is whether we can encourage Samsung, NEC and other large vendors who build mobile networks elsewhere to enter the UK market. The second is open RAN, which, through more open standards, creates the ability to have more players in that end-to-end solution. The third is to have a thriving research agenda for the UK. We really welcome the £250 million allocated in the recent spending review. We already have a thriving research capability in the UK, and I think continuing to focus it on antenna design, optoelectronics and semiconductors will have a role to play in diversification going forward.
Q
Howard Watson: I actually think the structure of the Bill accommodates that quite well. It allows secondary legislation and guidelines to be upgraded. We note the critical role of the National Cyber Security Centre working with Government in doing that. I think, actually, you have taken care of that well with the way the Bill is structured.
Alex Towers: Yes, I would completely agree with that. Our slight concern at the minute is to see some of the detail that is going to sit underneath the Bill, in the code of practice in particular and in secondary legislation, because that is where it will become clear exactly what the implications are for operators. The sooner we can see some of that detail and get into the teeth of it, the better; but the way the Bill is structured, to allow that sort of detail to be updated on a regular basis as the world changes around us, seems totally sensible.
(3 years, 11 months ago)
Public Bill Committees
Q
Matthew Evans: I am happy to take that as well. We completely agree with the overall objective of the Bill, which we think provides clarity to the sector and helps us to further enhance the security and resilience of the UK’s telecommunication networks. Obviously, as more and more services and applications are used over our fixed and mobile networks, ensuring their security and resilience is incredibly important. That is why we are pleased to welcome the Bill and the associated diversification strategy alongside it, which is obviously separate to the Bill but intrinsic to matters of resilience as we seek to broaden the supply chain.
Hamish MacLeod: I should perhaps reiterate what my colleague said this morning—that the mobile sector very much welcomes the Bill. Security has always been a top priority for mobile operators. We have always worked closely with the National Cyber Security Centre, but this is a great opportunity to formalise the arrangements and to make them more structured and transparent.
Chi Onwurah, did I detect that you were going to ask questions on behalf of Catherine West?
Q
Charles Parton: I think you are absolutely right to focus on our Five Eyes allies, in particular America and Australia—Canada and New Zealand at the moment are a little bit undeclared—which have come out very forthrightly to say that we really should not be entertaining Huawei in our systems. We have now followed them—even if only by 2027—and I think that is very much the right decision for a number of reasons, which I could go into if you wish me to.
I am not a technologist; I look at it much more from the political angle. On the technology, if I may say so briefly: the 5G system is going to last us for the best part of 25 years, and on it, no doubt, 6G will be built. The idea that we can stay ahead in technology, and be absolutely certain for the next two or three decades that we are ahead of the game and can keep them from manipulating our data or using it in some advantageous fashion, requires very great trust in our own abilities, not least because they are putting enormous resources into it.
There are other reasons why the decision to get rid of Huawei was correct, and one is what I call the “black vulture of policy”. We have seen the way in which China will bully and sit on those countries that go against its wishes, in whatever field—way outside telecom. If you are dependent on another country’s systems, whether for getting equipment on time, or upgrades—let alone the more devious aspects of possible interference—I think that you will be looking at that black vulture and thinking, “Is it safe to pursue a policy that is very much in my interests, on telecoms, if I am going to be hit hard in other areas?” We have seen that: Australia, at the moment, is under the cosh; the UK was under the cosh when the Dalai Lama visited in 2012; Norway has been under the cosh, and so on.
In that context, are we saying that Huawei rules the Chinese Communist party’s policies? Of course not, but they are very intimately linked. I think that if the Chinese Communist party says to Huawei, “Jump!”, the only response from Huawei is, “Yes, sir! In what direction and how high?” You might look at the national security laws and say that those of course oblige them to co-operate and all that, but I do not think that matters so much—if the Communist party says, “Do it!”, they have no choice. As another illustration of how close they are, look at what is happening in Canada with the two hostages and the chief financial officer, Meng Wanzhou. Again, I could go into more detail if you want.
Also, there is the financial support that Huawei has received over the years, in terms of cheap finance, loans to customers, tax rebates and so on. Why does it do that? Because the Communist party wants to dominate the technology of the future, and Huawei is its tool for doing that. So I think that to trust Huawei in the long term would be a very unwise decision.
Dr Steedman: Can I take us back to the Bill and talk in that context? We are in a period of very rapid technological development and evolution. Many countries, including the Five Eyes countries, have allowed the market to drive this forward and have perhaps not paid attention to it. While this was a hardware-driven sort of infrastructure, that was possibly manageable, and we have managed it over the last few years fairly satisfactorily. But looking ahead to the 5G and, perhaps—who knows?—the 6G world, we have moved to a much more vulnerable position, away from hardware and towards software.
I welcome this Bill because I think it is incumbent on countries that want to protect themselves to have secure and resilient infrastructure, and because it puts in place a structure of regulation, guidance and standards, which I represent, that will enable a transformation in the industry of the United Kingdom. It will enable us to use technology and software from providers all over the world, but also from SMEs and start-ups in the UK that we can encourage, and to create a really innovation-friendly future. But to do that we have to create a market framework that is structured under a quality piece of regulation that enables that to take place in a clear way—clear for the market, clear for the regulator Ofcom, and clear for the Department that manages it on behalf of the Government.
In this Bill we see clear statements about new duties, codes of practice and guidance—another form of standard—to be approved by a Secretary of State for the industry, and also indications about the use of industry standards to support and deliver a new policy. We can really play to our strength in the UK, where we work in a very performance-based market structure, and we can enable a pro-innovation culture that will stimulate and deliver the diversification, security and resilience that we are looking for.
It is not unusual in the world that major commercial players, given free rein, try to influence things in the direction that suits them best. We are talking about China specifically, but it is not unusual. The key to this is ensuring that, in the standards landscape that is used to support the delivery of regulatory bodies, the governance and processes of the development of those standards are managed and influenced with UK stakeholder interests at heart. In the big landscape of standards, which we might want to talk about further, there is a very wide range of organisations developing standards, from the fringes to the formal systems, and we can discuss and deploy that in a coherent and consistent way.
There is evidence from other Departments of how this works in a co-regulatory manner, supporting industry, Government, Departments and the regulator to deliver the outcomes that we as a nation desperately want.
Q
Charles Parton: Of course, Huawei got the headlines because of the urgent need for 5G, but you are absolutely right that it is not the only player in telecoms, and indeed telecoms is not the only subject. I think that we need to look much more seriously at the whole question of technological co-operation with China. This gets into the whole question of divergence, or decoupling if you are American.
We have to recognise that, whereas our aim in China relations is to maximise trade, investment, global goods and so on, there are increasingly limits because divergence is happening. The intention of the Chinese Communist party is to dominate. As Xi Jinping in fact said in his first speech to the Politburo, the intention is to dominate western capitalism. He said that the Chinese system will take the superior position. Clearly, technology and its advance is a very important way of doing that, so it is not just Huawei and 5G. Therefore, we have to look very carefully at the whole question—that, I suppose, is what lies behind the National Security and Investment Bill—of how we co-operate on technology with China.
I have called for this a number of times, as many others have. The Government will need to set up a body and give much clearer guidance on which subjects in this field of technology we can co-operate happily with China, as well as which organisations—many are connected with the military, and the distinction between civil and military technology is eroding—and which individuals, because there are a number of individuals who have taken back or collected technology to help the Chinese security apparatus develop it.
You are absolutely right that it is really important to look much more broadly than Huawei. The company that comes immediately to mind is Hikvision, because it has such a large amount of the CCTV market. Secretary of State Dominic Raab made an interesting point in his speech the other day about the reputational harm that could be done to some of our companies if they are co-operating with Chinese companies that are deeply involved in the surveillance state, of which of course Huawei and Hikvision are two. Huawei has three laboratories with the public security bureau in Xinjiang, and is devising for them technology that will enable them to pick out Uyghur faces in crowds. That is on that side.
I think your second question was, why has Huawei been successful?