(1 year, 11 months ago)
Commons Chamber
I agree with my hon. Friend, which is why I think it is important that immigration offences were included in schedule 7 of the Bill. I think this is something my right hon. Friend the Member for Croydon South felt strongly about, having been Immigration Minister before he was a tech Minister. It is right that this has been included in the scope of the Bill and I hope that when the code of practice is developed around that, the scope of those offences will be made clear.
On whether advertising should be included as well as other postings, it may well be that at this time the Online Safety Bill is not necessarily the vehicle through which that needs to be incorporated. It could be done separately through the review of the online advertising code. Either way, these are loopholes that need to be closed, and the debate around the Online Safety Bill has brought about a recognition of what offences can be brought within the regulatory scope of the Bill and where Ofcom can have a role in enforcing those measures. Indeed, the measures on disinformation in the National Security Bill are a good example of that. In some ways it required the National Security Bill to create the offence, and then the offence could be read across into the Online Safety Bill and Ofcom could play a role in regulating the platforms to ensure that they complied with requests to take down networks of Russian state-backed disinformation. Something similar could work with immigration offences as well, but whether it is done that way, through the online advertising review or through new legislation, this is a loophole that needs to be closed.
I am learning so much sitting here. I am going to speak just on child protection, but all of us are vulnerable to online harms, so I am really grateful to hon. Members across the House who are bringing their specialisms to this debate with the sole aim of strengthening this piece of legislation to protect all of us. I really hope the Government listen to what is being said, because there seems to be a huge amount of consensus on this.
The reason I am focusing on child protection is that every police officer in this field that I talk to says that, in almost every case, abusers are now finding children first through online platforms. We cannot keep up with the speed or the scale of this, so I look to this Bill to try to do so much more. My frustration is that when the Bill was first introduced, we were very much seen as a world leader in this field, but the abuse has become so prolific, other countries have stepped in, and we are now sadly lagging behind. I really hope the Minister does everything he can to get this into law as soon as possible.
Although there are aspects of the Bill that go a long way towards tackling child abuse online, it is far from perfect. I want to speak on a number of specific ways in which the Minister can hopefully improve it. The NSPCC has warned that over 100 online grooming and child abuse image crimes are likely to be recorded every day while we wait for this crucial legislation to pass. Of course, that is only the cases that are recorded; the true number will be far greater. There are vital protections in the Bill, but there is a real threat that the use of virtual private networks—VPNs—could undermine the effectiveness of these measures. VPNs allow internet users to hide private information such as their location and data. They are commonly used, and often advertised, as a way for people to protect their data or watch online content. For example, content on streaming services such as Netflix may be available only in the US, so people use a VPN to circumvent that restriction and watch it in this country.
During the Bill’s evidence sessions, Professor Clare McGlynn said that 75% of children aged 16 and 17 used, or knew how to use, a VPN, which means that they can avoid age verification controls. So even if companies use age assurance tools, as listed in the safety duties of this Bill, there is no guarantee that they will provide the protections that are needed. I am also concerned that the use of VPNs could act as a barrier to removing indecent or illegal material from the internet. The Internet Watch Foundation maintains a blocking list that internet service providers use to block this content, but users connecting through a VPN are usually not covered by those protections. It also concerns me that a VPN could be used in court to circumvent this legislation, which is very much based in the UK. Have the Government tested what will happen if someone uses a VPN to give the appearance of being overseas?
My new clause 54 would require the Secretary of State to publish, within six months of the Bill’s passage, a report on the effect of VPN use on Ofcom’s ability to enforce the requirements under clause 112. If VPNs cause significant issues, the Government must identify those issues and find solutions, rather than avoiding difficult problems.
New clause 28 would establish a user advocacy body to represent the interests of children in regulatory decisions. Children are not a homogeneous group, and an advocacy body could reflect their diverse opinions and experiences. This new clause is widely supported in the House, as we have heard, and the NSPCC has argued that it would be an important way to counterbalance the attempts of big tech companies to reduce their obligations, attempts that place their interests over children’s needs.
I would like to see more third sector organisations consulted on the code of practice. The Internet Watch Foundation, which many Members have discussed, already has the necessary expertise to drastically reduce the amount of child sexual abuse material on the internet. The Government must work with the IWF and build on its knowledge of web page blocking and image hashing.
Girls in particular face increased risk on social media, with the NSPCC reporting that nearly a quarter of girls who have taken a nude photo have had their image sent to someone else online without their permission. New clauses 45 to 50 would provide important protections for women and girls against intimate image abuse by making the non-consensual sharing of such photos illegal. I am pleased that the Government have announced that they will look into introducing these measures in the other place, but we have yet to see any measures that compare with these new clauses.
In the face of the huge increase in online abuse, victims’ services must have the necessary means to provide specialist support. Refuge’s tech abuse team, for example, is highly effective at improving outcomes for thousands of survivors, but the demand for its services is rapidly increasing. It is only right that new clause 23 is accepted, so that a good proportion of the revenue raised under the Bill’s provisions goes towards funding these vital services.
The landmark report by the independent inquiry into child sexual abuse recently highlighted that, between 2017-18 and 2020-21, there was an approximately 53% rise in recorded grooming offences. With this crime increasingly taking place online, the report emphasised that internet companies will need more moderators to aid technology in identifying this complex type of abuse. I urge the Minister to also require internet companies to provide sufficient and meaningful support to those moderators, who have to view and deal with disturbing images and videos on a daily basis. They, as well as the victims of these horrendous crimes, deserve our support.
I have consistently advocated for increased prevention of abuse, particularly through education in schools, but we must also ensure that adults, particularly parents, are educated about the threats online. Internet Matters found that parents underestimate the extent to which their children are having negative experiences online, and that the majority of parents believe their 14 to 16-year-olds know more about technology than they do.
The example that most sticks in my mind was provided by the then police chief in charge of child protection, who said, “What is happening on a Sunday night is that the family are sitting in the living room, all watching telly together. The teenager is online, and is being abused online.” In his words, “You wouldn’t let a young child go and open the door without knowing who is there, but that is what we do every day by giving them their iPad.”
If parents, guardians, teachers and other professionals are not aware of the risks and safeguards, how are they able to protect children online? I strongly encourage the Government to accept new clauses 29 and 30, which would place an additional duty on Ofcom to promote media literacy. Minister, you have the potential—
Thank you, Madam Deputy Speaker. The Minister has the potential to do so much with this Bill. I urge him to do it, and to do it speedily, because that is what this country really needs.
I do not agree with every detail of what the hon. Member for Rotherham (Sarah Champion) said, but I share her aims. She has exactly the right surname for what she does in standing up for children.
To avoid the risk of giving my Whip a seizure, I congratulate the Government and the Minister on all they have done so far, both in delaying the Bill and in modifying their stance.
My hon. Friend the Member for Solihull (Julian Knight), who is no longer in the Chamber, said that this is five Bills in one and should have had massively more time. At the risk of sounding like a very old man, there was a time when this Bill would have had five days on Report. That is what should have happened with such a big Bill.
Opposition Members will not agree, but I am grateful that the Government decided to remove the “legal but harmful” clause. The simple fact is that the hon. Member for Pontypridd (Alex Davies-Jones) and I differ not in our aim—my new clause 16 is specifically designed to protect children—but in the method of achieving it. Once upon a time, there was a tradition that this Chamber would consider a Companies Bill every year, because things change over time. We ought to have a digital Bill every year, specifically to address not “legal but harmful” but the question, “Is it harmful enough to be made illegal?” Obviously, self-harm material is harmful enough to be made illegal.
The hon. Lady and I have similar aims, but we have different perspectives on how to attack this. My perspective is as someone who has seen many pieces of legislation go badly wrong despite the best of intentions.
The Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Sutton and Cheam (Paul Scully), knows he is a favourite of mine. He did a fantastic job in his previous role. I think this Bill is a huge improvement, but he has a lot more to do, as he recognises with the Bill returning to Committee.
One area on which I disagree with many of my hon. and right hon. Friends is the question of encryption. The Bill allows Ofcom to issue notices directing companies to use “accredited technology,” but it might as well say “magic,” because we do not know what is meant by “accredited technology.” Clause 104 will create a pressure to undermine the end-to-end encryption that is not only desirable but crucial to our telecommunications. The clause sounds innocuous and legalistic, especially given that the notices will be issued to remove terrorist or child sexual exploitation content, which we all agree has no place online.
I do not think I am changing my view. I am saying that this is not the last stage of the Bill, so there will be plenty of opportunity further to test this, should Members want to do so.
On new clause 28, the Government recognise and agree with the intent behind this amendment to ensure that the interests of child users of regulated services are represented. Protecting children online is the top priority in this Bill, and its key measures will ensure that children are protected from harmful content. The Bill appoints a regulator with comprehensive powers to force tech companies to keep children safe online, and the Bill’s provisions will ensure that Ofcom will listen and respond to the needs of children when identifying priority areas for regulatory action, setting out guidance for companies, taking enforcement action and responding to super-complaints.
Right from the outset, Ofcom must ensure that its risk assessment and priorities reflect the needs of children. For example, Ofcom is required to undertake research that will help it to understand emerging risks to child safety. We have heard a lot today about the emerging risks with changing technology, and it is important that we keep on top of those and keep the voice of children at the heart of this. The Bill also expands the scope of the Communications Consumer Panel to online safety matters. That independent panel of experts ensures that user needs are at the heart of Ofcom’s regulatory approach. Ofcom will also have the flexibility to choose other mechanisms to better understand user experiences and emerging threats. For example, it may set up user panels or focus groups.
Importantly, Ofcom will have to engage with expert bodies representing children when developing codes of practice and other regulatory guidance. For example, Ofcom will be required to consult persons who represent the interests of children when developing its codes of practice. That means that Ofcom’s codes will be fully informed by how children behave online, how they experience harm and what impact the proposed measures will have on their online experience. The super-complaints process will further enable independent bodies advocating for children to have their voices heard, and will help Ofcom to recognise and eliminate systemic failures.
As we have heard, the Government also plan to name the Children’s Commissioner for England as a statutory consultee for Ofcom when it develops its code of practice. That amendment will be tabled in the House of Lords. Through this consultation, the commissioner will be able to flag systemic issues or issues of particular importance to the regulator, helping Ofcom to target investigations and, if necessary, sanctions at matters that most affect children’s online experience.
As such, there are ample opportunities in the framework for children’s voices to be heard, and the Government are not convinced of the need to legislate for another child user advocacy body. There are plenty of bodies out there that Ofcom will already be reaching out to, and there is an abundance of experience in committed representative groups that are already engaged and will continue to be engaged with the online safety framework. They include the existing statutory body responsible for promoting the interests of children, the Children’s Commissioner. Adding another statutory body would duplicate existing provision and create a confusing landscape, which would not be in the best interests of children.
I hear what the Minister is saying about creating a statutory body, but will he assure this House that there will be a specific vehicle for children’s voices to be heard? I ask because most of us here do not face, as our children do, the daily traumas and the constant creation of new apps and new ways in which social media reaches out to children. Unless their voice is heard, this Bill is not going to be robust enough.
As I say, we are putting the Children’s Commissioner as a statutory consultee in the Bill. Ofcom will also have to have regard to all these other organisations, such as the 5Rights Foundation and the NSPCC, that are already there. It is in the legislation that Ofcom will have to have regard to those advocates, but we are not specifically suggesting that there should be a separate body duplicating that work. These organisations are already out there and Ofcom will have to reach out to them when coming up with its codes of practice.
We also heard from my hon. Friend the Member for Dover (Mrs Elphicke) about new clause 55. She spoke powerfully and I commend her for all the work she is doing to tackle the small boats problem, which is affecting so many people up and down this country. I will continue to work closely with her as the Bill continues its passage, ahead of its consideration in the Lords, to ensure that this legislation delivers the desired impact on the important issues of illegal immigration and modern slavery. The legislation will give our law enforcement agencies and the social media companies the powers and guidance they need to stop the promotion of organised criminal activity on social media. Clearly, we have to act.
My right hon. Friend the Member for Witham (Priti Patel), who brings to bear her experience as a former Home Secretary, spoke eloquently about the need for joined-up government, to make sure that the many pieces of legislation and all Departments are working together in this space. This is a really good example of where joined-up government is needed.
(3 years, 8 months ago)
Commons Chamber
The economic impact of the coronavirus pandemic has been immediate and severe. While it is welcome that the Government have taken some steps to protect jobs in the short term, the reality for many is that they face losing jobs that were stable and secure prior to this crisis.
Sadly, that process has already begun in Rotherham. In my constituency, 75 workers at Rolls-Royce face imminent redundancy. Those are well-paid, highly skilled jobs, and their loss will have a devastating impact on the town. Rolls-Royce’s Rotherham facility is based at the Advanced Manufacturing Park, which I know will be familiar to the Minister. The park is a world-class base for innovation, research and manufacturing and the jewel in the crown of our local economy. The Government are more than happy to use it as a backdrop for policy announcements. What they must do now, however, is defend its long-term future.

The aerospace sector has been hit especially hard by the pandemic, and I appreciate the profound challenges that Rolls-Royce faces. The need for the Government to support this strategically important industry is self-evident. They must recognise the inextricable link between aerospace and the wider aviation sector. That is particularly true for businesses such as Rolls-Royce, which derives revenue from the flying hours of the engines it produces. The global travel taskforce is an important step, but the aviation and aerospace sectors need a clear exit strategy, and one that works internationally. These businesses are global and do not work just to a UK boundary.
The UK should use its chairing of the G7 this year to create a global plan to get aviation flying again. Aerospace and aviation are industries that may take considerably longer than others to recover once restrictions are lifted. The Government need to acknowledge that additional, long-term business support will be needed. This means also accepting that measures such as furlough may need to continue beyond September in certain sectors. The Government should view this as an investment for future prosperity. As the Chancellor himself acknowledged in the Budget:
“Business investment creates jobs, lifts growth, spurs innovation and drives productivity.”—[Official Report, 3 March 2021; Vol. 690, c. 527.]
I agree, but the Government’s rhetoric on levelling up the north will ring hollow if they stand idly by while dedicated, highly skilled workers lose their jobs.
While I am sympathetic to the challenges that the industry faces, Rolls-Royce is not without fault. I am concerned that despite the furlough scheme, it is pressing ahead with substantial job losses. Furlough’s very purpose is to prevent that from happening, so why is it not using it? These are skilled employees who will not easily be replaced as the industry recovers. To show them the door now is deeply short-sighted and will have wider implications for the supply chain and Rotherham’s economy. Taxpayer support for business to survive the current crisis must aim to protect jobs and not the bottom line of shareholders.
(5 years, 1 month ago)
Commons Chamber
My hon. Friend highlights one of the crucial differences between our new policy approach and the old one, which is that we are now able, via the “Online Harms” White Paper, to consider what the duty of care might mean for social media companies in a way that would not have been in the scope of the original proposal. That is just one example that demonstrates how much further we are able to go with this new approach, and it is a reason why this is the right thing to do, even though it is a tough decision.
I am really struggling to understand the logic here. Some 95% of 14-year-olds have seen porn, and the harm that it causes to future relationships is well documented. Why, when the age verification regulator was ready to implement this measure by Christmas, can it not go ahead? When, under the Minister’s new proposals, will we see protections in place for children?
I sympathise with what the hon. Lady seeks to achieve, but we can do more by going slightly slower. As I have said, we will respond to the consultation by Christmas and bring forward legislation for prelegislative scrutiny in the new year. I hope that she will work with us on that. We will, of course, seek to bring forward this part of that agenda much more rapidly than the whole package, because, as she says, this is hugely important. Getting it right is important, but getting it enacted quickly is also important.
(5 years, 11 months ago)
Commons Chamber
Order. Before I call the Opposition spokesman, let me say it will be obvious that many people wish to speak. This debate runs until 8.36 pm and I see people with large wads of notes. It might be helpful for colleagues to know now that they should edit down their notes to some three or four minutes.
On a point of order, Madam Deputy Speaker. You can see by the number of people who want to speak and the amount of notes we have that this is something we are really keen for the Government to get right. May I therefore ask whether there is any opportunity to extend the debate, at least towards its allocated time?
That is a perfectly reasonable point of order, but not now. There was a point when Mr Speaker asked whether the House agreed to take the three matters we are discussing this evening together or separately. At that point, anyone could have objected and each would have been taken separately; thus there would have been a much longer debate, but I am afraid that that moment has passed. However, it is very good, just for once, to have a point of order that is a real point of order, and I thank the hon. Lady for it.
I will rattle through some points, because I would like them to be on the record for the Minister and the Secretary of State.
On the guidance on the ancillary service providers, under section 15(1)(d) of the Digital Economy Act 2017 and annex one of the guidance, pornographic material is defined as a video work or material that has been issued an 18 certificate and that
“it is reasonable to assume from its nature was produced solely or principally for the purposes of sexual arousal”.
This is a neutral definition that fails to recognise that porn is almost always coercive, usually violent, aggressive and degrading, and is gendered. It is also almost always men doing it to women. Other countries are broad in their definition of pornography, to capture that aspect of it. In Spain, it is defined as “pornography, gender violence, mistreatment”, and in Poland as very strong and explicit violence, racist comments, bad language and erotic scenes. Does the Minister agree that our definition could be amended to acknowledge that pornography represents gendered violence, misogyny and abuse?
Am I right that the point my hon. Friend wants to register this evening is that there is much to learn from other countries?
That is absolutely right, and that becomes more apparent as we go forward. This legislation is very UK-based; pornography, of course, is international.
Minister, I am very concerned about the ability of the BBFC to compel ancillary service providers and payment-service providers to block access to non-compliant pornography services, as described under sections 21 and 23 of the Digital Economy Act. What power does the BBFC have to force companies to comply with its enforcement measures? What happens if credit card companies, banks or advertising agencies refuse to comply? I know of pornographic sites that accept supermarket points instead of cash to get around similar legislation in other countries. What assessment has the Minister made of the likelihood of opportunistic websites being established to circumvent UK legislation, and of the child protection risks that follow? It is unclear how the BBFC will appraise sites and what review mechanisms it will put in place to judge whether the scheme is effective in practice.
Under part 1, paragraph 10 of the guidance:
“The BBFC will report annually to the Secretary of State”.
Will the Minister commit to an interim review six months after the implementation date, so that we can see whether this is working? Under part 1, paragraph 11 of the guidance,
“the BBFC will…carry out research… into the effectiveness of the regime”
with a view to child protection “from time to time”. As that is the very purpose of the legislation, does the Minister agree that this should occur at least every two years? Under part 2, paragraph 7 of the guidance,
“the BBFC will…specify a prompt timeframe for compliance”.
However, there is no detail on what this timeframe is. It could be a week—it might be a year. Will the Minister please explain the timetable for enforcement?
The guidance also details the enforcement measures available to the BBFC in the case of a non-compliant provider. I broadly welcome those enforcement measures, but I am concerned about the ability of the BBFC to take action. Will the Minister tell us which body will be effectively enforcing these punishments? Will it be the Department for Digital, Culture, Media and Sport or the Home Office? Will the Minister put on the record the additional resources being committed both to the BBFC and to whichever Government agency is meant to enforce the legislation?
Turning to the BBFC guidance on age-verification arrangements, I want to register my concerns about the standards laid out on what constitutes sufficient age verification from providers. Section 3, paragraph 5 mentions
“an effective control mechanism at the point of registration or access by the end user which verifies that the user is aged 18 or over at the point of registration or access”.
That is very vague and could in practice mean any number of methods, many of which are yet to be effectively put to the test and some of which may jeopardise the security of personal data. That raises concerns about the robustness of the whole scheme, so will the Minister detail how she plans to ensure that the qualifying criteria are not so lax as to be useless?
Part 4, paragraph 3a states that
“age-verification systems must be designed with data protection in mind—ensuring users’ privacy is protected by default”.
Has the Minister also made an assessment of the safeguarding implications for the personal data of children, some of whom may attempt to falsify their age to access pornographic imagery? Following the Ashley Madison data hack, that has concerning implications for adults and children alike. While age verification certainly is not a silver bullet, as an idea it does have a place in a regulatory child protection framework. However, we need to ensure that that framework is as robust as it can be. Guidelines for websites that host pornographic material must be clear, so that the policy can be rigorously applied and potential loopholes are closed.
I also want to say that this has to work across Government. At the moment, we are still waiting for the Department for Education to bring forward the guidance on relationship and sex education. Unless we prevent, we cannot—
(6 years, 10 months ago)
Commons Chamber
My hon. Friend is absolutely spot on. Without that decision and without the support to bring down the threshold to £150,000, there would still be silence on this issue, and now there is not, which is good.
This urgent question plus the gender pay gap figures released at the weekend show that gender assumptions across the UK are still pervasive—assumptions about what a woman is worth, what her potential is and what she can aspire to. What will the Minister do in his new role to tackle those assumptions?
Getting to the bottom of this problem in the BBC is not just important for the BBC itself and for all the brilliant women who work in the BBC and who are not paid as much as their male counterparts doing the same job. It is symbolic across the whole country and shows that we believe in the equality of opportunity and in people being paid fairly. Gender should not define how much an individual is paid.