Kirsty Blackman (Scottish National Party, Aberdeen North)
I rise to agree with all the amendments in this group that have been tabled by the Opposition. I want to highlight a couple of additional groups who are particularly at risk in relation to fraudulent advertising. One of those is pensioners and people approaching pension age. Because of the pension freedoms that are in place, we have a lot of people making uninformed decisions about how best to deal with their pensions, and sometimes they are able to withdraw a significant amount of money in one go. For an awful lot of people, withdrawing that money and paying the tax on it leads to a major financial loss—never mind the next step that they may take, which is to provide the money to fraudsters.
For pensioners in particular, requiring adverts to be clearly different from other search results would make a positive difference. The other thing that we have to remember is that pensioners generally did not grow up online, and some of them struggle more to navigate the internet than some of us who are a bit younger.
I speak with some experience of this issue, because I had a constituent who was a pensioner and who was scammed out of £20,000—her life savings. Does my hon. Friend realise that it is sometimes possible to pressurise the banks into returning the money? In that particular case, I got the money back for my constituent by applying a great deal of pressure to the bank, and it is worth knowing that the banks are susceptible to a bit of publicity. That is perhaps worth bearing in mind, because it is a useful power that we have as Members of Parliament.
I thank my hon. Friend for his public service announcement. His constituent is incredibly lucky that my hon. Friend managed to act in that way and get the money back to her, because there are so many stories of people not managing to get their money back and losing their entire life savings as a result of scams. It is the case that not all those scams take place online—people can find scams in many other places—but we have the opportunity with the Bill to take action on scams that are found on the internet.
The other group I want to mention, and for whom highlighting advertising could make a positive difference, is people with learning disabilities. People with learning disabilities who use the internet may not understand the difference between adverts and search results, as the hon. Member for Worsley and Eccles South mentioned. They are a group who I would suggest are particularly susceptible to fraudulent advertising.
We are speaking a lot about search engines, but a lot of fraudulent advertising takes place on Facebook and so on. Compared with the majority of internet users, there is generally an older population on such sites, and the ability to tackle fraudulent advertising there is incredibly useful. We know that the sites can do it, because there are rules in place now around political advertising on Facebook, for example. We know that it is possible for them to take action; it is just that they have not yet taken proper action.
I am happy to support the amendments, but I am also glad that the Minister has put these measures in the Bill, because they will make a difference to so many of our constituents.
I thank the hon. Member for Aberdeen North for her latter remarks. We made an important addition to the Bill after listening to parliamentarians across the House and to the Joint Committee, which many people served on with distinction. I am delighted that we have been able to make that significant move. We have heard a lot about how fraudulent advertising can affect people terribly, particularly more vulnerable people, so that is an important addition.
Amendments 23 and 24 seek to make it clear that where the target is in the UK, people are covered. I am happy to assure the Committee that that is already covered, because the definitions at the beginning of the Bill—going back to clause 3(5)(b), on page 3—make it clear that companies are in scope, both user-to-user and search, if there is a significant number of UK users or where UK users form one of the target markets, or the only target market. Given the reference to “target markets” in the definitions, I hope that the shadow Minister will withdraw the amendment, because the matter is already covered in the Bill.
New clause 5 raises important points about the regulation of online advertising, but that is outside the purview of what the Bill is trying to achieve. The Government are going to work through the online advertising programme to tackle these sorts of issues, which are important. The shadow Minister is right to raise them, but they will be tackled holistically by the online advertising programme, and of course there are already codes of practice that apply and are overseen by the Advertising Standards Authority. Although these matters are very important and I agree with the points that she makes, there are other places where they are best addressed.
New clause 6 is about the verification process. Given that the Bill is primary legislation, we want to have the core duty to prevent fraudulent advertising in the Bill. How that is implemented in this area, as in many others, is best left to Ofcom and its codes of practice. When Ofcom publishes the codes of practice, it might consider such a duty, but we would rather leave Ofcom, as the expert regulator, with the flexibility to implement that via the codes of practice and leave the hard-edged duty in the Bill as drafted.
I absolutely agree with the points that have been made about the violence against women code of conduct. It is vital, and it would be a really important addition to the Bill. I associate myself with the shadow Minister’s comments, and am happy to stand alongside her.
I want to make a few comments about new clause 20 and some of the issues it raises. The new clause is incredibly important, and we need to take seriously the concerns that have been raised with us by the groups that advocate on behalf of children. They would not raise those concerns if they did not think the Bill was deficient in this area. They do not have spare people and cannot spend lots of time doing unnecessary things, so if they are raising concerns, those concerns matter and addressing them will make a big difference.
I want to go a little further than what the new clause says and ask the Minister about future-proofing the Bill and ensuring that technologies can be used as they evolve. I am pretty sure that everybody agrees that there should be no space where it is safe to share child sexual exploitation and abuse, whether a physical space or an online space, private messaging or a more open forum. None of those places should be a safe or legal space for it; none should enable it to happen.
My particular thought about future-proofing is about the development of technologies that are able to recognise self-generated pictures, videos, livestreams and so on that have not already been categorised, do not have a hash number and are not easy for the current technologies to find. There are lots of people out there working hard to stamp out these images and videos online, and I have faith that they are developing new technologies that are able to recognise images, videos, messages and oral communications that cannot currently be recognised.
I agree wholeheartedly with the new clause: it is important that a report be produced within six months of the Bill being passed. It would be great if the Minister would commit to thinking about whether Ofcom will be able to require companies to implement new technologies that are developed, as well as the technologies that are currently available. I am not just talking about child sexual abuse images, material or videos; I am also talking about private messaging where grooming is happening. That is a separate thing that needs to be scanned for, but it is incredibly important.
Some of the stories relayed by the shadow Minister relate to conversations and grooming that happened in advance of the self-generated material being created. If the companies on whose platforms the direct messaging was taking place had proactively scanned for grooming behaviour, those young people would potentially have been in a safer place, because it could have been stopped in advance of that self-generated material being created. Surely, that should be the aim. It is good that we can tackle this after the event—it is good that we have something—but tackling it before it happens would be incredibly important.
Online sexual exploitation is a horrific crime, and we all want to see it ended for good. I have concerns about whether new clause 20 is saying we should open up all messaging—where is the consideration of privacy when the scanning is taking place? Forgive me, I do not know much about the technology that is available to scan for that content. I am concerned that responsible users will suffer an infringement of their privacy, even when they are doing nothing of concern.
I do not know whether everybody draws the same distinction as me. For me the distinction is that, because it will be happening with proactive technology—technological means will be scanning those messages rather than humans—nobody will see the messages. Software will scan messages, and should there be anything that is illegal—should there be child sexual abuse material—that is what will be flagged and further action taken.
I am not sure whether the hon. Member for Wolverhampton North East heard during my contribution, but this technology does exist, so it is possible. Those who argue that it must undermine end-to-end encryption and so limit people’s privacy are making a false argument. The technology does exist, and I named some that is able to scan without preventing the encryption of the data. It simply scans for those images and checks them against existing databases. It would have no impact on anybody’s right to privacy.
I thank the shadow Minister for her assistance with that intervention, which was incredibly helpful. I do not have concerns that anybody will be able to access that data. The only data that will be accessible is when the proactive technology identifies something that is illegal, so nobody can see any of the messages except for the artificial intelligence. When the AI recognises that something is abuse material, at that point the Bill specifies that it will go to the National Crime Agency if it is in relation to child abuse images.
My concern is that, at the point at which the data is sent to the National Crime Agency, it will be visible to human decision makers. Will that stop parents sharing pictures of their babies in the bath? There are instances where people could get caught up in a very innocent situation that is deemed to be something more sinister by AI. However, I will take the advice of the hon. Member for Pontypridd and look into the technology.
In terms of the secondary processes that kick in after the AI has scanned the data, I assume it will be up to Ofcom and the provider to discuss what happens then. Once the AI identifies something, does it automatically get sent to the National Crime Agency, or does it go through a process of checking to ensure the AI has correctly identified something? I agree with what the Minister has reiterated on a number of occasions; if it is child sexual abuse material then I have no problem with somebody’s privacy being invaded in order for that to be taken to the relevant authorities and acted on.
I want to make one last point. The wording of new clause 20 is about a report on those proactive technologies. It is about requiring Ofcom to come up with and justify the use of those proactive technologies. To give the hon. Member for Wolverhampton North East some reassurance, it is not saying, “This will definitely happen.” I assume that Ofcom will be able to make the case—I am certain it will be able to—but it will have to justify the use of those technologies before it can require companies to deploy them.
My key point is about future-proofing: ensuring that this is not just a one-off and that, if Ofcom makes a designation about the use of proactive technologies, it is able to make a re-designation or a future designation should new proactive technologies come through, so that we can require those new technologies to be used to identify things that the current ones cannot.
I want to associate myself with the comments of the right hon. Member for Basingstoke and the hon. Member for Aberdeen North, and to explore the intersection between the work we are doing to protect children and the violence against women and girls strategy. There is one group, girls, to whom both apply. We know that they are sadly one of the most vulnerable groups for online harm and abuse, and we must do everything we can to protect them. Having a belt-and-braces approach, with a code of conduct requirement for the violence against women and girls strategy, plus implementing new clause 20 on this technology that can protect girls in particular, although not exclusively, is a positive thing. Surely the more thorough we are in our preventive approach, the better, rather than taking action only after it is too late?
I agree 100%. The case that the shadow Minister, the hon. Member for Pontypridd, made and the stories she highlighted about the shame that is felt show that we are not just talking about a one-off impact on people’s lives, but potentially years of going through those awful situations and then many years to recover, if they ever do, from the situations they have been through.
I do not think there is too much that we could do, too many codes of practice we could require or too many compliance measures we could have in place. I also agree that girls are the most vulnerable group when considering this issue, and we need to ensure that this Bill is as fit for purpose as it can be and meets the Government’s aim of trying to make the internet a safe place for children and young people. Because of the additional risks that there are for girls in particular, we need additional protections in place for girls. That is why a number of us in this room are making that case.
This has been an important debate. I think there is unanimity on the objectives we are seeking to achieve, particularly protecting children from the risk of child sexual exploitation and abuse. As we have discussed two or three times already, we cannot allow end-to-end encryption to frustrate or prevent the protection of children.
I will talk about two or three of the issues that have arisen in the course of the debate. The first is new clause 20, a proposal requiring Ofcom to put together a report. I do not think that is strictly necessary, because the Bill already imposes a requirement to identify, assess and mitigate CSEA. There is no optionality here and no need to think about it; there is already a requirement to prevent CSEA content, and Ofcom has to produce codes of practice explaining how it will do that. I think what is requested in new clause 20 is required already.
The hon. Member for Pontypridd mentioned the concern that Ofcom had first of all to prove that the CSEA risk existed. I think that might be a hangover from the previous draft of the Bill, where there was a requirement for the evidence to be “persistent and prevalent”—I think that might have been the phrase—which implied that Ofcom had to first prove that CSEA existed before it could take action against it. For exactly the reason she mentioned, namely that it imposed a requirement to prove that CSEA was present, we changed the wording in the new version. Clause 103(1), at the top of page 87, instead of “persistent and prevalent”, now states “necessary and proportionate”. Therefore, if Ofcom simply considers action necessary, without needing to prove that the content is persistent and prevalent, it can take the steps set out in that clause. The change she asks for has, therefore, already been made.
I am grateful to the Minister for that clarification.
The Government have drafted the Bill in a way that puts codes of practice at its heart, so they cannot and should not be susceptible to delay. We have heard from platforms and services that stress that the ambiguity of the requirements is causing concern. At least with a deadline for draft codes of practice, those that want to do the right thing will be able to get on with it in a timely manner.
The Age Verification Providers Association provided us with evidence in support of amendment 48 in advance of today’s sitting. The association agrees that early publication of the codes will set the pace for implementation, encouraging both the Secretary of State and Parliament to approve the codes swiftly. A case study it shared highlights delays in the system, which we fear will be replicated within the online space, too. Let me indulge Members with details of exactly how slow Ofcom’s recent record has been on delivering similar guidance required under the audio-visual media services directive.
The directive became UK law on 30 September 2020 and came into force on 1 November 2020. By 24 June 2021, Ofcom had issued a note as to which video sharing platforms were in scope. It took almost a year until, on 6 October 2021, Ofcom issued formal guidance on the measures.
In December 2021, Ofcom wrote to the verification service providers and
“signalled the beginning of a new phase of supervisory engagement”.
However, in March 2022 it announced that
“the information we collect will inform our Autumn 2022 VSP report, which intends to increase the public’s awareness of the measures platforms have in place to protect users from harm.”
There is still no indication that Ofcom intends to take enforcement action against the many VSPs that remain non-compliant with the directive. It is simply not good enough. I urge the Minister to carefully consider the aims of amendment 48 and to support it.
Labour supports the principles of clause 42. Ofcom must not drag out the process of publishing or amending the codes of practice. Labour also supports a level of transparency around the withdrawal of codes of practice, should that arise.
Labour also supports clause 43 and the principles of ensuring that Ofcom has a requirement to review its codes of practice. We do, however, have concerns over the Secretary of State’s powers in subsection (6). It is absolutely right that the Secretary of State of the day has the ability to make representations to Ofcom in order to prevent the disclosure of certain matters in the interests of national security, public safety or relations with the Government of a country outside the UK. However, I am keen to hear the Minister’s assurances about how well the Bill is drafted to prevent those powers from being used, shall we say, inappropriately. I hope he can address those concerns.
On clause 44, Ofcom should of course be able to propose minor amendments to its codes of practice. Labour does, however, have concerns about the assessment that Ofcom will have to make to ensure that the minor nature of changes will not require amendments to be laid before Parliament, as in subsection (1). As I have said previously, scrutiny must be at the heart of the Bill, so I am interested to hear from the Minister how exactly he will ensure that Ofcom is making appropriate decisions about what sorts of changes are allowed to circumvent parliamentary scrutiny. We cannot and must not get to a place where the Secretary of State, in agreeing to proposed amendments, actively prevents scrutiny from taking place. I am keen to hear assurances on that point from the Minister.
On clause 45, as I mentioned previously on amendment 65 to clause 37, as it stands, service providers would be treated as complying with their duties if they had followed the recommended measures set out in the relevant codes of practice, as set out in subsection (1). However, providers could take alternative measures to comply, as outlined in subsection (5). Labour supports the clause in principle, but we are concerned that the definition of alternative measures is too broad. I would be grateful if the Minister could elaborate on his assessment of the instances in which a service provider may seek to comply via alternative measures. Surely the codes of practice should be, for want of a better phrase, best practice. None of us want to get into a position where service providers are circumventing their duties by taking the alternative measures route.
Again, Labour supports clause 46 in principle, but we feel that the provisions in subsection (1) could go further. We know that, historically, service providers have not always been transparent and forthcoming when compelled to be so by the courts. While we understand the reasoning behind subsection (3), we have broader concerns that service providers could, in theory, point to the codes of practice as evidence of best practice. I would be grateful if the Minister could address our concerns.
We support clause 47, which establishes that the duties in respect of which Ofcom must issue a code of practice under clause 37 will apply only once the first code of practice for that duty has come into force. However, we are concerned that this could mean that different duties will apply at different times, depending on when the relevant code for a particular duty comes into force. Will the Minister explain his assessment of how that will work in practice? We have concerns that drip-feeding this information to service providers will cause further delay and confusion. In addition, will the Minister confirm how Ofcom will prioritise its codes of practice?
Lastly, we know that violence against women and girls does not receive a single mention in the Bill, which is an alarming and stark omission. Women and girls are disproportionately likely to be affected by online abuse and harassment. The Minister knows this—we all know this—and a number of us have spoken up on the issue on quite a few occasions. He also knows that online violence against women and girls is defined as including, but not limited to, intimate image abuse, online harassment, the sending of unsolicited explicit images, coercive sexting and the creation and sharing of deepfake pornography.
The Minister will also know that Carnegie UK is working with the End Violence Against Women coalition to draw up what a code of practice to tackle violence against women and girls could look like. Why has that been left out of the redraft of the Bill? What consideration has the Minister given to including a code of this nature in the Bill? If the Minister is truly committed to tackling violence against women and girls, why will he not put that on the face of the Bill?
I have a quick question about timelines because I am slightly confused about the order in which everything will happen. It is unlikely that the Bill will have been through the full parliamentary process before the summer, yet Ofcom intends to publish information and guidance by the summer, even though some things, such as the codes of practice, will not come in until after the Bill has received Royal Assent. Will the Minister give a commitment that, whether or not the Bill has gone through the whole parliamentary process, Ofcom will be able to publish before the summer?
Will Ofcom be encouraged to publish everything, whether that is guidance, information on its website or the codes of practice, at the earliest point at which they are ready? That will mean that anyone who has to apply those codes of practice or those regulations—people who will have to work within those codes, for example, or charities or other organisations that might be able to make super-complaints—will have as much information as possible, as early as possible, and will be able to prepare to fully implement their work at the earliest possible time. They will need that information in order to be able to gear up to do that.
I have three short questions for the Minister about clause 40 and the Secretary of State’s powers of direction. Am I in order to cover that?