Committee stage & Committee Debate - 4th sitting
Thursday 26th May 2022

Public Bill Committees
The Chair

Good afternoon, ladies and gentlemen. We are now sitting in public and the proceedings are being broadcast. Thank you all for joining us.

We will now hear oral evidence from Stephen Almond, the director of technology and innovation in the Information Commissioner’s Office. Mr Almond, thank you for coming. As I have introduced you, I am not going to ask you to introduce yourself, so we can go straight into the questions. I call the shadow Front-Bench spokesman.

Alex Davies-Jones (Pontypridd) (Lab)

Q 224 Thank you for coming to give evidence to us this afternoon, Mr Almond. There has been a lot of debate about the risk end-to-end encrypted platforms pose to online safety. What need is there to mitigate that risk in the Bill?

Stephen Almond: Let me start by saying that the ICO warmly welcomes the Bill and its mission to make the UK the safest place in the world to be online. End-to-end encryption supports the security and privacy of online communication and keeps people safe online, but the same characteristics that create a private space for the public to communicate can also provide a safe harbour for more malicious actors, and there are valid concerns that encrypted channels may be creating spaces where children are at risk.

Our view is that the Bill has the balance right. All services in scope, whether encrypted or not, must assess the level of risk that they present and take proportionate action to address it. Moreover, where Ofcom considers it necessary and proportionate, it will have the power to issue technology notices to regulated services to require them to deal with child sexual abuse and exploitation material. We think this presents a proportionate way of addressing the risk that is present on encrypted channels.

It is worth saying that I would not favour provisions that sought to introduce some form of outright ban on encryption in a generalised way. It is vital that the online safety regime does not seek to trade off one sort of online safety risk for another. Instead, I urge those advancing more fundamentalist positions around privacy or safety to move towards the question of how we can incentivise companies to develop technological innovation that will enable the detection of harmful content without compromising privacy. It is one reason why the ICO has been very pleased to support the Government’s safety tech challenge, which has really sought to incentivise the development of technological innovation in this area. Really what we would like to see is progress in that space.

Alex Davies-Jones

Q On that point around technological advances and enabling people to access the internet, people have raised concerns that tech-savvy children will be able to use VPNs, Tor Browser and other tricks to easily circumnavigate the measures that will be in the Bill, especially around age verification and user identity. How do you respond to that, and how do you suggest we close those loopholes, if we can?

Stephen Almond: First and foremost, it is incredibly important that the Bill has the appropriate flexibility to enable Ofcom as the online safety regulator to be agile in responding to technological advances and novel threats in this area. I think the question of VPNs is ultimately going to be one that Ofcom and the regulated services themselves are going to have to work through. VPNs play an important role in supporting a variety of different functions, such as the security of communications, but ultimately it is going to be critical to make sure that services are able to carry out their duties. That is going to require some questions to be asked in this area.

Alex Davies-Jones

Q One final question from me. I would like to discuss your thoughts on transparency and how we can make social media companies like Meta be more transparent and open with their data, beyond the measures we currently have in the Bill. For instance, we could create statute to allow academics or researchers in to examine their data. Do you have any thoughts on how this can be incentivised?

Stephen Almond: Transparency is a key foundation of data protection law in and of itself. As the regulator in this space, I would say that there is a significant emphasis within the data protection regime on ensuring that companies are transparent about the processing of personal data that they undertake. We think that that provides proportionate safeguards in this space. I would not recommend an amendment to the Bill on this point, because I would be keen to avoid duplication or an overlap between the regimes, but it is critical; we want companies to be very clear about how people’s personal data is being processed. It is an area that we are going to continue to scrutinise.

Mrs Maria Miller (Basingstoke) (Con)

May I ask a supplementary to that before I come on to my main question?

--- Later in debate ---
The Chair

Moving, I hope, seamlessly on, we are now going to hear oral evidence from Sanjay Bhandari, who is the chairman of Kick It Out, and—as the Committee agreed this morning—after Tuesday’s technical problems, if we do not have further technical problems, we are going to hear from Lynn Perry from Barnardo’s, again by Zoom. Is Lynn Perry on the line? [Interruption.] Lynn Perry is not on the line. We’ve got pictures; now all we need is Lynn Perry in the pictures.

I am afraid we must start, but if Lynn Perry is able to join, we will be delighted to hear from her. We have Mr Bhandari, so we will press on, because we are very short of time as it is. We hope that Lynn Perry will join us.

Alex Davies-Jones

Q Good afternoon, Mr Bhandari; thank you for joining us. What response have you as a football charity seen from the social media companies to the abuse that has been suffered by our sports players online? We all saw the horrendous abuse that our football heroes suffered during the Euros last year. What has been the reaction of the social media companies when this has been raised? Why has it not been tackled?

Sanjay Bhandari: I think you would have to ask them why it has not been tackled. My perception of their reaction is that it has been a bit like the curate’s egg: it has been good in parts and bad in parts, and maybe like the original meaning of that allegory, it is a polite way of saying something is really terrible.

Before the abuse from the Euros, actually, we convened a football online hate working group with the social media companies. They have made some helpful interventions: when I gave evidence to the Joint Committee, I talked about wanting to have greater friction in the system, and they are certainly starting to do that with things like asking people, “Do you really want to send this?” before they post something. We understand that that is having some impact, but of course, it is against the backdrop of a growing number of trolls online. Also, we have had some experiences where we make a suggestion, around verification for instance, where we are introducing third-party companies to social media companies, and very often the response we get is different between London and California. London will say “maybe”, and California then says “no”. I have no reason to distrust the people we meet locally here, but I do not think they always have the power to actually help and respond. The short answer is that there are certainly good intentions from the people we meet locally and there is some action. However, the reality is that we still see quite a lot of content coming through.

Alex Davies-Jones

Q Thank you for that. The Centre for Countering Digital Hate, which we will hear from later this afternoon, has identified that, as well as a vast amount of abuse being directed at public profiles, abuse is also sent via direct messaging, in private and sometimes on those smaller high-harm platforms. Concerns have been raised by others that this would not be covered by the Bill. Do you have any thoughts on that, and what would you like to see?

Sanjay Bhandari: I think we need to work that through. I am sorry that my colleagues from the Premier League and the Football Association could not be here today; I did speak to them earlier this week but unfortunately they have got some clashes. One thing we are talking about is how we map this new framework against existing content. We have a few hundred complaints that the Premier League investigates, and we have got a few thousand items that are proactively identified by Signify, working with us and the Professional Footballers’ Association. Our intention is to take that data and map it to the new framework and say, “Is this caught? What is caught by the new definition of harm? What is caught by priority illegal content? What is caught by the new communication offences, and what residue in that content might be harmful to adults?” We can then peg that dialogue to real-world content rather than theoretical debate. We know that a lot of complaints we receive are in relation to direct messaging, so we are going to do that exercise. It may take us a little bit of time, but we are going to do that.

The Chair

Lynn Perry is on the line, but we have lost her for the moment. I am afraid we are going to have to press on.

--- Later in debate ---
The Chair

We will hear oral evidence first from Eva Hartshorn-Sanders, who is the head of policy at the Centre for Countering Digital Hate. We shall be joined in due course by Poppy Wood. Without further ado, I call the shadow Minister.

Alex Davies-Jones

Q Thank you for joining us this afternoon. I have quoted a lot of the stats that the Centre for Countering Digital Hate has produced on online abuse directed at individuals with protected characteristics. In the previous panel, I mentioned that the vast majority is done via direct messaging, sometimes through end-to-end encryption on platforms. What are your concerns about this issue in the Bill? Does the Bill adequately account for tackling that form of abuse?

Eva Hartshorn-Sanders: That is obviously an important area. The main mechanisms to look at are the complaints pathways: ensuring that when reports are made, action is taken, and that that is included in risk assessments as well. In our “Hidden Hate” report, we found that 90% of misogynist abuse, which included quite serious sexual harassment and abuse, videos and death threats, was not acted on by Instagram, even when we used the current pathways for the complainant. This is an important area.

Alex Davies-Jones

Q Part of the issue is that the regulated service providers have to rely heavily on AI to facilitate monitoring and take down problematic content in order to comply with the Bill, but, as several stakeholders have said, algorithmic moderation is inadequate at recognising nuance and subtlety, and so cannot actively and effectively take down the content. What more would you like to see in the Bill to counteract that issue?

Eva Hartshorn-Sanders: There has to be human intervention as part of that process as well. Whatever system is in place—the relationship between Ofcom and the provider is going to vary by platform and by search provider too, possibly—if you are making those sorts of decisions, you want to have it adequately resourced. That is what we are saying is not happening at the moment, partly because there is not yet the motivation or the incentives there for them to be doing any differently. They are doing the minimum; what they say they are going to do often comes out through press releases or policies, and then it is not followed through.

Alex Davies-Jones

Q You mentioned that there is not adequate transparency and openness on how these things work. What systems would you like to see the Bill put in place to ensure the transparency, independence and accountability of Ofcom, but also the transparency and openness of the tech companies and the platforms that we are seeking to regulate?

Eva Hartshorn-Sanders: I think there is a role for independent civil society, working with the regulator, to hold those companies to account and to be accessing that data in a way that can be used to show how they are performing against their responsibilities under the Bill. I know Poppy from Reset.tech will talk to this area a bit more. We have just had a global summit on online harms and misinformation. Part of the outcome of that was looking at a framework for how we evaluate global efforts at legislation and the transparency of algorithms and rules enforcement, and the economics that are driving online harms and misinformation. That is an essential part of ensuring that we are dealing with the problems.

The Chair

May I say, for the sake of the record, that we have now been joined by Poppy Wood, the UK director of Reset.tech? Ms Wood, you are not late; we were early. We are trying to make as much use as we can of the limited time. I started with the Opposition Front Bencher. If you have any questions for Poppy Wood, go ahead.

Alex Davies-Jones

Q I do—thank you, Sir Roger. I am not sure if you managed to hear any of that interaction, Poppy. Do you have any comments to make on those points before I move on?

Poppy Wood: I did not hear your first set of questions—I apologise.

Alex Davies-Jones

That is fine. I will just ask you what you think the impact is of the decision to remove misinformation and disinformation from the scope of the Bill, particularly in relation to state actors?

Poppy Wood: Thank you very much, and thank you for having me here today. There is a big question about how this Bill tackles co-ordinated state actors—co-ordinated campaigns of disinformation and misinformation. It is a real gap in the Bill. I know you have heard from Full Fact and other groups about how the Bill can be beefed up for mis- and disinformation. There is the advisory committee, but I think that is pretty weak, really. The Bill is sort of saying that disinformation is a question that we need to explore down the line, but we all know that it is a really live issue that needs to be tackled now.

First of all, I would make sure that civil society organisations are on that committee and that its report is brought forward in months, not years, but then I would say there is just a real gap about co-ordinated inauthentic behaviour, which is not referenced. We are seeing a lot of it live with everything that is going on with Russia and Ukraine, but it has been going on for years. I would certainly encourage the Government to think about how we account for some of the risks that the platforms promote around co-ordinated inauthentic behaviour, particularly with regard to disinformation and misinformation.

Alex Davies-Jones

Q We have heard a lot from other witnesses about the ability of Ofcom to regulate the smaller high-risk platforms. What is your view on that?

Poppy Wood: Absolutely, and I agree with what was said earlier, particularly by groups such as HOPE not hate and Antisemitism Policy Trust. There are a few ways to do this, I suppose. As we are saying, at the moment the small but high-risk platforms just are not really caught in the current categorisation of platforms. Of course, the categories are not even defined in the Bill; we know there are going to be categories, but we do not know what they will be.

I suppose there are different ways to do this. One is to go back to where this Bill started, which was not to have categories of companies at all but to have a proportionality regime, where depending on your size and your functionality you had to account for your risk profile, and it was not set by Ofcom or the Government. The problem of having very prescriptive categories—category 1, category 2A, category 2B—is, of course, that it becomes a race to the bottom in getting out of these regulations without having to comply with the most onerous ones, which of course are category 1.

There is also a real question about search. I do not know how they have wriggled out of this, but it was one of the biggest surprises in the latest version of the Bill that search had been given its own category without many obligations around adult harm. I think that really should be revisited. All the examples that were given earlier today are absolutely the sort of thing we should be worrying about. If someone can google a tractor in their workplace and end up looking at a dark part of the web, there is a problem with search, and I think we should be thinking about those sorts of things. Apologies for the example, but it is a really, really live one and it is a really good thing to think about how search promotes these kinds of content.

Mrs Miller

Q I want to touch on something we have not talked about a lot today, which is enforcement and the enforcement powers in the Bill. There are significant enforcement powers in the Bill, but do our two witnesses here think those enforcement powers are enough? Eva?

Eva Hartshorn-Sanders: Are you specifically asking about the takedown notices and the takedown powers?

--- Later in debate ---
Alex Davies-Jones

Q Good afternoon, both, and thank you for coming this afternoon. We have heard a lot about the journalistic content exemption. What is your view of the current measures in the Bill and their likely consequences?

Owen Meredith: You may be aware that we submitted evidence to the Joint Committee that did prelegislative scrutiny of the draft Bill, because although the Government’s stated intention is to have content from recognised news media publishers, whom I represent, outside the scope of the Bill, we do not believe that the drafting, as it was and still is, achieves that. Ministers and the Secretary of State have confirmed, both in public appearances and on Second Reading, that they wish to table further amendments to achieve the aim that the Government have set out, which is to ensure that content from recognised news publishers is fully out of scope of the Bill. It needs to go further, but I understand that there will be amendments coming before you at some point to achieve that.

Alex Davies-Jones

Q What further would you like to see?

Owen Meredith: I would like to see a full exemption for recognised news publisher content.

Alex Davies-Jones

Q You would like to see a full exemption. Matt, do you have any thoughts on that?

Matt Rogerson: Yes. I would step back a bit and point to the evidence that a few of your witnesses gave today and Tuesday. I think Fair Vote gave evidence on this point. At the moment, our concern is that we do not know what the legal but harmful category of content that will be included in the Bill will look like. That is clearly going to be done after the event, through codes of practice. There is definitely a danger that news publisher content gets caught by the platforms imposing that. The reason for having a news publisher exemption is to enable users of platforms such as Facebook, Twitter and others to access the same news as they would via search. I agree with Owen’s point. I think the Government are going in the right direction with the exemption for broadcasters such as the BBC, The Times and The Guardian, but we would like to see it strengthened a bit to ensure a cast-iron protection.

Alex Davies-Jones

Q Currently, is the definition of journalistic content used in the Bill clear, or do you find it ambiguous?

Matt Rogerson: I think it is quite difficult for platforms to interpret that. It is a relatively narrow version of what journalism is—it is narrower than the article 10 description of what journalism is. The legal definitions of journalism in the Official Secrets Act and the Information Commissioner’s Office journalism code are slightly more expansive and cover not just media organisations but acts of journalism. Gavin Millar has put together a paper for Index on Censorship, in which he talks about that potentially being a way to expand the definition slightly.

The challenge for the platforms is, first, that they have to take account of journalistic content, and there is not a firm view of what they should do with it. Secondly, defining what a piece of journalism or an act of journalism is takes a judge, generally with a lot of experience. Legal cases involving the media are heard through a specific bench of judges—the media and communications division—and they opine on what is and is not an act of journalism. There is a real challenge, which is that you are asking the platforms to—one assumes—use machine learning tools to start with to identify what is a potential act of journalism. Then an individual, whether they are based in California or, more likely, outsourced via an Accenture call centre, determines within that whether it is an act of journalism and what to do with it. That does place quite a lot of responsibility on the platforms. Again, I would come back to the fact that I think if the Bill were stripped back to focus on illegal content, rather than legal but harmful content, you would have fewer of these situations where there was concern that that sort of content was going to be caught.

Alex Davies-Jones

Q We have heard a lot of concern about disinformation by state actors purporting to be journalists and using that exemption, which could cause harm. Do you have any thoughts on that?

Matt Rogerson: Yes, a few. The first thing that is missing from the Bill is a focus on advertising. The reason we should focus on advertising is that that is why a lot of people get involved in misinformation. Ad networks at the moment are able to channel money to “unknown” sites in ways that mean that disinformation or misinformation is highly profitable. For example, a million dollars was spent via Google’s ad exchanges in the US; the second biggest recipient of that million dollars was “Unknown sites”—sites that do not categorise themselves as doing anything of any purpose. You can see how the online advertising market is channelling cash to the sort of sites that you are talking about.

In terms of state actors, and how they relate to the definition, the definition is set out quite broadly in the Bill, and it is more lengthy than the definition in the Crime and Courts Act 2013. On top of that definition, Ofcom would produce guidance, which is subject to a full and open public consultation, which would then work out how you are going to apply the definition in practice. Even once you have that guidance in place, there will be a period of case law developing where people will appeal to be inside of that exemption and people will be thrown out of that exemption. Between the platforms and Ofcom, you will get that iteration of case law developing. So I suppose I am slightly more confident that the exemption would work in practice and that Ofcom could find a workable way of making sure that bad actors do not make use of it.

The Chair

Mr Meredith, do you wish to add to that?

Owen Meredith: No, I would echo almost entirely what Matt has said on that. I know you are conscious of time.

--- Later in debate ---
The Chair

One final quick question from the Opposition Front Bench.

Alex Davies-Jones

Q Mr Rogerson, you mentioned that platforms and tech companies currently have a list of approved broadcasters that they are enabled to use, to ensure they have that content. Isn’t it true that one of those broadcasters was Russia Today, and it was only because Ofcom intervened to remove it from social media that it was taken down, but under the current provisions in this Bill, Ofcom would not be able to do that and Russia Today would be allowed to spread disinformation on social media platforms?

Matt Rogerson: On the Russia Today problem, I think Russia Today had a licence from Ofcom, so the platforms probably took their cue from the fact that Russia Today was beamed into British homes via Freeview. Once that changed, the position of having their content available on social media changed as well. Ultimately, if it was allowed to go via broadcast, if it had a broadcast licence, I would imagine that social media companies took that as meaning that it was a—

Alex Davies-Jones

Q But under the new Bill, as journalistic content, it would be allowed to remain on those social media platforms.

Matt Rogerson: I think that would be subject to the guidance that Ofcom creates and the consultation on that guidance. I do not believe that Russia Today would be allowed under the definitions. If it is helpful, I could write to you to set out why.

--- Later in debate ---
The Chair

We will now hear from Tim Fassam, the director of government relations and policy at PIMFA, the Personal Investment Management & Financial Advice Association, and from Rocio Concha, director of policy and advocacy at Which? We will be joined by Martin Lewis, of MoneySavingExpert, in due course. Thank you to the witnesses for joining us. I call the Opposition Front Bench.

Alex Davies-Jones

Q Thank you for joining us this afternoon. As a constituency MP, I am sure I am not alone in saying that a vast amount of my casework comes from members of my community writing to me to say that they have been scammed online, that they have been subject to fraud and that they feel horrendous about it. They feel shame and they do not know what to do about it. It is the single biggest crime in the UK, with victims losing an estimated £2.3 billion. In your opinion, does the Bill go far enough to tackle that?

Rocio Concha: This Bill is very important in tackling fraud. It is very important for Which? We were very pleased when fraud was included to tackle the issue that you mentioned and also when paid-for advertising was included. It was a very important step, and it is a very good Bill, so we commend DCMS for producing it.

However, we have found some weakness in the Bill, and those can be solved with very simple amendments, which will have a big impact on the Bill in terms of achieving its objective. For example, at the moment in the Bill, search engines such as Google and Yahoo! are not subject to the same duties in terms of protecting consumers from fraudulent advertising as social media platforms are. There is no reason for Google and Yahoo! to have weaker duties in the Bill, so we need to solve that.

The second area is booster content. Booster content is user-generated content, but it is also advertising. In the current definition of fraudulent advertising in the Bill, booster content is not covered. For example, if a criminal makes a Facebook page and starts publishing things about fake investments, and then he pays Facebook to boost that content in order to reach more people, the Bill, at the moment, does not cover that fraudulent advertising.

The last part is that, at the moment, the risk checks that platforms need to do for priority illegal content, the transparency reporting that they need to do to basically say, “We are finding this illegal content and this is what we are doing about it,” and the requirement to have a way for users to tell them about illegal content or complain about something that they are not doing to tackle this, only apply to priority illegal content. They do not apply to fraudulent advertising, but we think they need to.

Paid-for advertising is the most expensive way that criminals have to reach out to a lot of people. The good news, as I said before, is that this can be solved with very simple amendments to the Bill. We will send you suggestions for those amendments and, if we fix the problem, we think the Bill will really achieve its objective.

The Chair

One moment—I think we have been joined by Martin Lewis on audio. I hope you can hear us, Mr Lewis. You are not late; we started early. I will bring you in as soon as we have you on video, preferably, but otherwise on audio.

Tim Fassam: I would echo everything my colleague from Which? has said. The industry, consumer groups and the financial services regulators are largely in agreement. We were delighted to see fraudulent advertising and wider issues of economic crime included in the Bill when they were not in the initial draft. We would also support all the amendments that Which? are putting forward, especially the equality between search and social media.

Our members compiled a dossier of examples of fraudulent activity, and the overwhelming majority of the examples of fraudulent adverts were on search, rather than social media. We would also argue that search is potentially higher risk, because the act of searching is an indication that you may be ready to take action. If you are searching “invest my pension”, hopefully you will come across Martin’s site or one of our members’ sites, but if you come across a fraudulent advert in that moment, you are more likely to fall foul of it.

We would also highlight two other areas where we think the Bill needs further work. These are predominantly linked to the interaction between Ofcom, the police and the Financial Conduct Authority, because the definitions of fraudulent adverts and fraudulent behaviour are technical and complex. It is not reasonable to expect Ofcom to be able to ascertain whether an advert or piece of content is in breach of the Financial Services and Markets Act 2000; that is the FCA’s day job. Is it fraud? That is Action Fraud’s and the police’s day job. We would therefore suggest that the Bill go as far as allowing the police and the FCA to direct Ofcom to have content removed, and creating an MOU that enables Ofcom to refer things to the FCA and the police for their expert analysis of whether it breaches those definitions of fraudulent adverts or fraudulent activity.

Alex Davies-Jones

Q Thank you, both. You mentioned that search is a concern, especially because it is currently out of scope of the Bill in terms of this issue. Another issue is that when people do use search to look for a financial service or something that they wish to purchase, the cookies are remembered. The algorithms on social media platforms are then triggered to promote specific adverts to them as a result of that search history or things they have mentioned via voice control to their home help devices. That is a concern. Digital advertising that you see on third-party websites is also not within scope. That has been raised as well. Do you have any thoughts on those points?

Rocio Concha: Yes. Open-display advertising is not part of the Bill. That also needs to be tackled. I think the online advertising programme should be considered, to tackle this issue. I agree with you: this is a very important step in the right direction, and it will make a huge difference if we fix this small weakness in terms of the current scope. However, there are still areas out there that need to be tackled.

The Chair

Mr Lewis, I am living in hope that we may be able to see you soon—although that may be a forlorn hope. However, I am hoping that you can hear us. Do you want to come in and comment at all at this point? [Interruption.] Oh, we have got you on the screen. Thank you very much for joining us.

Martin Lewis: Hurrah. I am so sorry, everybody—for obvious reasons, it has been quite a busy day on other issues for me, so you’ll forgive me.

The Chair

I can’t think why it has been.

Martin Lewis: I certainly agree with the other two witnesses. Those three issues are all very important to be brought in. From a wider perspective, I was vociferously campaigning to have scam adverts brought within the scope of the Online Safety Bill. I am delighted that that has happened, but let us be honest among ourselves: it is far from a panacea.

Adverts and scams come in so many places—on social media, in search engines and in display advertising, which is very common and is not covered. While I accept that the online advertising programme will address that, if I had my way I would be bringing it all into the Online Safety Bill. However, the realpolitik is that that is not going to happen, so we have to have the support in the OAP coming later.

It is also worth mentioning just for context that, although I think there is little that we can do about this—or it would take brighter people than me—one of the biggest routes for scams is email. Everybody is being emailed—often with my face, which is deeply frustrating. We have flaccid policing of what is going on on social media, and I hope the Bill will improve it, but at least there is some policing, even though it is flaccid, and it is the same on search engines. There is nothing on email, so whatever we do in this Bill, it will not stop scams reaching people. There are many things that would improve that, certainly including far better resourcing for policing so that people who scam individuals get at least arrested and possibly even punished and sentenced. Of course, that does not happen at the moment, because scamming is a crime that you can undertake with near impunity.

There is a lot that needs to be done to make the situation work, but in general the moves in the Online Safety Bill to include scam advertising are positive. I would like to see search engines and display advertising brought into that. I absolutely support the call for the FCA to be involved, because what is and is not a scam can certainly be complicated. There are more obvious ones and less obvious ones. We saw that with the sale of bonds at 5% or 6%, which pretend to be deposit bonds but are nothing of the sort. That might get a bit more difficult for Ofcom, and it would be great to see the regulator involved. I support all the calls of the other witnesses, but we need to be honest with ourselves: even if we do all that, we are still a long way from seeing the back of all scam adverts and all scams.

Alex Davies-Jones

Q Thank you, Mr Lewis. My final question is not necessarily about financial services advertising. With the rise of influencer culture, specifically on social media platforms such as TikTok and Instagram, we are seeing a failure to disclose adverts correctly and the potential for harmful advertising. Slimming products, for example, that are not particularly safe, especially for children, are being targeted at children. What more would you like to see this Bill do to tackle some of that? I know the ASA has taken action against some prolific offenders, but what more would you like to see in this Bill to tackle that and keep children safe from adverts that are not marked as such?

Rocio Concha: To be honest, in this area we do not have any specific proposals. I completely agree with you that this is an area that needs to be tackled, but I do not have a specific proposal for this Bill.

Tim Fassam: This is an area that we have raised with the Financial Conduct Authority—particularly the trend for financial advice on TikTok and adverts for non-traditional investments, such as whisky barrels or wine, which do not meet the standards required by the FCA for other investment products. That is also true of a number of cryptocurrency adverts and formats. We have been working with the FCA to try to identify ways to introduce more consistency in the application of the rules. There has been a welcome expansion by the Treasury on the promotion of high-risk investments, which is now a regulated activity in and of itself.

I go back to my initial point. We do not believe that there is any circumstance in which the FCA would want content in any place taken down where that content should not be removed, because they are the experts in identifying consumer harm in this space.

Alex Davies-Jones

Q Mr Lewis, do you have anything to add?

Martin Lewis: I still believe that most of this comes down to an issue of policing. The rules are there and are not being enforced strongly enough. The people who have to enforce the rules are not resourced well enough to do that. Therefore, you get people who are able to work around the rules with impunity.

Advertising in the UK, especially online, has been the wild west for a very long time, and it will continue to be so for quite a while. The Advertising Standards Authority is actually better at dealing with the influencer issue, because of course it is primarily strong at dealing with people who listen to the Advertising Standards Authority. It is not very good at dealing with criminal scammers based outside the European Union, who frankly cannot be bothered and will not reply—they are not going to stop—but it is better at dealing with influencers who have a reputation.

We all know it is still extremely fast and loose out there. We need to adequately resource it; putting rules and laws in place is only one step. Resourcing the policing and the execution of those rules and laws is a secondary step, and I have doubts that we will ever quite get there, because resources are always squeezed and put on the back burner.

The Chair

Thank you. Do I have any questions from Government Back Benchers? No. Does anyone have any further questions?

Alex Davies-Jones

Yes, I do. If nobody else has questions, I will have another bite of the cherry.

The Chair

The Minister is going to come in in a minute.

Alex Davies-Jones

Q I would just like to query your thoughts on a right to redress for victims. Do you think that having an ombudsman in the Bill would be appropriate, and what would you like to see to support victims of fraud?

Martin Lewis: As you will know, I had to sue Facebook for defamation, which is a ridiculous thing to do in order to stop scam adverts. I was unable to report the scam adverts to the police, because I had not been scammed—even though it was my face that was in them—and many victims were not willing to come forward. That is a rather bizarre situation, and we got Facebook to put forward £3 million to set up Citizens Advice Scam Action—that is what I settled for, as well as a scam ad reporting tool.

There are two levels here. The problem is who is at fault. Of course, those mainly at fault for scams are the scammers. They are criminals and should be prosecuted, but not enough of them are. You have times when it is the bank’s fault. If a company has not put proper precautions in place, and people have got scammed because it has put up adverts or posts that it should have prevented, they absolutely need to have some responsibility for that. I think you will struggle to have a direct redress system put in place. I would like to see it, but it would be difficult.

It is rather interesting to me that I am worried that the £3 million for Citizens Advice Scam Action, which was at least meant to provide help and support for victims of scams, is going to run out. I have not seen any more money coming from Facebook, Google or any of the other big players out there. If we are not going to fund direct redress, we could at least make sure that they fund a collective form of redress and help for the victims of scams, as a bare minimum. It is very strange that these firms go so quiet on this, and what they say is, “We are doing everything we can.”

From my meetings with these firms—these are meetings with lawyers in the room, so I have to be slightly careful—one of the things that I would warn the Committee about is that they tend to get you in and give you a presentation on all the technological reasons why they cannot stop scam adverts. My answer to them after about 30 seconds, having stopped what was meant to be an hour-long presentation, is, “I have not framed the fact that you need a technological solution. I have said you need a solution. If the answer to stopping scam adverts, and to stopping scams, is that you have to pre-vet every single advert, as old-fashioned media did, and that every advert that you put up has to have been vetted by a human being, so be it. You’re making it a function of technology, but let’s be honest: this is a function of profitability.” We have to look at the profitability of these companies when it comes to redress. What your job is—if you forgive me saying this—is to make sure that it costs them more money to let people be scammed than it does to stop people being scammed. If we solve that, we will have a lot fewer scams on social media and on the search advertising.

Rocio Concha: I completely agree with everything that Martin says. At the moment, the provisions in the Bill for “priority illegal content” require the platforms to publish reports that say, “This is how much illegal content we are seeing on the platform, and these are the measures that we are going to take.” They are also required to have a way for users to report it and to complain when they think that the platforms are not doing the right thing. At the moment, that does not apply to fraudulent advertising, so you have an opportunity to fix that in the Bill very easily, to at least get the transparency out there. The platform has to say, “We are finding this”—that puts pressure on the platform, because it is there and is also with the regulator—“and these are the measures that we are taking.” That gives us transparency to say, “Are these measures enough?” There should also be an easy way for the user to complain when they think that platforms are not doing the right thing. It is a complex question, but there are many things in the Bill that you can improve in order to improve the situation.

Tim Fassam: I wonder if it would be useful to give the Committee a case study. Members may be familiar with London Capital & Finance. Now, London Capital & Finance is one of the most significant recent scams. It sold mini-bonds fraudulently, at a very high advertised return, which then collapsed, with individuals losing all their money.

Those individuals were compensated through two vehicles. One was a Government Bill; so, they were compensated by the taxpayer. The others, because they were found to have been given financial advice despite LCF not having advice permissions or operating through a regulated product, went on to the Financial Services Compensation Scheme, which, among others, our members pay for; legitimate financial services companies pay for it. The most recent estimate is over £650 million. The expectation is that that will reach £1 billion at some point over the next few years, in terms of cost to the economy.

LCF was heavily driven by online advertising, and we would argue that the online platforms were in fact probably the only people who could have stopped it happening. They have profited from those adverts and they have not contributed anything to either of those two schemes. We would argue—possibly not for this Bill—that serious consideration should be given to the tech platforms being part of the financial services compensation scheme architecture and contributing to the costs of scams that individuals have fallen foul of, as an additional incentive for them to get on top of this problem.

Martin Lewis: That is a very important point, but I will just pick up on what Rocio was saying. One of the things that I would like to see is much more rigid requirements for how the reporting of scams is put in place, because I cannot see proper pre-vetting happening with these technology companies, but we can at least rely on social policing and the reporting of scams.

However, I also think this is a wonderful opportunity to make sure that the method, the language and the symbols used for reporting scams are universal in the UK, so that whatever site you are on, if you see an advert you click the same symbol and the process works in a very similar way. That would mean you can report a scam the same way on every site, which makes it simpler; we can train people in how to do it, and we can make the processes work.

Then, of course, we have to make sure that they act on the back of reports. But the sheer variety of reporting routes, the complexity and the number of clicks needed mean that it is generally a lot easier to click on an advert than it is to report an advert as a scam. With so many scams out there, I think there should be parity of ease between those two things.

Caroline Ansell

Q May I ask, directly related to that, about the complaints procedure? What would you like to see in terms of changes there, to make it more unified, more universal and simpler? It has been suggested that it is not robust enough, not dynamic enough and not fast enough.

Rocio Concha: That is about complaints from users. At the moment, the Bill does not provide for this for fraudulent advertising, so we need to make sure that there is a requirement for the platforms to have an easy tool for people to complain and to report when they see something that is fraudulent. At the moment, the Bill does not do that. It is an easy fix; you can do it. Then the user will have that tool. It would also give transparency to the regulator and to organisations such as ours, to see what is happening and what measures the platforms are taking.

Tim Fassam: I would agree with that. I would also highlight a particular problem that our members have flagged, and we have flagged directly with Meta and Instagram. Within the definition in the Bill of individuals who can raise concern about social media platforms, our members find they fall between two stools, because quite often what is happening is that people are claiming an association with a legitimate firm. So they will have a firm’s logo, or a firm’s web address, in their profile for their social media and then they will not directly claim to be a financial adviser but imply an association with a legitimate financial advice firm. This happens surprisingly frequently.

Our members find it incredibly difficult to get those accounts taken down, because it is not, strictly, a fraudulent account: the individual is not pretending to be someone else, and the firm is not the party being impersonated. The individual is not directly claiming to be an employee; they could just say they are a fan of the company. And the firm is not a direct victim of that individual. What happens is that when they report it, it goes into a volume algorithm, and only if a very large number of complaints are made does that particular account get taken down. I think that could be expanded to include complaints from those affected by the account, rather than only from those the account is believed to be impersonating.

--- Later in debate ---
The Chair

We now have Frances Haugen, a former Facebook employee. Thank you for joining us.

Alex Davies-Jones

Q Good afternoon, Frances. Thank you for joining us.

Frances Haugen: Thank you so much for inviting me.

Alex Davies-Jones

No problem. Could you give us a brief overview of how, in your opinion, platforms such as Meta will be able to respond to the Bill if it is enacted in its current form?

Frances Haugen: There are going to be some pretty strong challenges in implementing the Bill as it is currently written. I want to be really honest with you about the limitations of artificial intelligence. We call it artificial intelligence, but people who actually build these systems call it machine learning, because it is not actually intelligent. One of the major limitations in the Bill is that there are carve-outs, such as “content of democratic importance”, that computers will not be able to distinguish. That might have very serious implications. If the computers cannot differentiate between whether something is or is not hate speech, imagine a concept even more ambiguous that requires even more context, such as defining what is of democratic importance. If we have carve-outs like that, it may actually prevent the platforms from doing any content moderation, because they will never know whether a piece of content is safe or not safe.

Alex Davies-Jones

Q You have just answered my question on AI and algorithmic intention. When I questioned Meta in Tuesday’s oral evidence session, they were unable to tell me how many human moderators they had directly working for them and how many had abided by a UK standard and code of conduct. Do you see the lack of human moderators being a problem as the Bill is enacted by platforms such as Meta?

Frances Haugen: I think it is unacceptable that large corporations such as this do not answer very basic questions. I guarantee you that they know exactly how many moderators they have hired—they have dashboards to track these numbers. The fact that they do not disclose those numbers shows why we need to pass laws to have mandatory accountability. The role of moderators is vital, especially for things like people questioning judgment decisions. Remember, no AI system is going to be perfect, and one of the major ways people can have accountability is to be able to complain and say, “This was inaccurately judged by a computer.” We need to ensure that there is always enough staffing and that moderators can play an active role in this process.

Alex Davies-Jones

Q One final question from me, because I know others will want to come in. How do you think platforms such as Meta—I know we have used Meta as an example, but there are others—can be incentivised, beyond the statutory duty that we are currently imposing, to publish their data to allow academics and researchers into their platforms to examine exactly what is going on? Or is this the only way?

Frances Haugen: All industries that live in democratic societies must live within democratic processes, so I do believe that it is absolutely essential that we the public, through our democratic representatives like yourself, have mandatory transparency. The only two other paths I currently see towards getting any transparency out of Meta, because Meta has demonstrated that it does not want to give even the slightest slivers of data—for example, how many moderators there are—are via ESG, so we can threaten them with divestment by saying, “Prosocial companies are transparent with their data,” and via litigation. In the United States, sometimes we can get data out of these companies through the discovery process. If we want consistent and guaranteed access to data, we must put it in the Bill, because those two routes are probabilistic—we cannot ensure that we will get a steady, consistent flow of data, which is what we need to have these systems live within a democratic process.

Mrs Miller

Q Turning to the issue of child safety and online abuse with images involving children, what should be added to or removed from the Bill to improve how it protects children online? Have you got any thoughts on that? Some groups have described the Bill’s content as overly broad. Would you make any comments on how effective it will be in terms of online safety for children?

Frances Haugen: I am not well versed on the exact provisions in the Bill regarding child safety. What I can say is that one of the most important things that we need to have in there is transparency around how the platforms in general keep children under the age of 13 off their systems—transparency on those processes—because we know that Facebook is doing an inadequate job. That is the single biggest lever in terms of child safety.

I have talked to researchers at places like Oxford and they talk about how, with social media, one of the critical windows is when children transition through puberty, because they are more sensitive on issues, they do not have great judgment yet and their lives are changing in really profound ways. Having mandatory transparency on what platforms are doing to keep kids off their platforms, and the ability to push for stronger interventions, is vital, because keeping kids off them until they are at least 13, if not 16, is probably the biggest single thing we can do to move the ball down the field for child safety.