Online Safety Bill (Fourth sitting)

(Limited Text - Ministerial Extracts only)

Committee stage & Committee Debate - 4th sitting
Thursday 26th May 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 26 May 2022
Kim Leadbeater

Q You have no concerns about that.

Stephen Almond: No.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Q Mr Almond, welcome to the Committee. Thank you for joining us this afternoon. Can I start with co-operation? You mentioned a moment ago in answer to Maria Miller that co-operation between regulators, particularly in this context the ICO and Ofcom, was going to be very important. Would you describe the co-operative work that is happening already and that you will be undertaking in the future, and comment on the role that the Digital Regulation Cooperation Forum has in facilitating that?

Stephen Almond: Thank you very much. I will start by explaining the Digital Regulation Cooperation Forum. It is a voluntary, not statutory, forum that brings together ourselves, Ofcom, the Competition and Markets Authority and the Financial Conduct Authority—some of the regulators with the greatest interest in digital regulation—to make sure that we have a coherent approach to the regulation of digital services in the interests of the public and indeed the economy.

We are brought together through our common interest. We do not require a series of duties or statutory frameworks to make us co-operate, because the case for co-operation is very, very clear. We will deliver better outcomes by working together and by joining up where our powers align. I think that is what you are seeing in practice in some of the work we have done jointly—for example, around the implementation of the children’s code alongside Ofcom’s implementation of the video-sharing platform regime. A joined-up approach to questions about, for example, how you assure the age of children online is really important. That gives me real confidence in reassuring the Committee that the ICO, Ofcom and other digital regulators will be able to take a very joined-up approach to regulating in the context of the new online safety regime.

Chris Philp

Q Thank you very much. That is extremely helpful. From the perspective of privacy, how satisfied are you that the Bill as constructed gives the appropriate protections to users’ privacy?

Stephen Almond: In our view, the Bill strikes an appropriate balance between privacy and online safety. The duties in the Bill should leave service providers in no doubt that they must comply with data protection law, and that they should guard against unwarranted intrusion of privacy. In my discourse with firms, I am very clear that this is not a trade-off between online safety and privacy: it is both. We are firmly expecting that companies take that forward and work out how they are going to adopt both a “privacy by design” and a “safety by design” approach to the delivery of their services. They must deliver both.

Chris Philp

Q Thank you. My final question is this: do you feel the Bill has been constructed in such a way that it works consistently with the data protection provisions, such as UK GDPR and the Data Protection Act 2018?

Stephen Almond: In brief, yes. We feel that the Bill has been designed to work alongside data protection law, for which we remain the statutory regulator, but with appropriate mechanisms for co-operation with the ICO—so, with this series of consultation duties where codes of practice or guidance that could be issued by Ofcom may have an impact on privacy. We think that is the best way of assuring regulatory coherence in this area.

Chris Philp

That is very helpful. Thank you very much indeed.

The Chair

Mr Almond, we are trying to get a pint into a half-pint pot doing this, so we are rushing a bit. If, when you leave the room, you have a “I wish I’d said that” moment, please feel free to put it in writing to us. We are indebted to you. Thank you very much indeed.

Examination of Witnesses

Sanjay Bhandari and Lynn Perry gave evidence.

--- Later in debate ---
Kim Leadbeater

Q Should the Bill commit to that?

Lynn Perry: As a recommendation, we think that could only strengthen the protections of children.

Chris Philp

Q Picking up that last point about representation for particular groups of users, including children: Ms Perry, do you agree that the ability to designate organisations that can make super-complaints might be an extremely valuable avenue, in particular for organisations that represent user groups such as children? Organisations such as yours could get designated and then speak on behalf of children in a formal context. You could raise super-complaints with the regulator on behalf of the children you speak for. Is that something to welcome? Would it address the point made by my colleague, Kim Leadbeater, a moment ago?

Lynn Perry: We would welcome provision to be able to bring particularly significant evidence of concern. That is certainly something that organisations, large charities in the sector and those responsible for representing the rights of children and young people would welcome. On some of these issues, we work in coalition to make representations on behalf of children and young people, as well as of parents and carers, who also raise some concerns. The ability to do that and to strengthen the response is something that would be welcomed.

Chris Philp

Q I am glad you welcome that. I have a question for both witnesses, briefly. You have commented in some detail on various aspects of the Bill, but do you feel that the Bill as a whole represents a substantial step forward in protecting children, in your case, Ms Perry, and those you speak for, Sanjay?

Sanjay Bhandari: Our beneficiaries are under-represented or minority communities in sports. I agree; I think that the Bill goes a substantial way towards protecting them and dealing with some of the issues that we saw most acutely after the Euro 2020 final.

We have to look at the Bill in context. This is revolutionary legislation, which we are not seeing anywhere else in the world. We are going first. The basic sanctions framework and the 10% fines I have seen working in other areas—anti-trust in particular. In Europe, that has a long history. The definition of harm being in the manner of dissemination will pick up pile-ons and some forms of trolling that we see a lot of. Hate crime being designated as priority illegal content is a big one for us, because it puts the proactive duty on the platforms. That too will take away quite a lot of content, we think. The new threatening communications offence we have talked about will deal with rape and death threats. Often the focus is on, quite rightly, the experience of black professional footballers, but there are also other people who play, watch and work in the game, including our female pundits and our LGBT fan groups, who also get loads of this abuse online. The harm-based offence—communications sent to cause harm without reasonable excuse—will likely cover things such as malicious tagging and other forms of trolling. I have already talked about the identification, verification and anonymity provisions.

I think that the Bill will go a substantial way. I am still interested in what fits into that residual category of content harmful to adults, but rather than enter into an arid philosophical and theoretical debate, I will take the spirit of the Bill and try to tag it to real content.

Chris Philp

Q Before I turn to Ms Perry with the same question about the Bill’s general effect, Sanjay, you mentioned the terrible abuse that the three England footballers received after the penalties last summer. Do you think the social media firms’ response to that incident was adequate, or anywhere close to adequate? If not, does that underline the need for this legislation?

Sanjay Bhandari: I do not think it was adequate because we still see stuff coming through. They have the greatest power to stop it. One thing we are interested in is improving transparency reporting. I have asked them a number of times, “Someone does not become a troll overnight, in the same way that someone does not become a heroin addict overnight, or commit an extremist act of terrorism overnight. There is a pathway where people start off, and you have that data. Can I have it?” I have lost count of the number of times that I have asked for that data. Now I want Ofcom to ask them for it.

Chris Philp

Q Yes. There are strong powers in the Bill for Ofcom to do precisely that. Ms Perry, may I ask you the same general question? Do you feel that the Bill represents a very substantial step forward in protecting children?

Lynn Perry: We do. Barnardo’s really welcomes the Bill. We think it is a unique and once-in-a-generation opportunity to achieve some really long-term changes to protect children from a range of online harms. There are some areas in which the Bill could go further, which we have talked about today. The opportunity that we see here is to make the UK the safest place in the world for children to be online. There are some very important provisions that we welcome, not least on age verification, the ability to raise issues through super-complaints, which you have asked me about, and the accountability in various places throughout the Bill.

Chris Philp

Q Thank you, Ms Perry. Finally, Mr Bhandari, some people have raised concerns about free speech. I do not share those concerns—in fact, I rebutted them in a Times article earlier this week—but does the Bill cause you any concern from a free-speech perspective?

Sanjay Bhandari: As I said earlier, there are no absolute rights. There is no absolute right to freedom of speech—I cannot shout “Fire!” here—and there is no absolute right to privacy; I cannot use my anonymity as a cloak for criminality. It is a question of drawing an appropriate balance. In my opinion, the Bill draws an appropriate balance between the right to freedom of speech and the right to privacy. I believe in both, but in the same way that I believe in motherhood and apple pie: of course I believe in them. It is really about the balancing exercise, and I think this is a sensible, pragmatic balancing exercise.

The Chair

Ms Perry, I am very pleased that we were finally able to hear from you. Thank you very much indeed—you have been very patient. Thank you very much, Mr Bhandari. If either of you, as a result of what you have heard and been asked today, have any further thoughts that you wish to submit, please do so.

Examination of Witnesses

Eva Hartshorn-Sanders and Poppy Wood gave evidence.

--- Later in debate ---
The Chair

Thank you. Minister.

Chris Philp

Q Thank you for joining us this afternoon and for giving us your evidence so far. At the beginning of your testimony, Ms Hartshorn-Sanders, I think you mentioned—I want to ensure I heard correctly—that you believe, or have evidence, that Instagram is still, even today, failing to take down 90% of inappropriate content that is flagged to it.

Eva Hartshorn-Sanders: Our “Hidden Hate” report was on DMs—direct messages—that were shared by the participants in the study. One in 15 of those broke the terms and conditions that Instagram had set out related to misogynist abuse—sexual abuse. That was in the wake of the World Cup, so after Instagram had done a big promotion about how great it was going to be in having policies on these issues going forward. We found that 90% of that content was not acted on when we reported it. This was not even them going out proactively to find the content and not doing anything with it; it was raised for their attention, using their systems.

Chris Philp

Q That clearly illustrates the problem we have. Two parts of the Bill are designed to address this: first, the ability for designated user representation groups to raise super-complaints—an issue such as the one you just mentioned, a systemic issue, could be the subject of such a super-complaint to Ofcom, in this case about Instagram—and, secondly, at clause 18, the Bill imposes duties on the platforms to have proper complaints procedures, through which they have to deal with complaints properly. Do those two provisions, the super-complaints mechanism for representative groups and clause 18 on complaints procedures, go a long way towards addressing the issue that you helpfully and rightly identified?

Eva Hartshorn-Sanders: That will depend on transparency, as Poppy mentioned. How much of that information can be shared? We are doing research at the moment on data that is shared personally, or is publicly available through the different tools that we have. So it is strengthening access to that data.

There is this information asymmetry that happens at the moment, where big tech is able to see patterns of abuse. In some cases, as in the misogyny report, you have situations where a woman might be subject to abuse from one person over and over again. The way that is treated in the EU is that Instagram will go back and look at the last 30 historically to see the pattern of abuse that exists. They are not applying that same type of rigorousness to other jurisdictions. So it is having access to it in the audits that are able to happen. Everyone should be safe online, so this should be a safety-by-design feature that the companies have.

Chris Philp

Q Meta claimed in evidence to the Committee on Tuesday that it gave researchers good access to its data. Do you think that is true?

Eva Hartshorn-Sanders: I think it depends on who the researchers are. I personally do not have experience of it, so I cannot speak to that. On transparency, at the moment, the platforms generally choose what they share. They do not necessarily give you the data that you need. You can hear from my accent that I am originally from New Zealand. I know that in the wake of the Christchurch mosque terrorist attack, they were not prepared to provide the independent regulator with data on how many New Zealanders had seen the footage of the livestream, which had gone viral globally. That is inexcusable, really.

The Chair

Q Ms Wood, do you want to comment on any of this before we move on?

Poppy Wood: On the point about access to data, I do not believe that the platforms go as far as they could, or even as far as they say they do. Meta have a tool called CrowdTangle, which they use to provide access to data for certain researchers who are privileged enough to have access. That does not even include comments on posts; it is only the posts themselves. The platforms pull the rug out all the time from under researchers who are investigating things that the platforms do not like. We saw that with Laura Edelson at New York University, who they just cut off—that is one of the most famous cases. I think it is quite egregious of Meta to say that they give lots of access to data.

We know from the revelations of whistleblowers that Meta do their own internal research, and when they do not like the results, they just bury it. They might give certain researchers access to data under certain provisions, but independent researchers who want to investigate a certain emergent harm or a certain problem are not being given the sort of access that they really need to get insights that move the needle. I am afraid that I just do not believe that at all.

The Bill could go much further. A provision on access to data in clause 136 states that Ofcom has two years to issue a report on whether researchers should get access to data. I think we know that researchers should have access to data, so I would, as a bare minimum, shorten the time that Ofcom has to do that report from two years to six months. You could turn that into a question of how to give researchers access to data rather than of whether they should get it. The Digital Services Act—the EU equivalent of the Bill—goes a bit further on access to data than our Bill. One result of that might be that researchers go to the EU to get their data because they can get it sooner.

Improving the Bill’s access to data provisions is a no-brainer. It is a good thing for the Government because we will see more stuff coming out of academia, and it is a good thing for the safety tech sector, because the more research is out there, the more tools can be built to tackle online harms. I certainly call on the Government to think about whether clause 136 could go further.

The Chair

Thank you. Last brief question, Minister.

Chris Philp

Goodness! There is a lot to ask about.

The Chair

Sorry, we are running out of time.

Chris Philp

Q I appreciate that; thank you, Sir Roger. Ms Wood, you mentioned misinformation in your earlier remarks—I say “misinformation” rather than “state-sponsored disinformation”, which is a bit different. It is very difficult to define that in statute and to have an approach that does not lead to bias or to what might be construed as censorship. Do you have any particular thoughts on how misinformation could be concretely and tangibly addressed?

Poppy Wood: It is not an easy problem to solve, for sure. What everybody is saying is that you do it in a content-neutral way, so that you are not talking about listing specific types of misinformation but about the risks that are built into your system and that need to be mitigated. This is a safety by design question. We have heard a lot about introducing more friction into the system, checking the virality threshold, and being more transparent. If you can get better on transparency, I think you will get better on misinformation.

If there is more of an obligation on the platforms to, first, do a broader risk assessment outside of the content that will be listed as priority content and, secondly, introduce some “harm reduction by design” mechanisms, through friction and stemming virality, that are not specific to certain types of misinformation, but are much more about safety by design features—if we can do that, we are part of the way there. You are not going to solve this problem straightaway, but you should have more friction in the system, be it through a code of practice or a duty somewhere to account for risk and build safer systems. It cannot be a content play; it has to be a systems play.

The Chair

Thank you. I am sorry, but that brings us to the end of the time allotted to this session. Ladies, if either of you wishes to make a submission in writing in the light of what you have not answered or not been able to answer, please do. Ms Wood, Ms Hartshorn-Sanders, thank you very much indeed for joining us.

Examination of Witnesses

Owen Meredith and Matt Rogerson gave evidence.

--- Later in debate ---
Kim Leadbeater

Q My only concern is that someone who just decides to call themselves a journalist will be able to say what they want.

Owen Meredith: I do not think that would be allowable under the Bill, because of the distinction between a recognised news publisher publishing what we would all recognise as journalistic content, versus the journalistic content exemption. I think that is why they are treated differently.

Chris Philp

Q Can I start by clarifying a comment that Owen Meredith made at the very beginning? You were commenting on where you would like the Bill to go further in protecting media organisations, and you said that you wanted there to be a wholesale exemption for recognised news publishers. I think there already is a wholesale exemption for recognised news publishers. The area where the Government have said they are looking at going further is in relation to what some people call a temporary “must carry” provision, or a mandatory right of appeal for recognised news publishers. Can I just clarify that that is what you meant?

Owen Meredith: Yes. I think the issue is how that exemption will work in practice. I think that what the Government have said they are looking at and will bring forward does address how it will operate in practice.

Chris Philp

Q Thank you. Can I move on to the question that Kim Leadbeater asked a moment ago, and that a number of Members have raised? You very kindly said a moment ago that you thought that clause 50, which sets out the definition of “recognised news publisher”, works as drafted. I would like to test that a bit, because some witnesses have said that it is quite widely drawn, and suggested that it would be relatively easy for somebody to set themselves up in a manner that met the test laid out in clause 50. Given the criticism that we have heard a few times today and on Tuesday, can you just expand for the Committee why you think that is not the case?

Owen Meredith: As I alluded to earlier, it is a real challenge to set out this legal definition in a country that believes, rightly, in the freedom of the press as a fourth pillar of democracy. It is a huge challenge to start with, and therefore we have to set out criteria that cover the vast majority of news publishers but do not end up with a backdoor licensing system for the press, which I think we are all keen to avoid. I think it meets that criterion.

On the so-called bad actors seeking to abuse that, I have listened to and read some of the evidence that you have had from others—not extensively, I must say, due to other commitments this week—and I think that it would be very hard for someone to meet all those criteria as set out in order to take advantage of this. I think that, as Matt has said, there will clearly be tests and challenges to that over time. It will rightly be challenged in court or go through the usual judicial process.

Matt Rogerson: It seems to me that the whole Bill will be an iterative process. The internet will not suddenly become safe when the Bill receives Royal Assent, so there will be this process whereby guidance and case law are developed, in terms of what a newspaper is, against the criteria. There are exemptions for news publishers in a whole range of other laws that are perfectly workable. I think that Ofcom is perfectly well equipped to create guidance that enables it to be perfectly workable.

Chris Philp

Q Thank you. So you are categorically satisfied, notwithstanding the risks that we have heard articulated, that maleficent actors would not be able to set themselves up in such a way that they benefit from this exemption.

Matt Rogerson: Subject to the guidance developed by Ofcom, which we will be engaged in developing, I do think so. The other thing to bear in mind is that the platforms already have lists of trusted publishers. For example, Google has a list in relation to Google News—I think it has about 65,000 publishers—which it automates to push through Google News as trusted news publishers. Similarly, Facebook has a list of trusted news publishers that it uses as a signal for the Facebook newsfeed. So I do not buy the idea that you can’t automate the use of trusted news sources within those products.

Chris Philp

Q Thank you; that is very helpful. I have only one other question. In relation to questions concerning freedom of speech, the Government believe, and I believe, that the Bill very powerfully protects freedom of speech. Indeed, it does so explicitly through clause 19, in addition to the protections for recognised news publishers that we have discussed already and the additional protections for content of journalistic and democratic importance, notwithstanding the definitional questions that have been raised. Would you agree that this Bill respects and protects free speech, while also delivering the safety objectives that it quite rightly has?

Owen Meredith: If I can speak to the point that directly relates to my members and those I represent, which is “Does it protect press freedom?”, which is perhaps an extension of your question, I would say that it is seeking to. Given the assurances you have given about the detailed amendments that you intend to bring forward—if those are correct, and I am very happy to write to the Committee and comment once we have seen the detail, if it would be helpful to do so—and everything I have heard about what you are intending to do, I believe it will. But I do not believe that the current draft properly and adequately protects press freedom, which is why, I think, you will be bringing forward amendments.

Chris Philp

Q Yes, but with the amendment committed to on Second Reading, you would say that the Bill does meet those freedom of speech objectives, subject to the detail.

Owen Meredith: Subject to seeing the drafting, but I believe the intention—yes.

Chris Philp

Thank you. That is very helpful. Mr Rogerson?

Matt Rogerson: As we know, this is a world first: regulation of the internet, regulation of speech acts on the internet. From a news publisher perspective, I think all the principles are right in terms of what the Government are trying to do. In terms of free speech more broadly, a lot of it will come down to how the platforms implement the Bill in practice. Only time will tell in terms of the guidance that Ofcom develops and how the platforms implement that at vast scale. That is when we will see what impact the Bill actually has in practice.

Chris Philp

Q From a general free speech perspective—which obviously includes the press’s freedom of speech, but everybody else’s as well—what do you think about the right enshrined in clause 19(2), where for the first time ever the platforms’ duty to have regard to the importance of protecting users’ right to freedom of speech is put on the face of a Bill? Do you think that is helpful? It is a legal obligation they do not currently have, but they will have it after the passage of the Bill. In relation to “legal but harmful” duties, platforms will also have an obligation to be consistent in the application of their own terms and conditions, which they do not have to be at the moment. Very often, they are not consistent; very often, they are arbitrary. Do you think those two changes will help general freedom of speech?

Matt Rogerson: Yes. With the development of the online platforms to the dominant position they are in today, that will be a big step forward. The only thing I would add is that, as well as this Bill, the other Bill that will make a massive difference when it comes through is the digital markets unit Bill. We need competition to Facebook so that consumers have a choice and so that they can decide which social network they want to be on, not just the one dominant social network that is available to them in this country.

Chris Philp

I commend your ingenuity in levering an appeal for more digital competition into this discussion. Thank you.

The Chair

One final quick question from the Opposition Front Bench.

--- Later in debate ---
Kim Leadbeater

Q I thank the witnesses for coming. In terms of regulation, I was going to ask whether you believe that Ofcom is the most suitable regulator to operate in this area. You have almost alluded to the fact that you might not. On that basis, should we specify in the Bill a duty for Ofcom to co-operate with other regulators—for example, the Competition and Markets Authority, the Financial Conduct Authority, Action Fraud or whoever else?

Tim Fassam: I believe that would be helpful. I think Ofcom is the right organisation to manage the relationship with the platforms, because it is going to be much broader than the topics we are talking about in our session, but we do think the FCA, Action Fraud and potentially the CMA should be able to direct, and be very clear with Ofcom, that action needs to be taken. Ofcom should have the ability to ask for things to be reviewed to see whether they break the rules.

The other area where we think action probably needs to be taken is where firms are under investigation, because the Bill assumes it is clear cut whether something is fraud, a scam, a breach of the regulations or not. In some circumstances, that can take six months or a year to establish through investigation. We believe that if, for example, the FCA feels that something is high risk, it should be able to ask Ofcom to suspend an advert, or a firm from advertising, pending an investigation to assess whether it is a breach of the regulation.

Rocio Concha: I agree that Ofcom is the right regulator, the main regulator, but it needs to work with the other regulators—with the FCA, ASA and CMA—to enforce the Bill effectively. There is another area. Basically, we need to make sure that Ofcom and all the regulators involved have the right resources. When the initial version of the Bill was published, Ofcom got additional resources to enable it to enforce the Bill. But the Bill has increased in scope, because now it includes fraud and fraudulent advertising. We need to make sure that Ofcom has the right resources to enforce the full Bill effectively. That is something that the Government really need to consider.

Martin Lewis: I was going to make exactly that point, but it has just been made brilliantly so I will not waste your time.

Chris Philp

Q I thank the witnesses for joining us this afternoon, and particularly Martin Lewis for his campaigning in this area.

I will start by agreeing with the point that Martin Lewis made a minute or two ago—that we cannot trust these companies to work on their own. Mr Lewis, I am not sure whether you have had a chance to go through clause 34, which we inserted into the Bill following your evidence to the Joint Committee last year. It imposes a duty on these companies to take steps and implement systems to

“prevent individuals from encountering content consisting of fraudulent advertisements”.

There is a clear duty to stop them from doing this, rather as you were asking a minute ago when you described the presentation. Does that strong requirement in clause 34, to stop individuals from encountering fraudulent advertisement content, meet the objective that you were asking for last year?

Martin Lewis: Let me start by saying that I am very grateful that you have put it in there and thankful that the Government have listened to our campaign. What I am about to say is not intended as criticism.

It is very difficult to know how this will work in practice. The issue is all about thresholds. How many scam adverts can we stomach? I still have, daily—even from the platform that I sued, never mind the others—tens of reports directly to me of scam adverts with my face on. Even though there is a promise that they will try to mitigate that, the companies are not doing it. We have to have a legitimate understanding that we are not going to have zero scam adverts on these platforms; unless they were to pre-vet, which I do not think they will, the way they operate means that will not happen.

I am not a lawyer but my concern is that the Bill should make it clear, and that any interpretation of the Bill from Ofcom should be clear, about exactly what threshold of scam adverts is acceptable—we know that they are going to happen—and what threshold is not acceptable. I do not have the expertise to answer your question; I have to rely on your expertise to do that. But I ask the Committee to think properly about what the threshold level should be.

What is and is not acceptable? What counts as “doing everything they can”? They are going to get big lawyers involved if you say there must be zero scam adverts—that is not going to happen. How many scam adverts are acceptable and how many are not? I am so sorry to throw that back as a question when I am a witness, but I do not have the expertise to answer. But that is my concern: I am not 100% convinced of the threshold level that you are setting.

The Chair

Q Mr Fassam, do you have the answer?

Tim Fassam: I think we are positive about the actions that have been taken regarding social media; our concern is that the clause is not applied to search and that it excludes paid-for ads that are also user-generated content—promoted tweets or promoted posts, for example. We would ensure that that applied to all paid-for adverts and that it was consistent between social media and search.

Chris Philp

Q Mr Fassam, I will address those two questions, if I may. Search is covered by clause 35, and user-generated content is subject to the Bill’s general provisions on user-generated content. Included in the scope of those are the priority illegal offences defined in schedule 7. Among them, on page 185—not that I expect you to have memorised the Bill—are financial services offences, including a number of offences to do with pretending to carry out regulated financial activity when in fact you are not regulated. Also included are the fraud offences—the various offences under the Fraud Act 2006. Do come back if you think I have this wrong, but I believe that we have search covered in clause 35 and promoted user-generated content covered via schedule 7, page 185.

Tim Fassam: You absolutely do, but to a weaker standard than in clause 34.

Chris Philp

Q In clause 35 there is the drafting point that we are looking at. It says “minimise the risk” instead of “prevent”. You are right to point out that drafting issue. In relation to the user-generated stuff, there is a duty on the platforms to proactively stop priority illegal content, as defined in schedule 7. I do take your drafting point on clause 35.

Tim Fassam: Thank you.

Chris Philp

Q I want to pick up on Martin Lewis’s point about enforcement. He said that he had to sue Facebook himself, which was no doubt an onerous, painful and costly enterprise—at least costly initially, because hopefully you got your expenses back. Under the Bill, enforcement will fall to Ofcom. The penalties that social media firms could be handed by Ofcom for failing to meet the duties we have discussed include a fine amounting to 10% of global revenue as a maximum, which runs into billions of pounds. Do the witnesses feel that level of sanction—10% of global revenue and ultimately denial of service—is adequately punitive? Will it provide an adequate deterrent to the social media firms that we are considering?

The Chair

Mr Lewis, as you were named, I think you had better start.

Martin Lewis: Ten per cent. of the global revenue of a major social media or search player is a lot of money—it certainly would hit them in the pocket. I reiterate my previous point: it is all about the threshold at which that comes in and how rigidly Ofcom is enforcing it. There are very few organisations that have the resources, legally, to take on big institutions of state, regulators and Governments. If any does, it is the gigantic tech firms. Absolutely, 10% of global revenue sounds like a suitable wall to prevent them jumping over. That is the aim, because we want those companies to work for people; we don’t want them to do scam ads. We want them to work well, and we want them never to be fined, because there is no reason to fine them.

The proof of the pudding will be in how robust Ofcom feels it can be, off the back of the Bill, taking those companies on. I go back to needing to understand how many scam ads you permit under the duty to prevent scam ads. It clearly is not zero—you are not going to tell me it is zero. So how many are allowed, what are the protocols that come into place and how quickly do they have to take the ads down? Ultimately, I think that is going to be a decision for Ofcom, but it is the level of stringency that you put on Ofcom in order for it to interpret how it takes that decision that is going to decide whether this works or not.

Rocio Concha: I completely agree with Martin. Ofcom needs to have the right resources in order to monitor how the platforms are doing that, and it needs to have the right powers. At the moment, Ofcom can ask for information in a number of areas, including fraud, but not advertising. We need to make sure that Ofcom can ask for that information so that it can monitor what the platforms are doing. We need to make sure that it has the right powers and the right resources to enforce the Bill effectively.

Tim Fassam: You would hope that 10% would certainly be a significant disincentive. Our focus would be on whether companies are contributing to compensating the victims of fraud and scams, and whether they have been brought into the architecture that is utilised to compensate victims of fraud and scams. That would be the right aim in terms of financial consequences for the firms.

Chris Philp

Q I have one final question that again relates to the question of reporting scams, which I think two or three witnesses have referred to. I will briefly outline the provisions in the Bill that address that. I would like to ask the witnesses if they think those provisions are adequate. First, in clause 18, the Bill imposes on large social media firms an obligation to have a proper complaints procedure so that complaints are not ignored, as appears to happen on a shockingly frequent basis. That is at the level of individual complaints. Of course, if social media firms do not do that, it will be for Ofcom to enforce against them.

Secondly, clauses 140 and 141 contain a procedure for so-called super-complaints, where a body that represents users—it could be Which? or an organisation like it—is able to bring something almost like a class action or group complaint to Ofcom if it thinks a particular social media firm has systemic problems. Will those two clauses address the issue of complaints not being properly handled or, in some cases, not being dealt with at all?

Martin Lewis: Everything helps. I think the super-complaint point is really important. We must remember that many victims of scams are not so good at complaining and, by the nature of the crossover of individuals, there is a huge mental health issue at stake with scams. There is both the impact on people with mental health issues and the impact on people’s mental health of being scammed, which means that they may not be as robust and up for the fight or for complaining. As long as it works and applies to all the different categories that are repeated here, the super-complaint status is a good measure.

We absolutely need proper reporting lines. I urge you, Minister—I am not sure that this is in the Bill—to standardise this so that we can talk about what someone should do when they report: the same imagery, the same button. With that, people will know what to do. The more we can do that, the easier and better the system will be.

Chris Philp

Q That is a really important point—you made it earlier—about the complaints process being hidden. Clause 18(2)(c) says that the complaints system must be

“easy to access, easy to use (including by children) and transparent.”

The previous paragraph (b) states that the system

“provides for appropriate action to be taken by the provider of the service in response to complaints of a relevant kind”.

The Bill is saying that a complaints process must do those two things, because if it does not, Ofcom will be on the company’s back.

Martin Lewis: I absolutely support all of that. I am just pushing for that tiny bit more leadership, whether it is from you or Ofcom, that comes up with a standardised system with standardised imagery and placing, so that everybody knows that on the top left of the advert you have the button that you click to fill in a form to report it. The more we have that cross-platform and cross-search and cross-social media, the easier it will be for people. I am not sure it is a provision for the Bill in itself, but Government leadership would work really well on that.

Tim Fassam: They are both welcome—the super-complaint and the new complaints process. We want to ensure that we have a system that looks not just at weight of number of complaints, but at the content. In particular, you may find on the super-complaint point that, for example, the firm that a fraudster is pretending to be is the organisation that has the best grasp of the issue, so do not forget about commercial organisations as well as consumer organisations when thinking about who is appropriate to make super-complaints.

Chris Philp

Q Well, your organisation, as one that represents firms in this space, could in fact be designated as a super-complainant to represent your members, as much as someone like Which? could be designated to represent the man on the street like you or me.

Tim Fassam: Absolutely. We suggested to Meta when we met them about 18 months ago that we could be a clearing house to identify for them whether they need to take something seriously, because our members have analysed it and consider it to represent a real risk.

The Chair

Last word to Rocio Concha.

Rocio Concha: I completely agree about the super-complaint. We as a consumer organisation have super-complaint powers. As with other regulators, we would like to have it in this context as well. We have done many super-complaints representing consumers in particular areas with the regulators, so I think we need it in this Bill as well.

On reporting, I want to clarify something. At the moment, the Bill does not have a requirement for users to complain and report to platforms in relation to fraudulent advertising. It happens for priority illegal content, but our assessment of the Bill is that it is unclear whether it applies to fraudulent advertising. We probably do not have time to look at this now, but we sent you amendments to where we thought the Bill had weaknesses. We agree with you that users should have an easy and transparent way to report illegal or fraudulent advertising, and they should have an easy way to complain about it. At the moment, it is not clear that the Bill will require that for fraudulent advertising.

Chris Philp

Q Yes, that is a very good question. Please do write to us about that. Clause 140, on super-complaints, refers to “regulated services”. My very quick, off-the-cuff interpretation is that that would include everything covered and regulated by the Bill. I notice that there is a reference to user-to-user services in clause 18. Do write to us on that point. We would be happy to look at it in detail. Do not take my comment as definitive, because I have only just looked at it in the last 20 seconds.

Rocio Concha: My comment was in relation not to the super-complaints but to the requirements. We already sent you our comments with suggestions on how you can fix this in the Bill.

Chris Philp

I am very grateful. Thank you.

The Chair

Ms Concha and Mr Fassam, thank you very much. Do please write in if you have further comments. Mr Lewis, we are deeply grateful to you. You can now go back to your day job and tell us whether we are going to be worse or better off as a result of the statement today—please don’t answer that now.

Martin Lewis: I am interviewing the Chancellor in 15 minutes.

--- Later in debate ---
Caroline Ansell

Thank you. That is very helpful.

Chris Philp

Q Thank you for joining us and giving evidence, Frances; it is nice to see you again. We had evidence from Meta, your former employer, on Tuesday, in which its representative suggested that it engages in open and constructive co-operation with researchers. Do you think that testimony was true?

Frances Haugen: I think that shows a commendable level of chutzpah. Researchers have been trying to get really basic datasets out of Facebook for years. When I talk about a basic dataset, it is things as simple as, “Just show us the top 10,000 links that are distributed in any given week.” When you ask for information like that in a country like the United States, no one’s privacy is violated: every one of those links will have been viewed by hundreds of thousands, if not millions of people. Facebook will not give out even basic data like that, even though hundreds if not thousands of academics have begged for this data.

The idea that they have worked in close co-operation with researchers is a farce. The only way that they are going to give us even the most basic data that we need to keep ourselves safe is if it is mandated in the Bill. We need to not wait two years after the Bill passes—and remember, it does not even say that it will happen; Ofcom might say, “Oh, maybe not.” We need to take a page from the Digital Services Act and say, “On the day that the Bill passes, we get access to data,” or, at worst, “Within three months, we are going to figure out how to do it.” It needs to be not, “Should we do it?” but “How will we do it?”

Chris Philp

Q When I was asking questions on Tuesday, the representative of Meta made a second claim that raised my eyebrow. He claimed that, in designing its algorithms, it did not primarily seek to optimise for engagement. Do you think that was true?

Frances Haugen: First, I left the company a year ago. Because we have no transparency with these companies, they do not have to publish their algorithms or the consequences of their algorithms, so who knows? Maybe they use astrology now to rank the content. We have no idea. All I know is that Meta definitely still uses signals—did users click on it, did they dwell on it, did they re-share it, or did they put a comment on it? There is no way it is not using those. It is very unlikely that they do not still use engagement in their ranking.

The secondary question is, do they optimise for engagement? Are they trying to maximise it? It is possible that they might interpret that and say, “No, we have multiple things we optimise for,” because that is true. They look at multiple metrics every single time they try to decide whether or not to shift things. But I think it is very likely that they are still trying to optimise for engagement, either as their top metric or as one of their top metrics.

Remember, Meta is not trying to optimise for engagement to keep you there as long as possible; it is optimising for engagement to get you and your friends to produce as much content as possible, because without content production, there can be no content consumption. So that is another thing. They might say, “No, we are optimising for content production, not engagement,” but that is one step off.

Chris Philp

Q The Bill contains provisions that require companies to do risk assessments that cover their algorithms, and then to be transparent about those risk assessments with Ofcom. Do you think those provisions will deliver the change required in the approach that the companies take?

Frances Haugen: I have a feeling that there is going to be a period of growing pains after the first time these risk assessments happen. I can almost entirely guarantee you that Facebook will try to give you very little. It will likely be a process of back and forth with the regulator, where you are going to have to have very specific standards for the level of transparency, because Facebook is always going to try to give you the least possible.

One of the things that I am actually quite scared about is that, in things like the Digital Services Act, penalties go up to 10% of global profits. Facebook as a company has something like 35% profit margins. One of the things I fear is that these reports may be so damning—that we have such strong opinions after we see the real, hard consequences of what they are doing—that Facebook might say, “This isn’t worth the risk. We’re just going to give you 10% of our profits.” That is one of the things I worry about: that they may just say, “Okay, now we’re 25% profitable instead of 35% profitable. We’re that ashamed.”

Chris Philp

Q Let me offer a word of reassurance on that. In this Bill, the penalties are up to 10% of global revenue, not profit. Secondly, in relation to the provision of information to Ofcom, there is personal criminal liability for named executives, with a period of incarceration of up to two years, for the reason you mentioned.

Frances Haugen: Oh, good. That’s wonderful.

Chris Philp

We had a case last year where Facebook—it was actually Facebook—failed to provide some information to the CMA in a takeover case, and it paid a £50 million fine rather than provide the information, hence the provision for personal criminal liability for failing to provide information that is now in this Bill.

My final question is a simple one. From your perspective, at the moment, when online tech companies are making product design decisions, what priority do they give to safety versus profit?

Frances Haugen: What I saw when I was at Facebook was that there was a culture that encouraged people to always have the most positive interpretation of things. If things are still the same as when I left—like I said, I do not know; I left last May—what I saw was that people routinely had to weigh little changes in growth versus changes in safety metrics, and unless they were major changes in safety metrics, they would continue to pursue growth. The only problem with a strategy like that is that those little deficits add up to very large harms over time, so we must have mandated transparency. The public have to have access to data, because unless Facebook has to add the public cost of the harm of its products, it is not going to prioritise enough those little incremental harms as they add up.

Chris Philp

Thank you very much.

The Chair

Ms Haugen, thank you very much indeed for joining us today, and thank you also for the candour with which you have answered our questions. We are very grateful to you indeed.

The Committee will meet again on Tuesday 7 June at 9.25 am for the start of its line-by-line consideration of the Bill. That session will be in Committee Room 14.

Ordered, That further consideration be now adjourned. —(Steve Double.)