The Committee consisted of the following Members:
Chairs: † Sir Roger Gale, Christina Rees
† Ansell, Caroline (Eastbourne) (Con)
Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
Fletcher, Nick (Don Valley) (Con)
Holden, Mr Richard (North West Durham) (Con)
Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Mrs Maria (Basingstoke) (Con)
† Mishra, Navendu (Stockport) (Lab)
Moore, Damien (Southport) (Con)
Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
Russell, Dean (Watford) (Con)
Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Witnesses
Stephen Almond, Director of Technology and Innovation, Information Commissioner’s Office
Sanjay Bhandari, Chair, Kick It Out
Eva Hartshorn-Sanders, Head of Policy, Center for Countering Digital Hate
Poppy Wood, UK Director, Reset.tech
Owen Meredith, Chief Executive, News Media Association
Matt Rogerson, Director of Public Policy, Guardian Media Group
Tim Fassam, Director of Government Relations and Policy, Personal Investment Management and Financial Advice Association
Rocio Concha, Director of Policy and Advocacy, Which?
Martin Lewis CBE, Founder, MoneySavingExpert.com
Frances Haugen
Public Bill Committee
Thursday 26 May 2022
(Afternoon)
[Sir Roger Gale in the Chair]
Online Safety Bill
14:00
The Committee deliberated in private.
Examination of Witness
Stephen Almond gave evidence.
14:02
The Chair

Good afternoon, ladies and gentlemen. We are now sitting in public and the proceedings are being broadcast. Thank you all for joining us.

We will now hear oral evidence from Stephen Almond, the director of technology and innovation in the Information Commissioner’s Office. Mr Almond, thank you for coming. As I have introduced you, I am not going to ask you to introduce yourself, so we can go straight into the questions. I call the shadow Front-Bench spokesman.

Alex Davies-Jones (Pontypridd) (Lab)

Q 224 Thank you for coming to give evidence to us this afternoon, Mr Almond. There has been a lot of debate about the risk end-to-end encrypted platforms pose to online safety. What need is there to mitigate that risk in the Bill?

Stephen Almond: Let me start by saying that the ICO warmly welcomes the Bill and its mission to make the UK the safest place in the world to be online. End-to-end encryption supports the security and privacy of online communication and keeps people safe online, but the same characteristics that create a private space for the public to communicate can also provide a safe harbour for more malicious actors, and there are valid concerns that encrypted channels may be creating spaces where children are at risk.

Our view is that the Bill has the balance right. All services in scope, whether encrypted or not, must assess the level of risk that they present and take proportionate action to address it. Moreover, where Ofcom considers it necessary and proportionate, it will have the power to issue technology notices to regulated services to require them to deal with child sexual abuse and exploitation material. We think this presents a proportionate way of addressing the risk that is present on encrypted channels.

It is worth saying that I would not favour provisions that sought to introduce some form of outright ban on encryption in a generalised way. It is vital that the online safety regime does not seek to trade off one sort of online safety risk for another. Instead, I urge those advancing more fundamentalist positions around privacy or safety to move towards the question of how we can incentivise companies to develop technological innovation that will enable the detection of harmful content without compromising privacy. It is one reason why the ICO has been very pleased to support the Government’s safety tech challenge, which has really sought to incentivise the development of technological innovation in this area. Really what we would like to see is progress in that space.

Alex Davies-Jones

Q On that point around technological advances and enabling people to access the internet, people have raised concerns that tech-savvy children will be able to use VPNs, Tor Browser and other tricks to easily circumnavigate the measures that will be in the Bill, especially around age verification and user identity. How do you respond to that, and how do you suggest we close those loopholes, if we can?

Stephen Almond: First and foremost, it is incredibly important that the Bill has the appropriate flexibility to enable Ofcom as the online safety regulator to be agile in responding to technological advances and novel threats in this area. I think the question of VPNs is ultimately going to be one that Ofcom and the regulated services themselves are going to have to work around. VPNs play an important role in supporting a variety of different functions, such as the security of communications, but ultimately it is going to be critical to make sure that services are able to carry out their duties. That is going to require some questions to be asked in this area.

Alex Davies-Jones

Q One final question from me. I would like to discuss your thoughts on transparency and how we can make social media companies like Meta be more transparent and open with their data, beyond the measures we currently have in the Bill. For instance, we could create statute to allow academics or researchers in to examine their data. Do you have any thoughts on how this can be incentivised?

Stephen Almond: Transparency is a key foundation of data protection law in and of itself. As the regulator in this space, I would say that there is a significant emphasis within the data protection regime on ensuring that companies are transparent about the processing of personal data that they undertake. We think that that provides proportionate safeguards in this space. I would not recommend an amendment to the Bill on this point, because I would be keen to avoid duplication or an overlap between the regimes, but it is critical; we want companies to be very clear about how people’s personal data is being processed. It is an area that we are going to continue to scrutinise.

Mrs Maria Miller (Basingstoke) (Con)

May I ask a supplementary to that before I come on to my main question?

The Chair

Absolutely.

Mrs Miller

Q Thank you so much for coming along. You spoke in your initial comments to my colleague about encryption. The challenges of encryption around child abuse images have been raised with us previously. How can we balance the need to allow people to have encrypted options, if possible, with the need to ensure that this does not adversely affect organisations such as the Internet Watch Foundation, which does so much good in protecting children and rooting out child abuse imagery?

Stephen Almond: I share your concern about this. To go back to what I was saying before, I think the approach that is set out in the Bill is proportionate and targeted. The granting of, ultimately, backstop powers to Ofcom to issue technology notices and to require services to deal with this horrendous material will have a significant impact. I think this will ensure that the regime operates in a risk-based way, where risks can be identified. There will be the firm expectation on service providers to take action, and that will require them to think about all the potential technological solutions that are available to them, be they content scanning or alternative ways of meeting their safety duties.

Mrs Miller

Q My main question is about child safety, which is a prime objective for the Government in this legislation. Do you feel that the Bill’s definition of “likely to be accessed by children” should be more closely aligned with the one used in the ICO’s age-appropriate design code?

Stephen Almond: The objectives of both the Online Safety Bill and the children’s code are firmly aligned in respect of protecting children online. We have reviewed the definitions and, from our perspective, there are distinctions between the definition applied in the Bill and the one in the children’s code, but we find no significant tension between them. My focus at the ICO, working in co-operation with Ofcom, will ultimately be on ensuring that there is clarity for business on how the definitions apply to their services, and that organisations know when they are in scope of the children’s code and what actions they should take.

Mrs Miller

Q Do you think any further aspects of the age-appropriate design code should be incorporated into the Bill?

Stephen Almond: We are not seeking to incorporate further aspects of the code into the Bill. We think it is important that the regimes fit together coherently, but that that is best achieved through regulatory co-operation between the ICO and Ofcom. The incorporation of the children’s code would risk creating some form of regulatory overlap and confusion.

I can give you a strong assurance that we have a good track record of working closely with Ofcom in this area. Last year, the children’s code came into force, and not too long after it, Ofcom’s video-sharing platform regime came into force. We have worked very closely to make sure that those regimes are introduced in a harmonised way and that people understand how they fit together.

Mrs Miller

Q Working closely with Ofcom is really good, but do you think there needs to be a duty to co-operate with Ofcom, or indeed with other regulators—to be specified in the Bill—in case relations become more tense in future?

Stephen Almond: The Bill has, in my view, been designed to work closely alongside data protection law. It supports effective co-operation between us and Ofcom by requiring and setting out a series of duties for Ofcom to consult with the ICO on the development of any codes of practice or formal guidance with an impact on privacy. With that framework in mind, I do not think there is a case to instil further co-operation duties in that way. I hope I can give you confidence that we and Ofcom will be working tirelessly together to promote the safety and privacy of citizens online. It is firmly in our interests and in the interest of society as a whole to do so.

Kim Leadbeater (Batley and Spen) (Lab)

Q Thank you for joining us, Mr Almond. You stated the aim of making the UK the

“safest place in the world to be online”.

In your view, what needs to be added or taken away from the Bill to achieve that?

Stephen Almond: I am not best placed to comment on the questions of online safety and online harms. You will speak to a variety of different experts who can comment on that point. From my perspective as a digital regulator, one of the most important things will be ensuring that the Bill is responsive to future challenges. The digital world is rapidly evolving, and we cannot necessarily envisage all the developments in technology that will come, or the emergence of new harms. The data protection regime is a principles-based piece of legislation. That gives us a great degree of flexibility and discretion to adapt to novel forms of technology and to provide appropriate guidance as challenges emerge. I really recommend retaining that risk-based, principles-based approach to regulation that is envisaged currently in the Online Safety Bill.

Kim Leadbeater

Q There has been much talk about trying to future-proof the Bill. Is there anything you could recommend that should be in the Bill to try to help with that?

Stephen Almond: Again, I would say that the most important thing I can recommend around this is to retain that flexibility within the Bill. I know that a temptation will emerge to offer prescription, whether for the purpose of giving companies clarity today or for addressing present harms, but it is going to be really important to make sure that there is due flexibility to enable the legislation to be responsive to future harms.

Kim Leadbeater

Q Under clause 40, the Secretary of State can modify codes of practice to reflect public policy. How do you respond to criticism that this provision risks undermining the independence of the regulator?

Stephen Almond: Ultimately, it is for Ofcom to raise any concerns about the impact of the regime, as set out by its ability to apply its duties appropriately, independently and with due accountability to Parliament and the public. As a regulator, I would say that it is important to have a proper and proportionate degree of independence, so that businesses and the public can have trust in how regulation is carried out. Ultimately though, it is for Government and Parliament to determine what the right level of independence is.

Kim Leadbeater

Q You have no concerns about that.

Stephen Almond: No.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Q Mr Almond, welcome to the Committee. Thank you for joining us this afternoon. Can I start with co-operation? You mentioned a moment ago in answer to Maria Miller that co-operation between regulators, particularly in this context the ICO and Ofcom, was going to be very important. Would you describe the co-operative work that is happening already and that you will be undertaking in the future, and comment on the role that the Digital Regulation Cooperation Forum has in facilitating that?

Stephen Almond: Thank you very much. I will start by explaining the Digital Regulation Cooperation Forum. It is a voluntary, not statutory, forum that brings together ourselves, Ofcom, the Competition and Markets Authority and the Financial Conduct Authority—some of the regulators with the greatest interest in digital regulation—to make sure that we have a coherent approach to the regulation of digital services in the interests of the public and indeed the economy.

We are brought together through our common interest. We do not require a series of duties or statutory frameworks to make us co-operate, because the case for co-operation is very, very clear. We will deliver better outcomes by working together and by joining up where our powers align. I think that is what you are seeing in practice in some of the work we have done jointly—for example, around the implementation of the children’s code alongside Ofcom’s implementation of the video-sharing platform regime. A joined-up approach to questions about, for example, how you assure the age of children online is really important. That gives me real confidence in reassuring the Committee that the ICO, Ofcom and other digital regulators will be able to take a very joined-up approach to regulating in the context of the new online safety regime.

Chris Philp

Q Thank you very much. That is extremely helpful. From the perspective of privacy, how satisfied are you that the Bill as constructed gives the appropriate protections to users’ privacy?

Stephen Almond: In our view, the Bill strikes an appropriate balance between privacy and online safety. The duties in the Bill should leave service providers in no doubt that they must comply with data protection law, and that they should guard against unwarranted intrusion of privacy. In my discourse with firms, I am very clear that this is not a trade-off between online safety and privacy: it is both. We are firmly expecting that companies take that forward and work out how they are going to adopt both a “privacy by design” and a “safety by design” approach to the delivery of their services. They must deliver both.

Chris Philp

Q Thank you. My final question is this: do you feel the Bill has been constructed in such a way that it works consistently with the data protection provisions, such as UK GDPR and the Data Protection Act 2018?

Stephen Almond: In brief, yes. We feel that the Bill has been designed to work alongside data protection law, for which we remain the statutory regulator, but with appropriate mechanisms for co-operation with the ICO—so, with this series of consultation duties where codes of practice or guidance that could be issued by Ofcom may have an impact on privacy. We think that is the best way of assuring regulatory coherence in this area.

Chris Philp

That is very helpful. Thank you very much indeed.

The Chair

Mr Almond, we are trying to get a pint into a half-pint pot doing this, so we are rushing a bit. If, when you leave the room, you have a “I wish I’d said that” moment, please feel free to put it in writing to us. We are indebted to you. Thank you very much indeed.

Examination of Witnesses

Sanjay Bhandari and Lynn Perry gave evidence.

14:22
The Chair

Moving, I hope, seamlessly on, we are now going to hear oral evidence from Sanjay Bhandari, who is the chairman of Kick It Out, and—as the Committee agreed this morning—after Tuesday’s technical problems, if we do not have further technical problems, we are going to hear from Lynn Perry from Barnardo’s, again by Zoom. Is Lynn Perry on the line? [Interruption.] Lynn Perry is not on the line. We’ve got pictures; now all we need is Lynn Perry in the pictures.

I am afraid we must start, but if Lynn Perry is able to join, we will be delighted to hear from her. We have Mr Bhandari, so we will press on, because we are very short of time as it is. We hope that Lynn Perry will join us.

Alex Davies-Jones

Q Good afternoon, Mr Bhandari; thank you for joining us. What response have you as a football charity seen from the social media companies to the abuse that has been suffered by our sports players online? We all saw the horrendous abuse that our football heroes suffered during the Euros last year. What has been the reaction of the social media companies when this has been raised? Why has it not been tackled?

Sanjay Bhandari: I think you would have to ask them why it has not been tackled. My perception of their reaction is that it has been a bit like the curate’s egg: it has been good in parts and bad in parts, and maybe like the original meaning of that allegory, it is a polite way of saying something is really terrible.

Before the abuse from the Euros, actually, we convened a football online hate working group with the social media companies. They have made some helpful interventions: when I gave evidence to the Joint Committee, I talked about wanting to have greater friction in the system, and they are certainly starting to do that with things like asking people, “Do you really want to send this?” before they post something. We understand that that is having some impact, but of course, it is against the backdrop of a growing number of trolls online. Also, we have had some experiences where we make a suggestion, around verification for instance, where we are introducing third-party companies to social media companies, and very often the response we get is different between London and California. London will say “maybe”, and California then says “no”. I have no reason to distrust the people we meet locally here, but I do not think they always have the power to actually help and respond. The short answer is that there are certainly good intentions from the people we meet locally and there is some action. However, the reality is that we still see quite a lot of content coming through.

Alex Davies-Jones

Q Thank you for that. The Centre for Countering Digital Hate, which we will hear from later this afternoon, has identified that, as well as the vast majority of abuse being directed at public profiles, abuse is also sent via direct messaging, in private and sometimes on those smaller high-harm platforms. Concerns have been raised by others that this would not be covered by the Bill. Do you have any thoughts on that, and what would you like to see?

Sanjay Bhandari: I think we need to work that through. I am sorry that my colleagues from the Premier League and the Football Association could not be here today; I did speak to them earlier this week but unfortunately they have got some clashes. One thing we are talking about is how we tag this new framework to existing content. We have a few hundred complaints that the Premier League investigates, and we have got a few thousand items that are proactively identified by Signify, working with us and the Professional Footballers’ Association. Our intention is to take that data and map it to the new framework and say, “Is this caught? What is caught by the new definition of harm? What is caught by priority illegal content? What is caught by the new communication offences, and what residue in that content might be harmful to adults?” We can then peg that dialogue to real-world content rather than theoretical debate. We know that a lot of complaints we receive are in relation to direct messaging, so we are going to do that exercise. It may take us a little bit of time, but we are going to do that.

The Chair

Lynn Perry is on the line, but we have lost her for the moment. I am afraid we are going to have to press on.

Mrs Miller

Q I want to focus on one particular issue, which is anonymity. Kick It Out has done so much with the FA to raise awareness of that issue. I was interested in your views on how the Bill treats that. The Bill mentions anonymity and pseudonymity, but it does so only once. Should the Bill take a clearer stance on online anonymity? Do you have any views on whether people should be able to use the internet fully anonymously, or should they disclose their identity to the platform? Do you have any thoughts on that? You have done a huge amount of work on it.

Sanjay Bhandari: There is quite a lot in that question. In terms of whether people should be fully anonymous or not, it depends on what you mean by fully. I am a lawyer, so I have 30 years specialising in the grey, rather than in the black and white. It really does depend on what you mean by fully. In my experience, nothing is absolute. There is no absolute right to freedom of speech; I cannot come in here and shout “Fire!” and make you all panic. There is also no absolute right to anonymity; I cannot use my anonymity online as a cloak to commit fraud. Everything is qualified. It is a question of what is the balance of those qualifications and what those qualifications should be, in the particular context of the problem that we are seeking to address.

The question in this context is around the fact that anonymity online is actually very important in some contexts. If you are gay in a country where that is illegal, being anonymous is a fantastic way to be able to connect with people like you. In a country that has a more oppressive regime, anonymity is another link to the outside world. The point of the Bill is to try to get the balance so that anonymity is not abused. For example, when a football player misses a penalty in a cup final, the point of the Bill is that you cannot create a burner account and instantly send them a message racially abusing them and then delete the account—because that is what happens now. The point of the Bill, which we are certainly happy with in general terms, is to draw a balance in the way that identity verification must be offered as an option, and to give users more power over who they interact with, including whether they wish to engage only with verified accounts.

We will come back and look in more detail at whether we would like more amendments, and we will also work with other organisations. I know that my colleague Stephen Kinsella of Clean up the Internet has been looking at those anonymity provisions and at whether verification should be defined and someone’s status visible on the face of the platforms, for instance. I hope that answers those two or three questions.

Mrs Miller

That is very helpful; thank you.

The Chair

I saw you nodding, Ms Perry. Do you wish to add anything?

Lynn Perry: I agree. The important thing, particularly from the perspective of Barnardo’s as a children’s charity, is the right of children to remain safe and protected online and in no way compromised by privacy or anonymity considerations online. I was nodding along at certain points to endorse the need to ensure that the right balance is struck for protections for those who might be most vulnerable.

Kirsty Blackman (Aberdeen North) (SNP)

Q Lynn, does the Bill ensure that children are kept as safe as possible online? If not, what improvements need to be made to it so that they are?

Lynn Perry: There are several things that we welcome as a children’s charity. One of them, age verification, has just been mentioned. We are particularly concerned and have written about children’s access to harmful and extreme pornography—they are sometimes only a couple of clicks away from harmful online commercial pornography—and we welcome the age-verification measures in the Bill. However, we are concerned about the length of time that it may take to implement those measures, during which children and young people will remain at risk and exposed to content that is potentially harmful to their development. We would welcome measures to strengthen that and to compel those companies to implement the measures earlier. If there were a commencement date for that, those provisions could be strengthened.

Kirsty Blackman

Q How much of an impact will the Bill have on the likelihood of children being subjected to online grooming and predatory behaviour?

Lynn Perry: There are some contextual considerations that we have been reflecting on as a charity, influenced by what we have heard from children, young people, parents and carers. We know that more children have had access to digital devices and have spent more time online over the last couple of years in particular. In that sense, we are concerned that the Bill needs to be strengthened because of the volume of access, the age at which children and young people now access digital content, and the amount of time that they spend online.

There are some other contextual things in respect of grooming. We welcome the fact that offences are named on the face of the Bill, for example, but one of the things that is not currently included is the criminal exploitation of children. We think that there is another opportunity to name criminal exploitation, where young people are often targeted by organised criminal gangs. We have seen more grooming of that type during the pandemic period as offenders have changed the ways in which they seek to engage young people. That is another area that we would welcome some consideration of.

Kirsty Blackman

Q In terms of online gaming, and predators moving children from more mainstream to less regulated platforms, do you think there are improvements in the Bill that relate to that, or do you think more can be done?

Lynn Perry: Grooming does happen within gaming, and we know that online video games offer some user-to-user interaction. Users sometimes have the ability to create content within platforms, which is in scope for the Bill. The important thing will be enforcement and compliance in relation to those provisions. We work with lots of children and young people who have been sexually exploited and abused, and who have had contact through gaming sites. It is crucial that this area is in focus from the perspective of building in, by design, safety measures that stop perpetrators being able to communicate directly with children.

Private messaging is another area for focus. We also consider it important for Ofcom to have regulatory powers to compel firms to use technology that could identify child abuse and grooming.

Kim Leadbeater

Q If I could address one question to each witness, that would be fantastic. I do a lot of work with women in sport, including football. Obviously, we have the Women’s Euros coming up, and I have my Panini sticker album at the ready. Do you think the Bill could do more to address the pervasive issue of online threats of violence and abuse against women and girls, including those directed at women in sport, be they players, officials or journalists?

Sanjay Bhandari: I can see that there is something specific in the communications offences and that first limb around threatening communications, which will cover a lot of the things we see directed at female football pundits, like rape threats. It looks as though it would come under that. With our colleagues in other civil society organisations, particularly Carnegie UK Trust, we are looking at whether more should be done specifically about tackling misogyny and violence against women and girls. It is something that we are looking at, and we will also work with our colleagues in other organisations.

The Chair

Q Ms Perry, do you want to add anything to that?

Lynn Perry: When we were looking at children and young people’s access to harmful pornographic content, one thing we were particularly concerned about related to seeing extreme harmful and violent content, often perpetrated towards women. In respect of younger children, violence against women and girls and gender-based violence considerations, it is something that we are concerned about in that context.

Kim Leadbeater

Q Do you have any thoughts on the Bill committing to a statutory user advocacy body representing the interests of children? If you do, how do you think that that could be funded?

Lynn Perry: I am sorry—that was a question about advocacy, I think.

Kim Leadbeater

Yes, the idea of having a statutory user advocacy body that would represent the interests of children. This is something that has been talked about. Is that something you have any thoughts about?

Lynn Perry: We certainly have a lot of representation from children and young people directly. Last year, we worked with more than 380,000 children and young people. We think that advocacy and representation on behalf of children and young people can be used to powerful effect. Making sure that the voices of children and young people, their views, wishes and experiences, are heard and influence legislation that could safeguard and protect them effectively is something that we are supportive of.

Kim Leadbeater

Q Should the Bill commit to that?

Lynn Perry: As a recommendation, we think that could only strengthen the protections of children.

Chris Philp

Q Picking up that last point about representation for particular groups of users, including children, Ms Perry, do you agree that the ability to designate organisations that can make super-complaints might be an extremely valuable avenue, in particular for organisations that represent user groups such as children? Organisations such as yours could get designated and then speak on behalf of children in a formal context. You could raise super-complaints with the regulator on behalf of the children you speak for. Is that something to welcome? Would it address the point made by my colleague, Kim Leadbeater, a moment ago?

Lynn Perry: We would welcome provision to be able to bring particularly significant evidence of concern. That is certainly something that organisations, large charities in the sector and those responsible for representing the rights of children and young people would welcome. On some of these issues, we work in coalition to make representations on behalf of children and young people, as well as of parents and carers, who also raise some concerns. The ability to do that and to strengthen the response is something that would be welcomed.

Chris Philp

Q I am glad you welcome that. I have a question for both witnesses, briefly. You have commented in some detail on various aspects of the Bill, but do you feel that the Bill as a whole represents a substantial step forward in protecting children, in your case, Ms Perry, and those you speak for, Sanjay?

Sanjay Bhandari: Our beneficiaries are under-represented or minority communities in sports. I agree, I think that the Bill goes a substantial way to protecting them and to dealing with some of the issues that we saw most acutely after the Euro 2020 finals.

We have to look at the Bill in context. This is revolutionary legislation, which we are not seeing anywhere else in the world. We are going first. The basic sanctions framework and the 10% fines I have seen working in other areas—anti-trust in particular. In Europe, that has a long history. The definition of harm being in the manner of dissemination will pick up pile-ons and some forms of trolling that we see a lot of. Hate crime being designated as priority illegal content is a big one for us, because it puts the proactive duty on the platforms. That too will take away quite a lot of content, we think. The new threatening communications offence we have talked about will deal with rape and death threats. Often the focus is on, quite rightly, the experience of black professional footballers, but there are also other people who play, watch and work in the game, including our female pundits and our LGBT fan groups, who also get loads of this abuse online. The harm-based offence—communications sent to cause harm without reasonable excuse—will likely cover things such as malicious tagging and other forms of trolling. I have already talked about the identification, verification and anonymity provisions.

I think that the Bill will go a substantial way. I am still interested in what fits into that residual category of content harmful to adults, but rather than enter into an arid philosophical and theoretical debate, I will take the spirit of the Bill and try to tag it to real content.

Chris Philp

Q Before I turn to Ms Perry with the same question about the Bill’s general effect, Sanjay, you mentioned the terrible incidence of abuse that the three England footballers got after the penalties last summer. Do you think the social media firms’ response to that incident was adequate, or anywhere close to adequate? If not, does that underline the need for this legislation?

Sanjay Bhandari: I do not think it was adequate because we still see stuff coming through. They have the greatest power to stop it. One thing we are interested in is improving transparency reporting. I have asked them a number of times, “Someone does not become a troll overnight, in the same way that someone does not become a heroin addict overnight, or commit an extremist act of terrorism overnight. There is a pathway where people start off, and you have that data. Can I have it?” I have lost count of the number of times that I have asked for that data. Now I want Ofcom to ask them for it.

Chris Philp

Q Yes. There are strong powers in the Bill for Ofcom to do precisely that. Ms Perry, may I ask you the same general question? Do you feel that the Bill represents a very substantial step forward in protecting children?

Lynn Perry: We do. Barnardo’s really welcomes the Bill. We think it is a unique and once-in-a-generation opportunity to achieve some really long-term changes to protect children from a range of online harms. There are some areas in which the Bill could go further, which we have talked about today. The opportunity that we see here is to make the UK the safest place in the world for children to be online. There are some very important provisions that we welcome, not least on age verification, the ability to raise issues through super-complaints, which you have asked me about, and the accountability in various places throughout the Bill.

Chris Philp

Q Thank you, Ms Perry. Finally, Mr Bhandari, some people have raised concerns about free speech. I do not share those concerns—in fact, I rebutted them in a Times article earlier this week—but does the Bill cause you any concern from a free-speech perspective?

Sanjay Bhandari: As I said earlier, there are no absolute rights. There is no absolute right to freedom of speech—I cannot shout “Fire!” here—and there is no absolute right to privacy; I cannot use my anonymity as a cloak for criminality. It is a question of drawing an appropriate balance. In my opinion, the Bill draws an appropriate balance between the right to freedom of speech and the right to privacy. I believe in both, but in the same way that I believe in motherhood and apple pie: of course I believe in them. It is really about the balancing exercise, and I think this is a sensible, pragmatic balancing exercise.

The Chair

Ms Perry, I am very pleased that we were finally able to hear from you. Thank you very much indeed—you have been very patient. Thank you very much, Mr Bhandari. If either of you, as a result of what you have heard and been asked today, have any further thoughts that you wish to submit, please do so.

Examination of Witnesses

Eva Hartshorn-Sanders and Poppy Wood gave evidence.

14:48
The Chair

We will hear oral evidence first from Eva Hartshorn-Sanders, who is the head of policy at the Centre for Countering Digital Hate. We shall be joined in due course by Poppy Wood. Without further ado, I call the shadow Minister.

Alex Davies-Jones

Q Thank you for joining us this afternoon. I have quoted a lot of the stats that the Centre for Countering Digital Hate has produced on online abuse directed at individuals with protected characteristics. In the previous panel, I mentioned that the vast majority is done via direct messaging, sometimes through end-to-end encryption on platforms. What are your concerns about this issue in the Bill? Does the Bill adequately account for tackling that form of abuse?

Eva Hartshorn-Sanders: That is obviously an important area. The main mechanisms to look at are the complaints pathways: ensuring that when reports are made, action is taken, and that that is included in risk assessments as well. In our “Hidden Hate” report, we found that 90% of misogynist abuse, which included quite serious sexual harassment and abuse, videos and death threats, was not acted on by Instagram, even when we used the current pathways for the complainant. This is an important area.

Alex Davies-Jones

Q Part of the issue is that the regulated service providers have to rely heavily on AI to facilitate monitoring and take down problematic content in order to comply with the Bill, but, as several stakeholders have said, algorithmic moderation is inadequate at recognising nuance and subtlety, and so at actively and effectively taking down content. What more would you like to see in the Bill to counteract that issue?

Eva Hartshorn-Sanders: There has to be human intervention as part of that process as well. Whatever system is in place—the relationship between Ofcom and the provider is going to vary by platform and by search provider too, possibly—if you are making those sorts of decisions, you want to have it adequately resourced. That is what we are saying is not happening at the moment, partly because there is not yet the motivation or the incentives there for them to be doing any differently. They are doing the minimum; what they say they are going to do often comes out through press releases or policies, and then it is not followed through.

Alex Davies-Jones

Q You mentioned that there is not adequate transparency and openness on how these things work. What systems would you like to see the Bill put the place to ensure the transparency, independence and accountability of Ofcom, but also the transparency and openness of the tech companies and the platforms that we are seeking to regulate?

Eva Hartshorn-Sanders: I think there is a role for independent civil society, working with the regulator, to hold those companies to account and to be accessing that data in a way that can be used to show how they are performing against their responsibilities under the Bill. I know Poppy from Reset.tech will talk to this area a bit more. We have just had a global summit on online harms and misinformation. Part of the outcome of that was looking at a framework for how we evaluate global efforts at legislation and the transparency of algorithms and rules enforcement, and the economics that are driving online harms and misinformation. That is an essential part of ensuring that we are dealing with the problems.

The Chair

May I say, for the sake of the record, that we have now been joined by Poppy Wood, the UK director of Reset.tech? Ms Wood, you are not late; we were early. We are trying to make as much use as we can of the limited time. I started with the Opposition Front Bencher. If you have any questions for Poppy Wood, go ahead.

Alex Davies-Jones

Q I do—thank you, Sir Roger. I am not sure if you managed to hear any of that interaction, Poppy. Do you have any comments to make on those points before I move on?

Poppy Wood: I did not hear your first set of questions—I apologise.

Alex Davies-Jones

That is fine. I will just ask you what you think the impact is of the decision to remove misinformation and disinformation from the scope of the Bill, particularly in relation to state actors?

Poppy Wood: Thank you very much, and thank you for having me here today. There is a big question about how this Bill tackles co-ordinated state actors—co-ordinated campaigns of disinformation and misinformation. It is a real gap in the Bill. I know you have heard from Full Fact and other groups about how the Bill can be beefed up for mis- and disinformation. There is the advisory committee, but I think that is pretty weak, really. The Bill is sort of saying that disinformation is a question that we need to explore down the line, but we all know that it is a really live issue that needs to be tackled now.

First of all, I would make sure that civil society organisations are on that committee and that its report is brought forward in months, not years, but then I would say there is just a real gap about co-ordinated inauthentic behaviour, which is not referenced. We are seeing a lot of it live with everything that is going on with Russia and Ukraine, but it has been going on for years. I would certainly encourage the Government to think about how we account for some of the risks that the platforms promote around co-ordinated inauthentic behaviour, particularly with regard to disinformation and misinformation.

Alex Davies-Jones

Q We have heard a lot from other witnesses about the ability of Ofcom to regulate the smaller high-risk platforms. What is your view on that?

Poppy Wood: Absolutely, and I agree with what was said earlier, particularly by groups such as HOPE not hate and Antisemitism Policy Trust. There are a few ways to do this, I suppose. As we are saying, at the moment the small but high-risk platforms just are not really caught in the current categorisation of platforms. Of course, the categories are not even defined in the Bill; we know there are going to be categories, but we do not know what they will be.

I suppose there are different ways to do this. One is to go back to where this Bill started, which was not to have categories of companies at all but to have a proportionality regime, where depending on your size and your functionality you had to account for your risk profile, and it was not set by Ofcom or the Government. The problem of having very prescriptive categories—category 1, category 2A, category 2B—is, of course, that it becomes a race to the bottom in getting out of these regulations without having to comply with the most onerous ones, which of course are category 1.

There is also a real question about search. I do not know how they have wriggled out of this, but it was one of the biggest surprises in the latest version of the Bill that search had been given its own category without many obligations around adult harm. I think that really should be revisited. All the examples that were given earlier today are absolutely the sort of thing we should be worrying about. If someone can google a tractor in their workplace and end up looking at a dark part of the web, there is a problem with search, and I think we should be thinking about those sorts of things. Apologies for the example, but it is a really, really live one and it is a really good thing to think about how search promotes these kinds of content.

Mrs Miller

Q I want to touch on something we have not talked about a lot today, which is enforcement and the enforcement powers in the Bill. There are significant enforcement powers in the Bill, but do our two witnesses think those enforcement powers are enough? Eva?

Eva Hartshorn-Sanders: Are you specifically asking about the takedown notices and the takedown powers?

Mrs Miller

No, I am talking about director liability and the enforcement on companies.

Eva Hartshorn-Sanders: Right. I think the responsibility on both companies and senior executives is a really critical part of this legislative package. You see how adding liability alongside financial penalties works in health and safety legislation and corporate manslaughter provisions to motivate changes not only within company culture but in the work that they are doing and what they factor into the decisions they make. It is a critical part of this Bill.

Mrs Miller

Q Is there more that could or should be added to the Bill?

Eva Hartshorn-Sanders: I think it is a good start. I would want to have another look at it to say more. There is a review after two years, as set out in clause 149, so there could be a factor that gets added into that, as well.

Mrs Miller

Poppy, do you have anything to add?

Poppy Wood: Yes. I think we could go much further on enforcement. One of the things that I really worry about is that if the platforms make an inadequate risk assessment, there is not much that Ofcom can do about it. I would really like to see powers for Ofcom to say, “Okay, your risk assessment hasn’t met the expectations that we put on you, so we want you to redo it. And while you’re redoing it, we may want to put you into a different category, because we may want to have higher expectations of you.” That way, you cannot start a process where you intentionally make an inadequate risk assessment in order to extend the process of you being properly regulated. I think that is one thing.

Then, going back to the point about categorisation, I think that Ofcom should be given the power to recategorise companies quickly. If you think that a category 2B company should be a category 1 company, what powers are there for Ofcom to do that? I do not believe that there are any for Ofcom to do that, certainly not to do it quickly, and when we are talking about small but high-risk companies, that is absolutely the sort of thing that Ofcom should be able to do—to say, “Okay, you are now acting like a category 1 company.” TikTok, Snapchat—they all started really small and they accelerated their growth in ways that we just could not have predicted. When we are talking about the emergence of new platforms, we need to have a regulator that can account for the scale and the pace at which these platforms grow. I think that is a place where I would really like to see Ofcom focusing.

Kirsty Blackman

Q I have a question for the Centre for Countering Digital Hate. I raised some of your stats on reporting with Meta—Facebook—when they were here, such as the number of reports that are responded to. They basically said, “This is not true any more; we’re now great”—I am paraphrasing, obviously. Could you please let us know whether the reporting mechanism on major platforms—particularly Facebook—is now completely fixed, or whether there are still lots of issues with it?

Eva Hartshorn-Sanders: There are still lots of issues with it. We recently put a report out on anti-Muslim hatred and found that 90% of the content that was reported was not acted on. That was collectively, across the platforms, so it was not just Facebook. Facebook was in the mid-90s, I think, in terms of its failure to act on that type of harmful content. There are absolutely still issues with it, and this regulation—this law—is absolutely necessary to drive change and the investment that needs to go into it.

Kirsty Blackman

Q I have a quick question for Poppy, although I am afraid it might not have a quick answer. How much of an impact does the algorithmic categorisation of things—the way we are fed things on social media—have on our lives? Do you think it is steering people towards more and more extreme content? Or is it a totally capitalist thing that is not harmful, and just something that sells us things every so often?

Poppy Wood: I think it goes without saying that the algorithmic promotion of harmful content is one of the biggest issues with the model we have in big tech today. It is not the individual pieces of content in themselves that are harmful. It is the scale over which they spread out—the amplification of them; the targeting; the bombardment.

If I see one piece of flat-earth content, that does not necessarily harm me; I probably have other counter-narratives that I can explore. What we see online, though, is that if you engage with that one piece of flat-earth content, you are quickly recommended something else—“You like this, so you’ll probably like that”—and then, before you know it, you are in a QAnon conspiracy theory group. I would absolutely say that the algorithmic promotion of harmful content is a real problem. Does that mean we ban algorithms? No. That would be like turning off the internet. You have to go back and ask, how it is that that kind of harm is promoted, and how is it that we are exploiting human behaviour? It is human nature to be drawn to things that we cannot resist. That is something that the Bill really needs to look at.

In the risk assessments, particularly for illegal content and content that is harmful to children, it explicitly references algorithmic promotion and the business model. Those are two really big things that you touched on in the question. The business model is to make money from our time spent online, and the algorithms serve us up the content that keeps us online. That is accounted for very well in the risk assessments. Some of the things around the safety duties do not necessarily account for that, just because you are risk assessing for it. Say you identify that our business model does promote harmful content; under the Bill, you do not have to mitigate that all the time. So I think there are questions around whether the Bill could go further on algorithmic promotion.

If you do not mind, I will quickly come back to the question you asked Eva about reporting. We just do not know whether reporting is really working because we cannot see—we cannot shine a light into these platforms. We just have to rely on them to tell us, “Hey, reporting is working. This many pieces of content were reported and this many pieces of content were taken down.” We just do not know if that is true. A big part of this regime has to be about transparency. It already is, but I think it could go much further in enabling Ofcom, Government, civil society and researchers to say, “Hey, you said that many pieces of content were reported and that many pieces of content were taken down, but actually, it turns out that none of that is true. We are still seeing that stuff online.” Transparency is a big part of the solution around understanding whether reporting is really working and whether the platforms are true to their word.

Caroline Ansell (Eastbourne) (Con)

Q May I ask a follow-up question on that? Poppy, you referenced risk assessments. Would you value and welcome more specifics around quality standards and minimum requirements on risk assessments? My main question is about privacy and anonymity, but I would appreciate a word on risk assessments.

Poppy Wood: Absolutely. I know that children’s groups are asking for minimum standards for children’s risk assessments, but I agree that they should be across the board. We should be looking for the best standards that we can get. I really do not trust the platforms to do these things properly, so I think we have to be really tough with them about what we expect from them. We should absolutely see minimum standards.

Caroline Ansell

Q Do you think Ofcom has the resources that it would require to push for an independent audit of risk assessments?

Poppy Wood: Obviously Ofcom is growing. The team at Ofcom are fantastic, and they are hiring really top talent. They have their work cut out in dealing with some of the biggest and wealthiest companies in the world. They need to be able to rely on civil society and researchers to help them to do their job, but I do not think we should rule out Ofcom being able to do these things. We should give it the powers to do them, because that makes this regime have proper teeth. If we find down the line that, actually, it is too much, that is for the Government to sort out with resourcing, or for civil society and researchers to support, but I would not want to rule things out of the Bill just because we think Ofcom cannot do them.

Caroline Ansell

Q What are your thoughts on the balance between privacy and anonymity?

Poppy Wood: Of course, the Bill has quite a unique provision for looking at anonymity online. We have done a big comparison of online safety regulations across the world, and nobody is looking at anonymity in the same way as the UK. It is novel, and with that comes risk. Let us remember that anonymity is a harm reduction mechanism. For lots of people in authoritarian regimes, and even for those in the UK who are survivors of domestic abuse or who want to explore their sexuality, anonymity is a really powerful tool for reducing harm, so we need to remember that when we are talking about anonymity online.

One of my worries about the anonymity agenda in the Bill is that it sounds really good and will resonate really well with the public, but it is very easy to get around, and it would be easy to oversell it as a silver bullet for online harm. VPNs exist so that you can be anonymous. They will continue to exist, and people will get around the rules, so we need to be really careful with the messaging on what the clauses on anonymity really do. I would say that the whole regime should be a privacy-first regime. There is much more that the regime can do on privacy. With age verification, it should be privacy first, and anonymity should be privacy first.

I also have some concerns about the watering down of privacy protections from the draft version of the Bill. I think the language was “duty to account for the right to privacy”, or something, and that right-to-privacy language has been taken out. The Bill could do more on privacy, remembering that anonymity is a harm-reducing tool.

Caroline Ansell Portrait Caroline Ansell
- Hansard - - - Excerpts

Q Eva, there is just one reference to anonymity in the Bill currently. Do you think there is an opportunity to express a fuller, more settled opinion and potentially expand on that juxtaposition?

Eva Hartshorn-Sanders: I heard the advice that the representative of the Information Commissioner’s Office gave earlier—he feels that the balance is right at the moment. It is important to incorporate freedom of speech and privacy within this framework in a democratic country. I do not think we need to add anything more than that.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Thank you to the witnesses for joining us this afternoon. May I ask for your views on the clauses on journalistic content exemption and democratic content exemption? Do you think that these measures are likely to be effective?

Poppy Wood: I know you have spoken a lot about this over the past few days, but the content of democratic importance clause is a layer of the Bill that makes the Bill very complicated and hard to implement. My concern about these layers of free speech—whether it is the journalistic exemption, the news media exemption or the content of democratic importance clause—is that, as you heard from the tech companies, they just do not really know what to do with it. What we need is a Bill that can be implemented, so I would definitely err on the side of paring back the Bill so that it is easy to understand and clear. We should revisit anything that causes confusion or is obscure.

The clause on content of democratic importance is highly problematic—not just because it makes the Bill hard to implement and we are asking the platforms to decide what democratic speech is, but because I think it will become a gateway for the sorts of co-ordinated disinformation that we spoke about earlier. Covid disinformation for the past two years would easily have been a matter of public policy, and I think the platforms, because of this clause, would have said, “Well, if someone’s telling you to drink hydroxychloroquine as a cure for covid, we can’t touch that now, because it’s content of democratic importance.”

I have another example. In 2018, Facebook said that it had identified and taken down a Facebook page called “Free Scotland 2014”—four years after it first appeared. It was a Russian/Iranian-backed page that was promoting falsehoods in support of Scottish independence using fake news websites, with articles about the Queen and Prince Philip wanting to give themselves a pay rise by stealing from the poor. It was total nonsense, but it is easily content of democratic importance. Even though it was backed by fake actors—and, as we have said, I do not think there is anything in the Bill to preclude that at the moment, or at least to get the companies to focus on it—in 2014, that content would have been content of democratic importance, and the platforms took four years to take it down.

I think this clause would mean that that stuff became legitimate. It would be a major loophole for hate and disinformation. The best thing to do is to take that clause out completely. Clause 15(3) talks about content of democratic importance applying to speech across a diverse range of political opinion. Take that line in that subsection and put it in the freedom of expression clause—clause 19. What you then have is a really beefed-up freedom of expression clause that talks about political diversity, but you do not have layers on top of it that mean bad actors can promote hate and disinformation. I would say that is a solution, and that will make the Bill much easier to implement.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Thank you, Poppy. Eva?

Eva Hartshorn-Sanders: I think the principle behind the duty is correct and that they should consider the democratic importance of content when they are making moderation decisions, but what we know from our work is that misinformation and disinformation on social media poses a real threat to elections and democracies around the world. As an international organisation, we have studied the real harms caused by online election disinformation in countries like the US. We saw websites like The Gateway Pundit profit from Google ads to the tune of over $1 million while spreading election disinformation. That has led to real-world death threats sent to election officials and contributed to the events of 6 January. It is not something we want to see replicated in the UK.

The problem with the democratic importance duty is that it is framed negatively about preventing platforms from removing content, rather than positively about addressing content that undermines elections. That is concerning because it is the latter that has proved to be damaging in the real world. I think where we are getting to is that there should be a positive duty on platforms to act on content that is designed and intended to undermine our democracy and our elections.

To add to that, the Joint Committee on the draft Bill looked specifically at having misinformation and disinformation on elections and public health on the face of the Bill rather than leaving it to secondary legislation. That is a position that we would support. The type of harm we have seen over the last couple of years through covid is a known harm and it is one that we should be addressing. It has led to the deaths of millions of people around the world.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q That is really helpful; thank you. You raised the point about the abuse that was directed at election officials in America. Do you think it should almost be a stand-alone offence to send harmful or threatening communications to elected people—MPs, councillors, mayors or police and crime commissioners—or possibly even election officials, the people who are involved in the democratic process, because of the risk that such abuse and threats pose to democracy?

Eva Hartshorn-Sanders: Obviously abuse is unacceptable, and there have been real issues with that globally and, I know from the work we have done with MPs here, in the UK, including through the misogyny research. I guess this is the balance—people may have concerns about legitimate political decisions that are being made—but that is why you have an independent regulator who can assess that content.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Poppy, do you have any thoughts on that?

Poppy Wood: We are seeing people who put themselves forward in public life receiving all sorts of horrible abuse, which was cited as a big reason for women and people of colour removing themselves from public life in recent elections. My understanding is that the threatening communications offences brought in under the illegal duties will probably cover quite a lot of that. The idea that Eva just gave of an election risk assessment or something might, coupled with the threatening communications offences, mean that you are accounting for how your platform promotes that sort of hate.

One of the things that you would want to try to avoid is making better protections for politicians than for everyone else, but I think that threatening communications already covers some of that stuff. Coupled with an elections risk assessment, that would hopefully mean that there are mitigating effects on the risks identified in those risk assessments to tackle the sorts of things that you were just talking about.

Eva Hartshorn-Sanders: Just to add to that, from our work on “Don’t Feed the Trolls”, we know that a lot of these hate campaigns are quite co-ordinated. There is a whole lot of supporting evidence behind that. They will often target people who raise themselves up in whatever position, whether elected or a different type. The misogyny report we have just done had a mix of women who were celebrities or just had a profile and a large Instagram following and who were, again, subject to that abuse.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Should there be more in the Bill with a specific reference to violence against women and girls, abuse and threats, and misogyny?

Eva Hartshorn-Sanders: There are definitely parts of the Bill that could be strengthened in that area. Part of that relates to incels and how they are treated, or not, as a terrorist organisation; or how small sites might be treated under the Bill. I can elaborate on that if you like.

None Portrait The Chair
- Hansard -

Thank you. Minister.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you for joining us this afternoon and for giving us your evidence so far. At the beginning of your testimony, Ms Hartshorn-Sanders, I think you mentioned—I want to ensure I heard correctly—that you believe, or have evidence, that Instagram is still, even today, failing to take down 90% of inappropriate content that is flagged to it.

Eva Hartshorn-Sanders: Our “Hidden Hate” report was on DMs—direct messages—that were shared by the participants in the study. One in 15 of those broke the terms and conditions that Instagram had set out relating to misogynist abuse—sexual abuse. That was in the wake of the World Cup, so after Instagram had done a big promotion about how great it was going to be in having policies on these issues going forward. We found that 90% of that content was not acted on when we reported it. This was not even them going out proactively to find the content and not doing anything with it; it was raised for their attention, using their systems.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q That clearly illustrates the problem we have. Two parts of the Bill are designed to address this: first, the ability for designated user representation groups to raise super-complaints—an issue such as the one you just mentioned, a systemic issue, could be the subject of such a super-complaint to Ofcom, in this case about Instagram—and, secondly, at clause 18, the Bill imposes duties on the platforms to have proper complaints procedures, through which they have to deal with complaints properly. Do those two provisions, the super-complaints mechanism for representative groups and clause 18 on complaints procedures, go a long way towards addressing the issue that you helpfully and rightly identified?

Eva Hartshorn-Sanders: That will depend on transparency, as Poppy mentioned. How much of that information can be shared? We are doing research at the moment on data that is shared personally, or is publicly available through the different tools that we have. So it is about strengthening access to that data.

There is this information asymmetry that happens at the moment, where big tech is able to see patterns of abuse. In some cases, as in the misogyny report, you have situations where a woman might be subject to abuse from one person over and over again. The way that is treated in the EU is that Instagram will go back and look at the last 30 historically to see the pattern of abuse that exists. They are not applying that same type of rigour to other jurisdictions. So it is about having access to that data in the audits that are able to happen. Everyone should be safe online, so this should be a safety-by-design feature that the companies have.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Meta claimed in evidence to the Committee on Tuesday that it gave researchers good access to its data. Do you think that is true?

Eva Hartshorn-Sanders: I think it depends on who the researchers are. I personally do not have experience of it, so I cannot speak to that. On transparency, at the moment, the platforms generally choose what they share. They do not necessarily give you the data that you need. You can hear from my accent that I am originally from New Zealand. I know that in the wake of the Christchurch mosque terrorist attack, they were not prepared to provide the independent regulator with data on how many New Zealanders had seen the footage of the livestream, which had gone viral globally. That is inexcusable, really.

None Portrait The Chair
- Hansard -

Q Ms Wood, do you want to comment on any of this before we move on?

Poppy Wood: On the point about access to data, I do not believe that the platforms go as far as they could, or even as far as they say they do. Meta have a tool called CrowdTangle, which they use to provide access to data for certain researchers who are privileged enough to have access. That does not even include comments on posts; it is only the posts themselves. The platforms pull the rug out all the time from under researchers who are investigating things that the platforms do not like. We saw that with Laura Edelson at New York University, who they just cut off—that is one of the most famous cases. I think it is quite egregious of Meta to say that they give lots of access to data.

We know from the revelations of whistleblowers that Meta do their own internal research, and when they do not like the results, they just bury it. They might give certain researchers access to data under certain provisions, but independent researchers who want to investigate a certain emergent harm or a certain problem are not being given the sort of access that they really need to get insights that move the needle. I am afraid that I just do not believe that at all.

The Bill could go much further. A provision on access to data in clause 136 states that Ofcom has two years to issue a report on whether researchers should get access to data. I think we know that researchers should have access to data, so I would, as a bare minimum, shorten the time that Ofcom has to do that report from two years to six months. You could turn that into a question of how to give researchers access to data rather than of whether they should get it. The Digital Services Act—the EU equivalent of the Bill—goes a bit further on access to data than our Bill. One result of that might be that researchers go to the EU to get their data because they can get it sooner.

Improving the Bill’s access to data provisions is a no-brainer. It is a good thing for the Government because we will see more stuff coming out of academia, and it is a good thing for the safety tech sector, because the more research is out there, the more tools can be built to tackle online harms. I certainly call on the Government to think about whether clause 136 could go further.

None Portrait The Chair
- Hansard -

Thank you. Last brief question, Minister.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Goodness! There is a lot to ask about.

None Portrait The Chair
- Hansard -

Sorry, we are running out of time.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I appreciate that; thank you, Sir Roger. Ms Wood, you mentioned misinformation in your earlier remarks—I say “misinformation” rather than “state-sponsored disinformation”, which is a bit different. It is very difficult to define that in statute and to have an approach that does not lead to bias or to what might be construed as censorship. Do you have any particular thoughts on how misinformation could be concretely and tangibly addressed?

Poppy Wood: It is not an easy problem to solve, for sure. What everybody is saying is that you do it in a content-neutral way, so that you are not talking about listing specific types of misinformation but about the risks that are built into your system and that need to be mitigated. This is a safety by design question. We have heard a lot about introducing more friction into the system, checking the virality threshold, and being more transparent. If you can get better on transparency, I think you will get better on misinformation.

If there is more of an obligation on the platforms to, first, do a broader risk assessment outside of the content that will be listed as priority content and, secondly, introduce some “harm reduction by design” mechanisms, through friction and stemming virality, that are not specific to certain types of misinformation, but are much more about safety by design features—if we can do that, we are part of the way there. You are not going to solve this problem straightaway, but you should have more friction in the system, be it through a code of practice or a duty somewhere to account for risk and build safer systems. It cannot be a content play; it has to be a systems play.

None Portrait The Chair
- Hansard -

Thank you. I am sorry, but that brings us to the end of the time allotted to this session. Ladies, if either of you wishes to make a submission in writing in the light of what you have not answered or not been able to answer, please do. Ms Wood, Ms Hartshorn-Sanders, thank you very much indeed for joining us.

Examination of Witnesses

Owen Meredith and Matt Rogerson gave evidence.

15:25
None Portrait The Chair
- Hansard -

We shall now hear from Owen Meredith, chief executive of News Media Association, and Matt Rogerson, director of public policy at Guardian Media Group.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Good afternoon, both, and thank you for coming this afternoon. We have heard a lot about the journalistic content exemption. What is your view of the current measures in the Bill and their likely consequences?

Owen Meredith: You may be aware that we submitted evidence to the Joint Committee that did prelegislative scrutiny of the draft Bill, because, although the Government’s stated intention is to have content from recognised news media publishers, whom I represent, outside the scope of the Bill, we do not believe that the drafting, as it was and still is, achieves that. Ministers and the Secretary of State have confirmed, both in public appearances and on Second Reading, that they wish to table further amendments to achieve the aim that the Government have set out, which is to ensure that content from recognised news publishers is fully out of scope of the Bill. It needs to go further, but I understand that there will be amendments coming before you at some point to achieve that.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q What further would you like to see?

Owen Meredith: I would like to see a full exemption for recognised news publisher content.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q You would like to see a full exemption. Matt, do you have any thoughts on that?

Matt Rogerson: Yes. I would step back a bit and point to the evidence that a few of your witnesses gave today and on Tuesday. I think Fair Vote gave evidence on this point. At the moment, our concern is that we do not know what the legal but harmful category of content that will be included in the Bill will look like. That is clearly going to be done after the event, through codes of practice. There is definitely a danger that news publisher content gets caught by the platforms imposing that. The reason for having a news publisher exemption is to enable users of platforms such as Facebook, Twitter and others to access the same news as they would via search. I agree with Owen’s point. I think the Government are going in the right direction with the exemption for news publishers such as the BBC, The Times and The Guardian, but we would like to see it strengthened a bit to ensure a cast-iron protection.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Currently, is the definition of journalistic content used in the Bill clear, or do you find it ambiguous?

Matt Rogerson: I think it is quite difficult for platforms to interpret that. It is a relatively narrow version of what journalism is—it is narrower than the article 10 description of what journalism is. The legal definitions of journalism in the Official Secrets Act and the Information Commissioner’s Office journalism code are slightly more expansive and cover not just media organisations but acts of journalism. Gavin Millar has put together a paper for Index on Censorship, in which he talks about that potentially being a way to expand the definition slightly.

The challenge for the platforms is, first, that they have to take account of journalistic content, and there is not a firm view of what they should do with it. Secondly, defining what a piece of journalism or an act of journalism is takes a judge, generally with a lot of experience. Legal cases involving the media are heard through a specific bench of judges—the media and communications division—and they opine on what is and is not an act of journalism. There is a real challenge, which is that you are asking the platforms to—one assumes—use machine learning tools to start with to identify what is a potential act of journalism. Then an individual, whether they are based in California or, more likely, outsourced via an Accenture call centre, determines within that whether it is an act of journalism and what to do with it. That does place quite a lot of responsibility on the platforms. Again, I would come back to the fact that I think if the Bill was stripped back to focus on illegal content, rather than legal but harmful content, you would have fewer of these situations where there was concern that that sort of content was going to be caught.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q We have heard a lot of concern about disinformation by state actors purporting to be journalists and using that exemption, which could cause harm. Do you have any thoughts on that?

Matt Rogerson: Yes, a few. The first thing that is missing from the Bill is a focus on advertising. The reason we should focus on advertising is that that is why a lot of people get involved in misinformation. Ad networks at the moment are able to channel money to “unknown” sites in ways that mean that disinformation or misinformation is highly profitable. For example, a million dollars was spent via Google’s ad exchanges in the US; the second biggest recipient of that million dollars was “Unknown sites”—sites that do not categorise themselves as doing anything of any purpose. You can see how the online advertising market is channelling cash to the sort of sites that you are talking about.

In terms of state actors, and how they relate to the definition, the definition is set out quite broadly in the Bill, and it is more lengthy than the definition in the Crime and Courts Act 2013. On top of that definition, Ofcom would produce guidance, which is subject to a full and open public consultation, which would then work out how you are going to apply the definition in practice. Even once you have that guidance in place, there will be a period of case law developing where people will appeal to be inside of that exemption and people will be thrown out of that exemption. Between the platforms and Ofcom, you will get that iteration of case law developing. So I suppose I am slightly more confident that the exemption would work in practice and that Ofcom could find a workable way of making sure that bad actors do not make use of it.

None Portrait The Chair
- Hansard -

Mr Meredith, do you wish to add to that?

Owen Meredith: No, I would echo almost entirely what Matt has said on that. I know you are conscious of time.

None Portrait The Chair
- Hansard -

Thank you. Maria Miller.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q A great deal of the discussion we are having about this Bill is its scope—what is covered and what is not covered. Many of us will look regularly at newspapers online, particularly the comments sections, which can be quite colourful. Should comments on newspaper publisher platforms be included in the scope of the Bill?

Owen Meredith: Yes, I think they should be included within the news publisher exemption as it is spelt out. As far as I understand, that has always been the intention, since the original White Paper many years ago that led to where we are today. There is a very good reason for that, not least the fact that the comments on news publisher websites are still subject to the responsibility of the editor and the publisher; they are subject to the regulation of the Independent Press Standards Organisation, in the case of those publishers who are regulated under the self-regulation system by IPSO, as the majority of my members are. There is a very different environment in news publisher websites’ comments sections, where you are actively seeking to engage with those and read those as a user, whereas on social media platforms that content can come to you without you wishing to engage with it.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q Can I just probe on that slightly? You say the comments are the responsibility of the editor. Does that mean that if something is published on there that is defamatory, it would then be attributed to the editor?

Owen Meredith: Everything published by the news site is ultimately the responsibility of the editor.

Matt Rogerson: I think there are various cases. I think Delfi is the relevant case in relation to comments, where if a publisher is notified of a defamatory comment within their comments section, they are legally liable for it if they do not take it down. To speak from a Guardian perspective, we would like comments sections to be included within the exemption. The self-regulation we have in place for our comments section has been quite a journey. We undertook quite a big bit of research on all the comments that had been left over an 11-year period. We tightened up significantly the processes that we had in place. We currently use a couple of steps to make sure those comments sections are well moderated. We use machine learning against very tightly defined terms, and then every single comment that is taken down is subject to human review. I think that works in the context of a relatively small website such as The Guardian, but it would be a much bigger challenge for a platform of the size of Facebook.

None Portrait The Chair
- Hansard -

Kim Leadbeater?

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Thank you, Chair, and thank you to the witnesses. I just want to clarify something. We were talking about the journalistic content definition as it is. You are saying that you do not think it is reasonable to expect service providers to identify journalistic content using the definition contained in the Bill. Do you think the Bill should be clearer about what it means by journalistic content and journalism?

Matt Rogerson: My point is that for news publishers there is a lack of definition in the journalistic content exemption, and that platforms without the exemption would have to identify whether every piece of content on their platform was journalism, so it would be very difficult for the platforms to implement. That is why for trusted news brands such as the BBC, The Times, and The Guardian, the news media exemption is really important.

What we do not know, and what Gavin Millar suggested in his paper to Index on Censorship, is how that journalistic content exemption will be interpreted by the platforms. His fear in the paper is that the current definition means that the content has to be UK-linked. It could mean, for example, that a blog or a journalist that talks about issues in the Gulf or Ukraine would not be seen as journalistic content and therefore would not be able to take advantage of the systems that the platforms put in place. I think his view is that it should be in line with the article 10 definition of journalistic content, which would seem to make sense.

Owen Meredith: If I could add to that, speaking from my members’ perspective, they would all fall under the recognised news publisher definition. I think that is why it is an important definition. It is not an easy thing to get right, and I think the Department has done a good job in drafting the Bill. I think it captures everyone we would expect it to capture. I think actually it does set a relatively high bar for anyone else who is seeking to use that. I do not think it is possible for someone to simply claim that they are a recognised news publisher if they are operating in a way that we would not expect of such a person or entity. I think it is very important that that definition is clear. I think it is clear and workable.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q I suppose there are two separate clauses there. There is the news publisher clause and the journalistic content clause. Just so I am clear, you are happy with the news publisher clause?

Owen Meredith: Yes.

Matt Rogerson: Yes.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q What about the journalistic content clause? This is an expression that was new to me—this idea of a citizen journalist. I do not even know what that means. Are we confident that this clause, which talks about journalistic content, is the worrying one?

Owen Meredith: Matt spoke to this a little bit, but from my perspective, my focus has been on making sure that the recognised news publisher clause is right, because everything that my members publish is journalistic content. Therefore, the bulk of journalistic content that is out there will be covered by that. I think where there are elements of what else could be considered journalistic content, the journalistic content clause will pick those up.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q As journalists, does that worry you?

Matt Rogerson: I wish I was a journalist.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Sorry, as representatives of journalists.

Matt Rogerson: It worries me in the sense that we want a plural media ecosystem in this country, and we want individuals who are journalists to have their content published on platforms, so that it can be read by the 50% of the UK population that get their news from Facebook. I think it is potentially problematic that they won’t be able to publish on that platform if they talk about issues that are in the “legal but harmful” bucket of harms, as defined after the Bill is passed. I think there is concern for those groups.

There are suggestions for how you could change the clause to enable them to have more protection. As I say, Gavin Millar has outlined that in his paper. Even then, once you have got that in place, if you have a series of legal but harmful harms that are relatively unclear, the challenge for the platforms will be interpreting that and interpreting it against the journalistic content clause.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q My only concern is that someone who just decides to call themselves a journalist will be able to say what they want.

Owen Meredith: I do not think that would be allowable under the Bill, because of the distinction between a recognised news publisher publishing what we would all recognise as journalistic content, versus the journalistic content exemption. I think that is why they are treated differently.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Can I start by clarifying a comment that Owen Meredith made at the very beginning? You were commenting on where you would like the Bill to go further in protecting media organisations, and you said that you wanted there to be a wholesale exemption for recognised news publishers. I think there already is a wholesale exemption for recognised news publishers. The area where the Government have said they are looking at going further is in relation to what some people call a temporary “must carry” provision, or a mandatory right of appeal for recognised news publishers. Can I just clarify that that is what you meant?

Owen Meredith: Yes. I think the issue is how that exemption will work in practice. I think that what the Government have said they are looking at and will bring forward does address how it will operate in practice.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you. Can I move on to the question that Kim Leadbeater asked a moment ago, and that a number of Members have raised? You very kindly said a moment ago that you thought that clause 50, which sets out the definition of “recognised news publisher”, works as drafted. I would like to test that a bit, because some witnesses have said that it is quite widely drawn, and suggested that it would be relatively easy for somebody to set themselves up in a manner that met the test laid out in clause 50. Given the criticism that we have heard a few times today and on Tuesday, can you just expand for the Committee on why you think that is not the case?

Owen Meredith: As I alluded to earlier, it is a real challenge to set out this legal definition in a country that believes, rightly, in the freedom of the press as a fourth pillar of democracy. It is a huge challenge to start with, and therefore we have to set out criteria that cover the vast majority of news publishers but do not end up with a backdoor licensing system for the press, which I think we are all keen to avoid. I think it meets that criterion.

On the so-called bad actors seeking to abuse that, I have listened to and read some of the evidence that you have had from others—not extensively, I must say, due to other commitments this week—and I think that it would be very hard for someone to meet all those criteria as set out in order to take advantage of this. I think that, as Matt has said, there will clearly be tests and challenges to that over time. It will rightly be challenged in court or go through the usual judicial process.

Matt Rogerson: It seems to me that the whole Bill will be an iterative process. The internet will not suddenly become safe when the Bill receives Royal Assent, so there will be this process whereby guidance and case law are developed, in terms of what a newspaper is, against the criteria. There are exemptions for news publishers in a whole range of other laws that are perfectly workable. I think that Ofcom is perfectly well equipped to create guidance that enables it to be perfectly workable.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you. So, notwithstanding the risks that we have heard articulated, you are categorically satisfied that maleficent actors would not be able to set themselves up in such a way that they benefit from this exemption.

Matt Rogerson: Subject to the guidance developed by Ofcom, which we will be engaged in developing, I do think so. The other thing to bear in mind is that the platforms already have lists of trusted publishers. For example, Google has a list in relation to Google News—I think it has about 65,000 publishers—which it automates to push through Google News as trusted news publishers. Similarly, Facebook has a list of trusted news publishers that it uses as a signal for the Facebook newsfeed. So I do not buy the idea that you can’t automate the use of trusted news sources within those products.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you; that is very helpful. I have only one other question. In relation to questions concerning freedom of speech, the Government believe, and I believe, that the Bill very powerfully protects freedom of speech. Indeed, it does so explicitly through clause 19, in addition to the protections for recognised news publishers that we have discussed already and the additional protections for content of journalistic and democratic importance, notwithstanding the definitional questions that have been raised. Would you agree that this Bill respects and protects free speech, while also delivering the safety objectives that it quite rightly has?

Owen Meredith: If I can speak to the point that directly relates to my members and those I represent—“Does it protect press freedom?”, which is perhaps an extension of your question—I would say that it is seeking to. Given the assurances you have given about the detailed amendments that you intend to bring forward—if those are correct, and I am very happy to write to the Committee and comment once we have seen the detail, if it would be helpful to do so—and everything I have heard about what you are intending to do, I believe it will. But I do not believe that the current draft properly and adequately protects press freedom, which is why, I think, you will be bringing forward amendments.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Yes, but with the amendment committed to on Second Reading, you would say that the Bill does meet those freedom of speech objectives, subject to the detail.

Owen Meredith: Subject to seeing the drafting, but I believe the intention—yes.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Thank you. That is very helpful. Mr Rogerson?

Matt Rogerson: As we know, this is a world first: regulation of the internet, regulation of speech acts on the internet. From a news publisher perspective, I think all the principles are right in terms of what the Government are trying to do. In terms of free speech more broadly, a lot of it will come down to how the platforms implement the Bill in practice. Only time will tell in terms of the guidance that Ofcom develops and how the platforms implement that at vast scale. That is when we will see what impact the Bill actually has in practice.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q From a general free speech perspective—which obviously includes the press’s freedom of speech, but everybody else’s as well—what do you think about the right enshrined in clause 19(2), where for the first time ever the platforms’ duty to have regard to the importance of protecting users’ right to freedom of speech is put on the face of a Bill? Do you think that is helpful? It is a legal obligation they do not currently have, but they will have it after the passage of the Bill. In relation to “legal but harmful” duties, platforms will also have an obligation to be consistent in the application of their own terms and conditions, which they do not have to be at the moment. Very often, they are not consistent; very often, they are arbitrary. Do you think those two changes will help general freedom of speech?

Matt Rogerson: Yes. With the development of the online platforms to the dominant position they are in today, that will be a big step forward. The only thing I would add is that, as well as this Bill, the other Bill that will make a massive difference when it comes through is the digital markets unit Bill. We need competition to Facebook so that consumers have a choice and so that they can decide which social network they want to be on, not just the one dominant social network that is available to them in this country.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I commend your ingenuity in levering an appeal for more digital competition into this discussion. Thank you.

None Portrait The Chair
- Hansard -

One final quick question from the Opposition Front Bench.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Mr Rogerson, you mentioned that platforms and tech companies currently have a list of approved broadcasters that they are enabled to use, to ensure they have that content. Isn’t it true that one of those broadcasters was Russia Today, and it was only because Ofcom intervened to remove it from social media that it was taken down, but under the current provisions in this Bill, Ofcom would not be able to do that and Russia Today would be allowed to spread disinformation on social media platforms?

Matt Rogerson: On the Russia Today problem, I think Russia Today had a licence from Ofcom, so the platforms probably took their cue from the fact that Russia Today was beamed into British homes via Freeview. Once that changed, the position of having their content available on social media changed as well. Ultimately, if it was allowed to go via broadcast, if it had a broadcast licence, I would imagine that social media companies took that as meaning that it was a—

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q But under the new Bill, as journalistic content, it would be allowed to remain on those social media platforms.

Matt Rogerson: I think that would be subject to the guidance that Ofcom creates and the consultation on that guidance. I do not believe that Russia Today would be allowed under the definitions. If it is helpful, I could write to you to set out why.

None Portrait The Chair
- Hansard -

Mr Meredith, Mr Rogerson, thank you very much. If you have any further comments that you wish to make, you are free to put them in writing.

Examination of Witnesses

Tim Fassam, Rocio Concha and Martin Lewis gave evidence.

15:50
None Portrait The Chair
- Hansard -

We will now hear from Tim Fassam, the director of government relations and policy at PIMFA, the Personal Investment Management & Financial Advice Association, and from Rocio Concha, director of policy and advocacy at Which? We will be joined by Martin Lewis, of MoneySavingExpert, in due course. Thank you to the witnesses for joining us. I call the Opposition Front Bench.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Thank you for joining us this afternoon. As a constituency MP, I am sure I am not alone in saying that a vast amount of my casework comes from members of my community writing to me to say that they have been scammed online, that they have been subject to fraud and that they feel horrendous about it. They feel shame and they do not know what to do about it. It is the single biggest crime in the UK, with victims losing an estimated £2.3 billion. In your opinion, does the Bill go far enough to tackle that?

Rocio Concha: This Bill is very important in tackling fraud. It is very important for Which? We were very pleased when fraud was included to tackle the issue that you mentioned and also when paid-for advertising was included. It was a very important step, and it is a very good Bill, so we commend DCMS for producing it.

However, we have found some weakness in the Bill, and those can be solved with very simple amendments, which will have a big impact on the Bill in terms of achieving its objective. For example, at the moment in the Bill, search engines such as Google and Yahoo! are not subject to the same duties in terms of protecting consumers from fraudulent advertising as social media platforms are. There is no reason for Google and Yahoo! to have weaker duties in the Bill, so we need to solve that.

The second area is boosted content. Boosted content is user-generated content, but it is also advertising. Under the current definition of fraudulent advertising in the Bill, boosted content is not covered. For example, if a criminal makes a Facebook page and starts publishing things about fake investments, and then he pays Facebook to boost that content in order to reach more people, the Bill, at the moment, does not cover that fraudulent advertising.

The last part is that, at the moment, the risk checks that platforms need to do for priority illegal content, the transparency reporting that they need to do to basically say, “We are finding this illegal content and this is what we are doing about it,” and the requirement to have a way for users to tell them about illegal content or complain about something that they are not doing to tackle this, only apply to priority illegal content. They do not apply to fraudulent advertising, but we think they need to.

Paid-for advertising is the most effective way that criminals have to reach a lot of people. The good news, as I said before, is that this can be solved with very simple amendments to the Bill. We will send you suggestions for those amendments and, if we fix the problem, we think the Bill will really achieve its objective.

None Portrait The Chair
- Hansard -

One moment—I think we have been joined by Martin Lewis on audio. I hope you can hear us, Mr Lewis. You are not late; we started early. I will bring you in as soon as we have you on video, preferably, but otherwise on audio.

Tim Fassam: I would echo everything my colleague from Which? has said. The industry, consumer groups and the financial services regulators are largely in agreement. We were delighted to see fraudulent advertising and wider issues of economic crime included in the Bill when they were not in the initial draft. We would also support all the amendments that Which? are putting forward, especially the equality between search and social media.

Our members compiled a dossier of examples of fraudulent activity, and the overwhelming majority of the examples of fraudulent adverts were on search, rather than on social media. We would also argue that search is potentially higher risk, because the act of searching is an indication that you may be ready to take action. If you are searching “invest my pension”, hopefully you will come across Martin’s site or one of our members’ sites, but if you come across a fraudulent advert in that moment, you are more likely to fall foul of it.

We would also highlight two other areas where we think the Bill needs further work. These are predominantly linked to the interaction between Ofcom, the police and the Financial Conduct Authority, because the definitions of fraudulent adverts and fraudulent behaviour are technical and complex. It is not reasonable to expect Ofcom to be able to ascertain whether an advert or piece of content is in breach of the Financial Services and Markets Act 2000; that is the FCA’s day job. Is it fraud? That is Action Fraud’s and the police’s day job. We would therefore suggest that the Bill go as far as allowing the police and the FCA to direct Ofcom to have content removed, and creating an MOU that enables Ofcom to refer things to the FCA and the police for their expert analysis of whether it breaches those definitions of fraudulent adverts or fraudulent activity.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Thank you, both. You mentioned that search is a concern, especially because it is currently out of scope of the Bill in terms of this issue. Another issue is that when people do use search to look for a financial service or something that they wish to purchase, the cookies are remembered. The algorithms on social media platforms are then triggered to promote specific adverts to them as a result of that search history or things they have mentioned via voice control to their home help devices. That is a concern. Digital advertising that you see on third-party websites is also not within scope. That has been raised as well. Do you have any thoughts on those points?

Rocio Concha: Yes. Open-display advertising is not part of the Bill. That also needs to be tackled. I think the online advertising programme should be considered, to tackle this issue. I agree with you: this is a very important step in the right direction, and it will make a huge difference if we fix this small weakness in terms of the current scope. However, there are still areas out there that need to be tackled.

None Portrait The Chair
- Hansard -

Mr Lewis, I am living in hope that we may be able to see you soon—although that may be a forlorn hope. However, I am hoping that you can hear us. Do you want to come in and comment at all at this point? [Interruption.] Oh, we have got you on the screen. Thank you very much for joining us.

Martin Lewis: Hurrah. I am so sorry, everybody—for obvious reasons, it has been quite a busy day on other issues for me, so you’ll forgive me.

None Portrait The Chair
- Hansard -

I can’t think why it has been.

Martin Lewis: I certainly agree with the other two witnesses. Those three issues are all very important to be brought in. From a wider perspective, I was vociferously campaigning to have scam adverts brought within the scope of the Online Safety Bill. I am delighted that that has happened, but let us be honest among ourselves: it is far from a panacea.

Adverts and scams come in so many places—on social media, in search engines and in display advertising, which is very common and is not covered. While I accept that the online advertising programme will address that, if I had my way I would be bringing it all into the Online Safety Bill. However, the realpolitik is that that is not going to happen, so we have to have the support in the OAP coming later.

It is also worth mentioning just for context that, although I think there is little that we can do about this—or it would take brighter people than me—one of the biggest routes for scams is email. Everybody is being emailed—often with my face, which is deeply frustrating. We have flaccid policing of what is going on on social media, and I hope the Bill will improve it, but at least there is some policing, even though it is flaccid, and it is the same on search engines. There is nothing on email, so whatever we do in this Bill, it will not stop scams reaching people. There are many things that would improve that, certainly including far better resourcing for policing so that people who scam individuals get at least arrested and possibly even punished and sentenced. Of course, that does not happen at the moment, because scamming is a crime that you can undertake with near impunity.

There is a lot that needs to be done to make the situation work, but in general the moves in the Online Safety Bill to include scam advertising are positive. I would like to see search engines and display advertising brought into that. I absolutely support the call for the FCA to be involved, because what is and is not a scam can certainly be complicated. There are more obvious ones and less obvious ones. We saw that with the sale of bonds at 5% or 6%, which pretend to be deposit bonds but are nothing of the sort. That might get a bit more difficult for Ofcom, and it would be great to see the regulator involved. I support all the calls of the other witnesses, but we need to be honest with ourselves: even if we do all that, we are still a long way from seeing the back of all scam adverts and all scams.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Thank you, Mr Lewis. My final question is not necessarily about financial services advertising. With the rise of influencer culture, specifically on social media platforms such as TikTok and Instagram, we are seeing a failure to disclose adverts correctly and the potential for harmful advertising. Slimming products, for example, that are not particularly safe, especially for children, are being targeted at children. What more would you like to see this Bill do to tackle some of that? I know the ASA has taken action against some prolific offenders, but what more would you like to see in this Bill to tackle that and keep children safe from adverts that are not marked as such?

Rocio Concha: To be honest, in this area we do not have any specific proposals. I completely agree with you that this is an area that needs to be tackled, but I do not have a specific proposal for this Bill.

Tim Fassam: This is an area that we have raised with the Financial Conduct Authority—particularly the trend for financial advice on TikTok and adverts for non-traditional investments, such as whisky barrels or wine, which do not meet the standards required by the FCA for other investment products. That is also true of a number of cryptocurrency adverts and formats. We have been working with the FCA to try to identify ways to introduce more consistency in the application of the rule. There has been a welcome expansion by the Treasury on the promotion of high-risk investments, which is now a regulated activity in and of itself.

I go back to my initial point. We do not believe that there is any circumstance in which the FCA would want content in any place taken down where that content should not be removed, because they are the experts in identifying consumer harm in this space.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Mr Lewis, do you have anything to add?

Martin Lewis: I still believe that most of this comes down to an issue of policing. The rules are there and are not being enforced strongly enough. The people who have to enforce the rules are not resourced well enough to do that. Therefore, you get people who are able to work around the rules with impunity.

Advertising in the UK, especially online, has been the wild west for a very long time, and it will continue to be so for quite a while. The Advertising Standards Authority is actually better at dealing with the influencer issue, because of course it is primarily strong at dealing with people who listen to the Advertising Standards Authority. It is not very good at dealing with criminal scammers based outside the European Union, who frankly cannot be bothered and will not reply—they are not going to stop—but it is better at dealing with influencers who have a reputation.

We all know it is still extremely fast and loose out there. We need to adequately resource it; putting rules and laws in place is only one step. Resourcing the policing and the execution of those rules and laws is a secondary step, and I have doubts that we will ever quite get there, because resources are always squeezed and put on the back burner.

None Portrait The Chair
- Hansard -

Thank you. Do I have any questions from Government Back Benchers? No. Does anyone have any further questions?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Yes, I do. If nobody else has questions, I will have another bite of the cherry.

None Portrait The Chair
- Hansard -

The Minister is going to come in in a minute.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q I would just like to query your thoughts on a right to redress for victims. Do you think that having an ombudsman in the Bill would be appropriate, and what would you like to see to support victims of fraud?

Martin Lewis: As you will know, I had to sue Facebook for defamation, which is a ridiculous thing to do in order to stop scam adverts. I was unable to report the scam adverts to the police, because I had not been scammed—even though it was my face that was in them—and many victims were not willing to come forward. That is a rather bizarre situation, and we got Facebook to put forward £3 million to set up Citizens Advice Scam Action—that is what I settled for, as well as a scam ad reporting tool.

There are two levels here. The problem is who is at fault. Of course, those mainly at fault for scams are the scammers. They are criminals and should be prosecuted, but not enough of them are. You have times when it is the bank’s fault. If a company has not put proper precautions in place, and people have got scammed because it has put up adverts or posts that it should have prevented, they absolutely need to have some responsibility for that. I think you will struggle to have a direct redress system put in place. I would like to see it, but it would be difficult.

It is rather interesting to me that I am worried that the £3 million for Citizens Advice Scam Action, which was at least meant to provide help and support for victims of scams, is going to run out. I have not seen any more money coming from Facebook, Google or any of the other big players out there. If we are not going to fund direct redress, we could at least make sure that they fund a collective form of redress and help for the victims of scams, as a bare minimum. It is very strange that these firms go so quiet on this, and what they say is, “We are doing everything we can.”

From my meetings with these firms—these are meetings with lawyers in the room, so I have to be slightly careful—one of the things that I would warn the Committee about is that they tend to get you in and give you a presentation on all the technological reasons why they cannot stop scam adverts. My answer to them after about 30 seconds, having stopped what was meant to be an hour-long presentation, is, “I have not framed the fact that you need a technological solution. I have said you need a solution. If the answer to stopping scam adverts, and to stopping scams, is that you have to pre-vet every single advert, as old-fashioned media did, and that every advert that you put up has to have been vetted by a human being, so be it. You’re making it a function of technology, but let’s be honest: this is a function of profitability.” We have to look at the profitability of these companies when it comes to redress. Your job—if you forgive me saying this—is to make sure that it costs them more money to let people be scammed than it does to stop people being scammed. If we solve that, we will have a lot fewer scams on social media and in search advertising.

Rocio Concha: I completely agree with everything that Martin says. At the moment, the provisions in the Bill for “priority illegal content” require the platforms to publish reports that say, “This is how much illegal content we are seeing on the platform, and these are the measures that we are going to take.” They are also required to have a way for users to report it and to complain when they think that the platforms are not doing the right thing. At the moment, that does not apply to fraudulent advertising, so you have an opportunity to fix that in the Bill very easily, to at least get the transparency out there. The platform has to say, “We are finding this”—that puts pressure on the platform, because it is there and is also with the regulator—“and these are the measures that we are taking.” That gives us transparency to say, “Are these measures enough?” There should also be an easy way for the user to complain when they think that platforms are not doing the right thing. It is a complex question, but there are many things in the Bill that you can strengthen in order to improve the situation.

Tim Fassam: I wonder if it would be useful to give the Committee a case study. Members may be familiar with London Capital & Finance. Now, London Capital & Finance is one of the most significant recent scams. It sold mini-bonds fraudulently, at a very high advertised return, which then collapsed, with individuals losing all their money.

Those individuals were compensated through two vehicles. One was a Government Bill, so they were compensated by the taxpayer. The others, because they were found to have been given financial advice despite LCF not having advice permissions or operating through a regulated product, went on to the Financial Services Compensation Scheme, which, among others, our members pay for; legitimate financial services companies pay for it. The most recent estimate is over £650 million. The expectation is that that will reach £1 billion at some point over the next few years, in terms of cost to the economy.

LCF was heavily driven by online advertising, and we would argue that the online platforms were in fact probably the only people who could have stopped it happening. They have profited from those adverts and they have not contributed anything to either of those two schemes. We would argue—possibly not for this Bill—that serious consideration should be given to the tech platforms being part of the Financial Services Compensation Scheme architecture and contributing to the costs of scams that individuals have fallen foul of, as an additional incentive for them to get on top of this problem.

Martin Lewis: That is a very important point, but I will just pick up on what Rocio was saying. One of the things that I would like to see is a much more rigid requirement for how the reporting of scams is put in place, because I cannot see proper pre-vetting happening with these technology companies, but we can at least rely on social policing and the reporting of scams. There are many people who recognise a scam, just as there are many people who do not.

However, I also think this is a wonderful opportunity to make sure that the method, the language and the symbols used for reporting scams are universal in the UK, so that whatever site you are on, if you see a scam advert you click the same symbol and go through a unified process that works in a very similar way on every site. That makes it simpler, we can train people in how to do it, and we can make the processes work.

Then, of course, we have to make sure that they act on the back of reports. But the various ways reporting works, the complexity, and the number of clicks that you need to make mean that it is generally a lot easier to click on an advert than it is to report an advert that is a scam. With so many scams out there, I think there should be parity of ease between those two things.

Caroline Ansell Portrait Caroline Ansell
- Hansard - - - Excerpts

Q May I ask, directly related to that, about the complaints procedure? What would you like to see in terms of changes there, to make it more unified, more universal and simpler? It has been suggested that it is not robust enough, not dynamic enough and not fast enough.

Rocio Concha: That relates to complaints from users. At the moment, the Bill does not require this for fraudulent advertising. So we need to make sure that it is a requirement for the platforms to provide an easy tool for people to complain and to report when they see something that is fraudulent. At the moment, the Bill does not do that. It is an easy fix; you can do it. And then the user will have that tool. It would also give transparency to the regulator and to organisations such as ours, to see what is happening and to see what measures the platforms are taking.

Tim Fassam: I would agree with that. I would also highlight a particular problem that our members have flagged, and we have flagged directly with Meta and Instagram. Within the definition in the Bill of individuals who can raise concern about social media platforms, our members find they fall between two stools, because quite often what is happening is that people are claiming an association with a legitimate firm. So they will have a firm’s logo, or a firm’s web address, in their profile for their social media and then they will not directly claim to be a financial adviser but imply an association with a legitimate financial advice firm. This happens surprisingly frequently.

Our members find it incredibly difficult to get those accounts taken down, because it is not a fraudulent account in the usual sense: the individual is not pretending to be someone else. They are not directly claiming to be an employee; they could just say they are a fan of the company. And the firm is not a direct victim of that individual. What happens is that when they report, it goes into a volume algorithm, and only if a very large number of complaints are made does that particular account get taken down. I think that could be expanded to include complaints from those affected by the account, rather than only from the person the account is claiming to be.

None Portrait The Chair
- Hansard -

Mr Lewis, you were nodding.

Martin Lewis: I was nodding—I was smiling and thinking, “If it makes you feel any better, Tim, I have pictures of me that tell people to invest money that are clearly fake, because I don’t do any adverts, and it still is an absolute pain in the backside for me to get them taken down, having sued Facebook.” So, if your members want to feel any sense of comradeship, they are not alone in this; it is very difficult.

I think the interesting thing is about that volumetric algorithm. Of course, we go back to the fact that these big companies like to err on the side of making money and err away from the side of protecting consumers, because those two, when it comes to scams, are diametrically opposed. The sooner we tidy it up, the better. You could have a process where once there has been a certain number of reports—I absolutely get Tim’s point that in certain cases there is not a big enough volume—the advert is taken down and then the company has to proactively decide to put it back up and effectively say, “We believe this is a valid advert.” Then the system would certainly work better, especially if you bring down the required number of reports. At the moment, I think, there tends to be an erring on the side of, “Keep it up as long as it’s making us money, unless it absolutely goes over the top.”

Many tech experts have shown me adverts with my face in, on various social media platforms. They say it would take them less than five minutes to write a program to screen them out, but those adverts continue to appear. We just have to be conscious here that there is often a move towards self-regulation. Let me be plain, as I am giving evidence. I do not trust any of these companies to have the user and the consumer interest at heart when it comes to their advertising; what they have at heart is their own profits, so if we want to stop them, we have to make this Bill robust enough to stop them, because that is the only way it will stop. Do not rely on them trying to do good, because they are trying to make profit and they will err on the side of that over the side of protecting individuals from scam adverts.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q I thank the witnesses for coming. In terms of regulation, I was going to ask whether you believe that Ofcom is the most suitable regulator to operate in this area. You have almost alluded to the fact that you might not. On that basis, should we specify in the Bill a duty for Ofcom to co-operate with other regulators—for example, the Competition and Markets Authority, the Financial Conduct Authority, Action Fraud or whoever else?

Tim Fassam: I believe that would be helpful. I think Ofcom is the right organisation to manage the relationship with the platforms, because it is going to be much broader than the topics we are talking about in our session, but we do think the FCA, Action Fraud and potentially the CMA should be able to direct, and be very clear with Ofcom, that action needs to be taken. Ofcom should have the ability to ask for things to be reviewed to see whether they break the rules.

The other area where we think action probably needs to be taken is where firms are under investigation, because the Bill assumes it is clear cut whether something is fraud, a scam, a breach of the regulations or not. In some circumstances, that can take six months or a year to establish through investigation. We believe that if, for example, the FCA feels that something is high risk, it should be able to ask Ofcom to suspend an advert, or a firm from advertising, pending an investigation to assess whether it is a breach of the regulation.

Rocio Concha: I agree that Ofcom is the right regulator, the main regulator, but it needs to work with the other regulators—with the FCA, ASA and CMA—to enforce the Bill effectively. There is another area. Basically, we need to make sure that Ofcom and all the regulators involved have the right resources. When the initial version of the Bill was published, Ofcom got additional resources to enable it to enforce the Bill. But the Bill has increased in scope, because now it includes fraud and fraudulent advertising. We need to make sure that Ofcom has the right resources to enforce the full Bill effectively. That is something that the Government really need to consider.

Martin Lewis: I was going to make exactly that point, but it has just been made brilliantly so I will not waste your time.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I thank the witnesses for joining us this afternoon, and particularly Martin Lewis for his campaigning in this area.

I will start by agreeing with the point that Martin Lewis made a minute or two ago—that we cannot trust these companies to work on their own. Mr Lewis, I am not sure whether you have had a chance to go through clause 34, which we inserted into the Bill following your evidence to the Joint Committee last year. It imposes a duty on these companies to take steps and implement systems to

“prevent individuals from encountering content consisting of fraudulent advertisements”.

There is a clear duty to stop them from doing this, rather as you were asking a minute ago when you described the presentation. Does that strong requirement in clause 34, to stop individuals from encountering fraudulent advertisement content, meet the objective that you were asking for last year?

Martin Lewis: Let me start by saying that I am very grateful that you have put it in there and thankful that the Government have listened to our campaign. What I am about to say is not intended as criticism.

It is very difficult to know how this will work in practice. The issue is all about thresholds. How many scam adverts can we stomach? I still have, daily—even from the platform that I sued, never mind the others—tens of reports directly to me of scam adverts with my face on. Even though there is a promise that we will try to mitigate that, the companies are not doing it. We have to have a legitimate understanding that we are not going to have zero scam adverts on these platforms; unless they were to pre-vet, which I do not think they will, the way they operate means that will not happen.

I am not a lawyer but my concern is that the Bill should make it clear, and that any interpretation of the Bill from Ofcom should be clear, about exactly what threshold of scam adverts is acceptable—we know that they are going to happen—and what threshold is not acceptable. I do not have the expertise to answer your question; I have to rely on your expertise to do that. But I ask the Committee to think properly about what the threshold level should be.

What is and is not acceptable? What counts as “doing everything they can”? They are going to get big lawyers involved if you say there must be zero scam adverts—that is not going to happen. How many scam adverts are acceptable and how many are not? I am so sorry to throw that back as a question when I am a witness, but I do not have the expertise to answer. But that is my concern: I am not 100% convinced of the threshold level that you are setting.

None Portrait The Chair
- Hansard -

Q Mr Fassam, do you have the answer?

Tim Fassam: I think we are positive about the actions that have been taken regarding social media; our concern is that the clause is not applied to search and that it excludes paid-for ads that are also user-generated content—promoted tweets or promoted posts, for example. We would want to ensure that it applied to all paid-for adverts and that it was consistent between social media and search.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Mr Fassam, I will address those two questions, if I may. Search is covered by clause 35, and user-generated content is subject to the Bill’s general provisions on user-generated content. Included in the scope of those are the priority illegal offences defined in schedule 7. Among them, on page 185—not that I expect you to have memorised the Bill—are financial services offences, including a number of offences to do with pretending to carry out regulated financial activity when in fact you are not regulated. Also included are the fraud offences—the various offences under the Fraud Act 2006. Do come back if you think I have this wrong, but I believe that we have search covered in clause 35 and promoted user-generated content covered via schedule 7, on page 185.

Tim Fassam: You absolutely do, but to a weaker standard than in clause 34.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q In clause 35 there is the drafting point that we are looking at. It says “minimise the risk” instead of “prevent”. You are right to point out that drafting issue. In relation to the user-generated stuff, there is a duty on the platforms to proactively stop priority illegal content, as defined in schedule 7. I do take your drafting point on clause 35.

Tim Fassam: Thank you.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I want to pick up on Martin Lewis’s point about enforcement. He said that he had to sue Facebook himself, which was no doubt an onerous, painful and costly enterprise—at least costly initially, because hopefully you got your expenses back. Under the Bill, enforcement will fall to Ofcom. The penalties that social media firms could be handed by Ofcom for failing to meet the duties we have discussed include a fine amounting to 10% of global revenue as a maximum, which runs into billions of pounds. Do the witnesses feel that level of sanction—10% of global revenue and ultimately denial of service—is adequately punitive? Will it provide an adequate deterrent to the social media firms that we are considering?

None Portrait The Chair
- Hansard -

Mr Lewis, as you were named, I think you had better start.

Martin Lewis: Ten per cent. of the global revenue of a major social media or search player is a lot of money—it certainly would hit them in the pocket. I reiterate my previous point: it is all about the threshold at which that comes in and how rigidly Ofcom is enforcing it. There are very few organisations that have the resources, legally, to take on big institutions of state, regulators and Governments. If any do, it is the gigantic tech firms. Absolutely, 10% of global revenue sounds like a suitable wall to prevent them jumping over. That is the aim, because we want those companies to work for people; we don’t want them to do scam ads. We want them to work well, and we want them never to be fined, because there is no reason to fine them.

The proof of the pudding will be in how robust Ofcom feels it can be, off the back of the Bill, taking those companies on. I go back to needing to understand how many scam ads you permit under the duty to prevent scam ads. It clearly is not zero—you are not going to tell me it is zero. So how many are allowed, what are the protocols that come into place and how quickly do they have to take the ads down? Ultimately, I think that is going to be a decision for Ofcom, but it is the level of stringency that you put on Ofcom in order for it to interpret how it takes that decision that is going to decide whether this works or not.

Rocio Concha: I completely agree with Martin. Ofcom needs to have the right resources in order to monitor how the platforms are doing that, and it needs to have the right powers. At the moment, Ofcom can ask for information in a number of areas, including fraud, but not advertising. We need to make sure that Ofcom can ask for that information so that it can monitor what the platforms are doing. We need to make sure that it has the right powers and the right resources to enforce the Bill effectively.

Tim Fassam: You would hope that 10% would certainly be a significant disincentive. Our focus would be on whether companies are contributing to compensating the victims of fraud and scams, and whether they have been brought into the architecture that is used to compensate those victims. That would be the right aim in terms of financial consequences for the firms.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I have one final question that again relates to the question of reporting scams, which I think two or three witnesses have referred to. I will briefly outline the provisions in the Bill that address that. I would like to ask the witnesses if they think those provisions are adequate. First, in clause 18, the Bill imposes on large social media firms an obligation to have a proper complaints procedure so that complaints are not ignored, as appears to happen on a shockingly frequent basis. That is at the level of individual complaints. Of course, if social media firms do not do that, it will be for Ofcom to enforce against them.

Secondly, clauses 140 and 141 contain a procedure for so-called super-complaints, where a body that represents users—it could be Which? or an organisation like it—is able to bring something almost like a class action or group complaint to Ofcom if it thinks a particular social media firm has systemic problems. Will those two clauses address the issue of complaints not being properly handled or, in some cases, not being dealt with at all?

Martin Lewis: Everything helps. I think the super-complaint point is really important. We must remember that many victims of scams are not so good at complaining and, by the nature of the crossover of individuals, there is a huge mental health issue at stake with scams. There is both the impact on people with mental health issues and the impact on people’s mental health of being scammed, which means that they may not be as robust and up for the fight or for complaining. As long as it works and applies to all the different categories that are repeated here, the super-complaint status is a good measure.

We absolutely need proper reporting lines. I urge you, Minister—I am not sure that this is in the Bill—to standardise this so that we can talk about what someone should do when they report: the same imagery, the same button. With that, people will know what to do. The more we can do that, the easier and better the system will be.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q That is a really important point—you made it earlier—about the complaints process being hidden. Clause 18(2)(c) says that the complaints system must be

“easy to access, easy to use (including by children) and transparent.”

The previous paragraph, (b), states that the system

“provides for appropriate action to be taken by the provider of the service in response to complaints of a relevant kind”.

The Bill is saying that a complaints process must do those two things, because if it does not, Ofcom will be on the company’s back.

Martin Lewis: I absolutely support all of that. I am just pushing for that tiny bit more leadership, whether it is from you or Ofcom, that comes up with a standardised system with standardised imagery and placing, so that everybody knows that on the top left of the advert you have the button that you click to fill in a form to report it. The more we have that across platforms, search and social media, the easier it will be for people. I am not sure it is something for the Bill itself, but Government leadership would work really well on that.

Tim Fassam: They are both welcome—the super-complaint and the new complaints process. We want to ensure that we have a system that looks not just at weight of number of complaints, but at the content. In particular, you may find on the super-complaint point that, for example, the firm that a fraudster is pretending to be is the organisation that has the best grasp of the issue, so do not forget about commercial organisations as well as consumer organisations when thinking about who is appropriate to make super-complaints.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Well, your organisation, as one that represents firms in this space, could in fact be designated as a super-complainant to represent your members, as much as someone like Which? could be designated to represent the man on the street like you or me.

Tim Fassam: Absolutely. We suggested to Meta when we met them about 18 months ago that we could be a clearing house to identify for them whether they need to take something seriously, because our members have analysed it and consider it to represent a real risk.

None Portrait The Chair
- Hansard -

Last word to Rocio Concha.

Rocio Concha: I completely agree about the super-complaint. We as a consumer organisation have super-complaint powers with other regulators, and we would like to have them in this context as well. We have made many super-complaints representing consumers in particular areas to those regulators, so I think we need that in this Bill as well.

On reporting, I want to clarify something. At the moment, the Bill does not require platforms to give users a way to complain about and report fraudulent advertising. That requirement exists for priority illegal content, but our assessment of the Bill is that it is unclear whether it applies to fraudulent advertising. We probably do not have time to look at this now, but we sent you amendments covering where we thought the Bill had weaknesses. We agree with you that users should have an easy and transparent way to report illegal or fraudulent advertising, and they should have an easy way to complain about it. At the moment, it is not clear that the Bill will require that for fraudulent advertising.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Yes, that is a very good question. Please do write to us about that. Clause 140, on super-complaints, refers to “regulated services”. My very quick, off-the-cuff interpretation is that that would include everything covered and regulated by the Bill. I notice that there is a reference to user-to-user services in clause 18. Do write to us on that point. We would be happy to look at it in detail. Do not take my comment as definitive, because I have only just looked at it in the last 20 seconds.

Rocio Concha: My comment was in relation not to the super-complaints but to the requirements. We already sent you our comments with suggestions on how you can fix this in the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am very grateful. Thank you.

None Portrait The Chair
- Hansard -

Ms Concha and Mr Fassam, thank you very much. Do please write in if you have further comments. Mr Lewis, we are deeply grateful to you. You can now go back to your day job and tell us whether we are going to be worse or better off as a result of the statement today—please don’t answer that now.

Martin Lewis: I am interviewing the Chancellor in 15 minutes.

None Portrait The Chair
- Hansard -

Thank you all very much.

Examination of Witness

Frances Haugen gave evidence.

16:36
None Portrait The Chair
- Hansard -

We now have Frances Haugen, a former Facebook employee. Thank you for joining us.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Good afternoon, Frances. Thank you for joining us.

Frances Haugen: Thank you so much for inviting me.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

No problem. Could you give us a brief overview of how, in your opinion, platforms such as Meta will be able to respond to the Bill if it is enacted in its current form?

Frances Haugen: There are going to be some pretty strong challenges in implementing the Bill as it is currently written. I want to be really honest with you about the limitations of artificial intelligence. We call it artificial intelligence, but people who actually build these systems call it machine learning, because it is not actually intelligent. One of the major limitations in the Bill is that there are carve-outs, such as “content of democratic importance”, that computers will not be able to distinguish. That might have very serious implications. If computers cannot tell whether something is or is not hate speech, imagine a concept even more ambiguous that requires even more context, such as defining what is of democratic importance. If we have carve-outs like that, it may actually prevent the platforms from doing any content moderation, because they will never know whether a piece of content is safe or not safe.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q You have just answered my question on AI and algorithmic intention. When I questioned Meta in Tuesday’s oral evidence session, they were unable to tell me how many human moderators they had directly working for them and how many had abided by a UK standard and code of conduct. Do you see the lack of human moderators being a problem as the Bill is enacted by platforms such as Meta?

Frances Haugen: I think it is unacceptable that large corporations such as this do not answer very basic questions. I guarantee you that they know exactly how many moderators they have hired—they have dashboards to track these numbers. The fact that they do not disclose those numbers shows why we need to pass laws to have mandatory accountability. The role of moderators is vital, especially for things like people questioning judgment decisions. Remember, no AI system is going to be perfect, and one of the major ways people can have accountability is to be able to complain and say, “This was inaccurately judged by a computer.” We need to ensure that there is always enough staffing and that moderators can play an active role in this process.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q One final question from me, because I know others will want to come in. How do you think platforms such as Meta—I know we have used Meta as an example, but there are others—can be incentivised, beyond the statutory duty that we are currently imposing, to publish their data to allow academics and researchers into their platforms to examine exactly what is going on? Or is this the only way?

Frances Haugen: All industries that live in democratic societies must live within democratic processes, so I do believe that it is absolutely essential that we the public, through our democratic representatives like yourself, have mandatory transparency. The only two other paths I currently see towards getting any transparency out of Meta, because Meta has demonstrated that it does not want to give even the slightest slivers of data—for example, how many moderators there are—are via ESG, so we can threaten them with divestment by saying, “Prosocial companies are transparent with their data,” and via litigation. In the United States, sometimes we can get data out of these companies through the discovery process. If we want consistent and guaranteed access to data, we must put it in the Bill, because those two routes are probabilistic—we cannot ensure that we will get a steady, consistent flow of data, which is what we need to have these systems live within a democratic process.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q Turning to the issue of child safety and online abuse with images involving children, what should be added to or removed from the Bill to improve how it protects children online? Have you got any thoughts on that? Some groups have described the Bill’s content as overly broad. Would you make any comments on how effective it will be in terms of online safety for children?

Frances Haugen: I am not well versed on the exact provisions in the Bill regarding child safety. What I can say is that one of the most important things that we need to have in there is transparency around how the platforms in general keep children under the age of 13 off their systems—transparency on those processes—because we know that Facebook is doing an inadequate job. That is the single biggest lever in terms of child safety.

I have talked to researchers at places like Oxford and they talk about how, with social media, one of the critical windows is when children transition through puberty, because they are more sensitive on issues, they do not have great judgment yet and their lives are changing in really profound ways. Having mandatory transparency on what platforms are doing to keep kids off their platforms, and the ability to push for stronger interventions, is vital, because keeping kids off them until they are at least 13, if not 16, is probably the biggest single thing we can do to move the ball down the field for child safety.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q You say that transparency is so important. Can you give us any specifics about particular areas that should be subject to transparency?

Frances Haugen: Specifically for children or across the whole platform?

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Specifically for children.

Frances Haugen: I will give you an example. Facebook has estimated ages for every single person on the platform, because the reality is that lots of adults also lie about their ages when they join, and advertisers want to target very specific demographics—for example, if you are selling a kit for a 40th birthday, you do not want to mis-target that by 10 years. Facebook has estimated ages for everyone on the platform. It could be required to publish that every year, so that we could say, “Hey, here are the kids on the platform who you currently believe, using your estimated ages, are 14 years old—based not on how old they say they are, but on your estimate that this person is 14 years old. When did they join the platform? What fraction of your 14-year-olds have been on the platform since they were 10?” That is a vital statistic.

If the platforms were required to publish that every single quarter, we could say, “Wow! You were doing really badly four years ago, and you need to get a lot better.” Those kinds of lagging metrics are a way of allowing the public to grade Facebook’s homework, instead of just trusting Facebook to do a good job.

Facebook already does analyses like this today. They already know that on Facebook Blue, for example, for some age cohorts, 20% of 11-year-olds were on the platform—and back then, not that many kids were online. Today, I would guess a much larger fraction of 11-year-olds are on Instagram. We need to have transparency into how badly they are doing their jobs.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Frances, do you think that the Bill needs to set statutory minimum standards for things such as risk assessments and codes of practice? What will a company such as Facebook do without a minimum standard to go by?

Frances Haugen: It is vital to get minimum standards for things such as risk assessments and codes of conduct into the statute. Facebook has demonstrated time and again—the reality is that other social media platforms have too—that it does the bare minimum to avoid really egregious reputational damage. It does not ensure the level of quality needed for public safety. If you do not put that into the Bill, I worry that it will be watered down by the mountains of lobbyists that Facebook will throw at this problem.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Thank you. You alluded earlier to the fact that the Bill contains duties to protect content of democratic importance and journalistic content. What is your view on those measures and their likely effectiveness?

Frances Haugen: I want to reiterate that AI struggles to do even really basic tasks. For example, Facebook’s own document said that it only took down 0.8% of violence-inciting content. Let us look at a much broader category, such as content of democratic importance—if you include that in the Bill, I guarantee you that the platforms will come back to you and say that they have no idea how to implement the Bill. There is no chance that AI will do a good job of identifying content of democratic importance at any point in the next 30 years.

The second question is about carve-outs for media. At a minimum, we need to greatly tighten the standards for what counts as a publication. Right now, I could get together with a friend and start a blog and, as citizen journalists, get the exact same protections as an established, thoughtful, well-staffed publication with an editorial board and other forms of accountability. Time and again, we have seen countries such as Russia use small media outlets as part of their misinformation and disinformation strategies. At a minimum, we need to really tighten that standard.

We have even seen situations where they will use very established publications, such as CNN. They will take an article that says, “Ukrainians destroyed a bunch of Russian tanks,” and intentionally have their bot networks spread that out. They will just paste the link and say, “Russia destroyed a bunch of tanks.” People briefly glance at the snippet, they see the picture of the tank, they see “CNN”, and they think, “Ah, Russia is winning.” We need to remember that even real media outlets can be abused by our enemies to manipulate the public.

Caroline Ansell Portrait Caroline Ansell
- Hansard - - - Excerpts

Q Good afternoon, Frances. I want to ask you about anonymity and striking a balance. We have heard variously that anonymity affords some users safe engagement and actually reduces harm, while for others anonymity has been seen to fuel abuse. How do you see the balance, and how do you see the Bill striving to achieve that?

Frances Haugen: It is important for people to understand what anonymity really is and what it would really mean to have confirmed identities. Platforms already have a huge amount of data on their users. We bleed information about ourselves on to these platforms. It is not about whether the platforms could identify people to the authorities; it is that they choose not to do that.

Secondly, if we did, say, mandate IDs, platforms would have two choices. The first would be to require IDs, so that every single user on their platform would have to have an ID that is verifiable via a computer database—you would have to show your ID and the platform would confirm it off the computer. Platforms would suddenly lose users in many countries around the world that do not have well-integrated computerised databases. The platforms will come back to you and say that they cannot lose a third or half of their users. As long as they are allowed to have users from countries that do not have those levels of sophisticated systems, users in the UK will just use VPNs—a kind of software that allows you to kind of teleport to a different place in the world—and pretend to be users from those other places. Things such as ID verification are not very effective.

Lastly, we need to remember that there is a lot of nuance in things like encryption and anonymity. As a whistleblower, I believe there is a vital need for having access to private communications, but I believe we need to view these things in context. There is a huge difference between, say, Signal, which is open source and anyone in the world can read the code for it—the US Department of Defence only endorses Signal for its employees, because it knows exactly what is being used—and something like Messenger. Messenger is very different, because we have no idea how it actually works. Facebook says, “We use this protocol,” but we cannot see the code; we have no idea. It is the same for Telegram; it is a private company with dubious connections.

If people think that they are safe and anonymous, but they are not actually anonymous, they can put themselves at a lot of risk. The secondary thing is that when we have anonymity in context with more sensitive data—for example, Instagram and Facebook act like directories for finding children—that is a very different context for having anonymity and privacy from something like Signal, where you have to know someone’s phone number in order to contact them.

These things are not cut-and-dried, black-or-white issues. I think it is difficult to have mandatory identity. I think it is really important to have privacy. We have to view them in context.

Caroline Ansell Portrait Caroline Ansell
- Hansard - - - Excerpts

Thank you. That is very helpful.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you for joining us and giving evidence, Frances; it is nice to see you again. We had evidence from Meta, your former employer, on Tuesday, in which its representative suggested that it engages in open and constructive co-operation with researchers. Do you think that testimony was true?

Frances Haugen: I think that shows a commendable level of chutzpah. Researchers have been trying to get really basic datasets out of Facebook for years. When I talk about a basic dataset, it is things as simple as, “Just show us the top 10,000 links that are distributed in any given week.” When you ask for information like that in a country like the United States, no one’s privacy is violated: every one of those links will have been viewed by hundreds of thousands, if not millions of people. Facebook will not give out even basic data like that, even though hundreds if not thousands of academics have begged for this data.

The idea that they have worked in close co-operation with researchers is a farce. The only way that they are going to give us even the most basic data that we need to keep ourselves safe is if it is mandated in the Bill. We need to not wait two years after the Bill passes—and remember, it does not even say that it will happen; Ofcom might say, “Oh, maybe not.” We need to take a page from the Digital Services Act and say, “On the day that the Bill passes, we get access to data,” or, at worst, “Within three months, we are going to figure out how to do it.” It needs to be not, “Should we do it?” but “How will we do it?”

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q When I was asking questions on Tuesday, the representative of Meta made a second claim that raised my eyebrow. He claimed that, in designing its algorithms, it did not primarily seek to optimise for engagement. Do you think that was true?

Frances Haugen: First, I left the company a year ago. Because we have no transparency with these companies, they do not have to publish their algorithms or the consequences of their algorithms, so who knows? Maybe they use astrology now to rank the content. We have no idea. All I know is that Meta definitely still uses signals—did users click on it, did they dwell on it, did they re-share it, or did they put a comment on it? There is no way it is not using those. It is very unlikely that they do not still use engagement in their ranking.

The secondary question is, do they optimise for engagement? Are they trying to maximise it? It is possible that they might interpret that and say, “No, we have multiple things we optimise for,” because that is true. They look at multiple metrics every single time they try to decide whether or not to shift things. But I think it is very likely that they are still trying to optimise for engagement, either as their top metric or as one of their top metrics.

Remember, Meta is not trying to optimise for engagement to keep you there as long as possible; it is optimising for engagement to get you and your friends to produce as much content as possible, because without content production, there can be no content consumption. So that is another thing. They might say, “No, we are optimising for content production, not engagement,” but that is one step off.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q The Bill contains provisions that require companies to do risk assessments that cover their algorithms, and then to be transparent about those risk assessments with Ofcom. Do you think those provisions will deliver the change required in the approach that the companies take?

Frances Haugen: I have a feeling that there is going to be a period of growing pains after the first time these risk assessments happen. I can almost entirely guarantee you that Facebook will try to give you very little. It will likely be a process of back and forth with the regulator, where you are going to have to have very specific standards for the level of transparency, because Facebook is always going to try to give you the least possible.

One of the things that I am actually quite scared about is that, in things like the Digital Services Act, penalties go up to 10% of global profits. Facebook as a company has something like 35% profit margins. One of the things I fear is that these reports may be so damning—we may have such strong opinions after we see the real, hard consequences of what they are doing—that Facebook might say, “This isn’t worth the risk. We’re just going to give you 10% of our profits.” That is one of the things I worry about: that they may just say, “Okay, now we’re 25% profitable instead of 35% profitable. We’re that ashamed.”

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Let me offer a word of reassurance on that. In this Bill, the penalties are up to 10% of global revenue, not profit. Secondly, in relation to the provision of information to Ofcom, there is personal criminal liability for named executives, with a period of incarceration of up to two years, for the reason you mentioned.

Frances Haugen: Oh, good. That’s wonderful.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

We had a case last year where Facebook—it was actually Facebook—failed to provide some information to the CMA in a takeover case, and it paid a £50 million fine rather than provide the information, hence the provision for personal criminal liability for failing to provide information that is now in this Bill.

My final question is a simple one. From your perspective, at the moment, when online tech companies are making product design decisions, what priority do they give to safety versus profit?

Frances Haugen: What I saw when I was at Facebook was that there was a culture that encouraged people to always have the most positive interpretation of things. If things are still the same as when I left—like I said, I do not know; I left last May—what I saw was that people routinely had to weigh little changes in growth against changes in safety metrics, and unless they were major changes in safety metrics, they would continue to pursue growth. The only problem with a strategy like that is that those little deficits add up to very large harms over time, so we must have mandated transparency. The public have to have access to data, because unless Facebook has to account for the public cost of the harm of its products, it is not going to give enough priority to those little incremental harms as they add up.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Thank you very much.

None Portrait The Chair
- Hansard -

Ms Haugen, thank you very much indeed for joining us today, and thank you also for the candour with which you have answered our questions. We are very grateful to you indeed.

The Committee will meet again on Tuesday 7 June at 9.25 am for the start of its line-by-line consideration of the Bill. That session will be in Committee Room 14.

Ordered, That further consideration be now adjourned. —(Steve Double.)

16:58
Adjourned till Tuesday 7 June at twenty-five minutes past Nine o’clock.
Written evidence to be reported to the House
OSB24 The Investment Association
OSB25 Jeremy Peckam
OSB26 Mid-Sized Platform Group
OSB27 Carnegie UK
OSB28 Full Fact
OSB29 Together Association
OSB30 The Christian Institute
OSB31 Clean up the Internet
OSB32 Joint Submission on Children's Amendments on the Online Safety Bill submitted by 5Rights Foundation, NSPCC and Children’s Charities’ Coalition on Internet Safety (CHIS) (and others)
OSB33 Internet Advertising Bureau UK (IAB UK)
OSB33A Annex - IAB UK Digital advertising industry commitment to tackle scam advertising online
OSB34 Victims’ Commissioner
OSB35 The British Psychological Society
OSB36 Paul Wragg
OSB37 Joint submission from Global Encryption Coalition signatories
OSB38 Internet Matters