All 2 Public Bill Committees debates in the Commons on 24th May 2022


Online Safety Bill (First sitting)

The Committee consisted of the following Members:
Chairs: Sir Roger Gale, † Christina Rees
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
† Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Mrs Maria (Basingstoke) (Con)
† Mishra, Navendu (Stockport) (Lab)
† Moore, Damien (Southport) (Con)
Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Russell, Dean (Watford) (Con)
Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Witnesses
Kevin Bakhurst, Group Director, Broadcasting and Online Content, Ofcom
Richard Wronka, Director for Online Harms, Ofcom
Dame Rachel de Souza, Children’s Commissioner, Office of the Children’s Commissioner for England
Andy Burrows, Head of Child Safety, NSPCC
Lynn Perry MBE, CEO, Barnardo’s
Ben Bradley, Government Relations and Public Policy Manager, TikTok
Katy Minshall, Head of UK Public Policy, Twitter
Public Bill Committee
Tuesday 24 May 2022
(Morning)
[Christina Rees in the Chair]
Online Safety Bill
09:25
The Chair

We are now sitting in public and the proceedings are being broadcast. Before we begin, I have a few announcements. Hansard colleagues would be grateful if Members could email their speaking notes to hansardnotes@parliament.uk. Please switch electronic devices to silent. Tea, coffee, and other drinks, apart from the water provided, are not allowed during sittings.

Today, we will first consider the programme motion on the amendment paper. We will then consider a motion to enable the reporting of written evidence for publication, and a motion to allow us to deliberate in private about our questions before the oral evidence session. In view of the timetable available, I hope that we can take these matters formally without debate. I first call the Minister to move the programme motion standing in his name, which was discussed on Thursday by the Programming Sub-Committee for this Bill.

Ordered,

That—

(1) the Committee shall (in addition to its first meeting at 9.25 am on Tuesday 24 May) meet—

(a) at 2.00 pm on Tuesday 24 May;

(b) at 11.30 am and 2.00 pm on Thursday 26 May;

(c) at 9.25 am and 2.00 pm on Tuesday 7 June;

(d) at 11.30 am and 2.00 pm on Thursday 9 June;

(e) at 9.25 am and 2.00 pm on Tuesday 14 June;

(f) at 11.30 am and 2.00 pm on Thursday 16 June;

(g) at 9.25 am and 2.00 pm on Tuesday 21 June;

(h) at 11.30 am and 2.00 pm on Thursday 23 June;

(i) at 9.25 am and 2.00 pm on Tuesday 28 June;

(j) at 11.30 am and 2.00 pm on Thursday 30 June;

(2) the Committee shall hear oral evidence in accordance with the following Table:

Date | Time | Witness
Tuesday 24 May | Until no later than 10.05 am | Ofcom
Tuesday 24 May | Until no later than 10.50 am | Dame Rachel de Souza, Children’s Commissioner for England; Barnardo’s; National Society for the Prevention of Cruelty to Children (NSPCC)
Tuesday 24 May | Until no later than 11.25 am | TikTok; Twitter
Tuesday 24 May | Until no later than 2.45 pm | Meta; Microsoft; Google
Tuesday 24 May | Until no later than 3.30 pm | Professor Clare McGlynn, Professor of Law, Durham University; Refuge; End Violence Against Women
Tuesday 24 May | Until no later than 4.15 pm | techUK; Online Safety Tech Industry Association (OSTIA); Crisp
Tuesday 24 May | Until no later than 5.00 pm | Match Group; Bumble; TrustElevate
Tuesday 24 May | Until no later than 5.30 pm | Marie Collins Foundation; Internet Watch Foundation (IWF)
Tuesday 24 May | Until no later than 6.00 pm | Demos; FairVote
Thursday 26 May | Until no later than 12.15 pm | Catch22; Full Fact; Carnegie UK Trust
Thursday 26 May | Until no later than 1.00 pm | Antisemitism Policy Trust; Clean up the Internet; HOPE not hate
Thursday 26 May | Until no later than 2.25 pm | Information Commissioner’s Office
Thursday 26 May | Until no later than 2.55 pm | Kick It Out; The Football Association
Thursday 26 May | Until no later than 3.25 pm | Center for Countering Digital Hate; Reset
Thursday 26 May | Until no later than 3.55 pm | News Media Association; Guardian Media Group
Thursday 26 May | Until no later than 4.40 pm | Personal Investment Management & Financial Advice Association (PIMFA); Which?; Money Saving Expert
Thursday 26 May | Until no later than 5.05 pm | Frances Haugen


(3) proceedings on consideration of the Bill in Committee shall be taken in the following order: Clauses 1 to 3; Schedules 1 and 2; Clauses 4 to 32; Schedule 3; Clauses 33 to 38; Schedule 4; Clauses 39 to 52; Schedules 5 to 7; Clauses 53 to 64; Schedule 8; Clauses 65 to 67; Schedule 9; Clauses 68 to 80; Schedule 10; Clauses 81 to 91; Schedule 11; Clauses 92 to 122; Schedule 12; Clauses 123 to 158; Schedule 13; Clauses 159 to 161; Schedule 14; Clauses 162 to 194; new Clauses; new Schedules; remaining proceedings on the Bill;

(4) the proceedings shall (so far as not previously concluded) be brought to a conclusion at 5.00 pm on Thursday 30 June.—(Chris Philp.)

The Chair

The Committee will therefore proceed to line-by-line consideration of the Bill on Tuesday 7 June at 9.25 am. I call the Minister to move the motion about written evidence.

Resolved,

That, subject to the discretion of the Chair, any written evidence received by the Committee shall be reported to the House for publication.—(Chris Philp.)

The Chair

Copies of written evidence to the Committee will be made available in the Committee room each day and will be circulated to Members by email. I call the Minister to move the motion about deliberating in private.

Resolved,

That, at this and any subsequent meeting at which oral evidence is to be heard, the Committee shall sit in private until the witnesses are admitted.—(Chris Philp.)

09:27
The Committee deliberated in private.
Examination of Witnesses
Kevin Bakhurst and Richard Wronka gave evidence.
09:30
The Chair

We are now sitting in public again, and the proceedings are being broadcast. Before we start hearing from the witnesses, do any Members wish to make declarations of interest in connection with the Bill?

Alex Davies-Jones (Pontypridd) (Lab)

Danny Stone of the Antisemitism Policy Trust, a witness at Thursday’s sitting, acts in a personal capacity as informal secretariat to the all-party parliamentary group on wrestling, which I co-chair.

The Chair

That is noted.

Dean Russell (Watford) (Con)

I refer Members to my entry in the Register of Members’ Financial Interests regarding work I did six months ago for a business called DMA.

The Chair

We will now hear oral evidence from Kevin Bakhurst, group director of broadcasting and online content at Ofcom, and Richard Wronka, director of Ofcom’s online harms policy. Before calling the first Member to ask a question, I remind all Members that questions should be limited to matters within the scope of the Bill, and we must stick to the timings in the programme motion that the Committee has agreed. For this witness panel, we have until 10.05 am. Could the witnesses please introduce themselves for the record?

Kevin Bakhurst: Good morning. I am Kevin Bakhurst, group director at Ofcom for broadcasting and online content.

Richard Wronka: I am Richard Wronka, a director in Ofcom’s online safety policy team.

The Chair

I will now open the floor for questions. I call Alex Davies-Jones.

Alex Davies-Jones

Q Good morning, both, and welcome to the Committee. The Bill as it stands places responsibility on Ofcom to regulate the 25,000 tech companies and the tens—if not hundreds—of thousands of websites within the UK. How does that look in practice? What technical and administrative capacity do you have to carry that function out, realistically?

Kevin Bakhurst: We should say that we feel the Bill has given us a very good framework to regulate online safety. We have been working closely with the Department for Digital, Culture, Media and Sport to make sure that the Bill gives us a practical, deliverable framework. There is no doubt that it is a challenge. As you rightly say, there will be potentially 25,000 platforms in scope, but we feel that the Bill sets out a series of priorities really clearly in terms of categories.

It is also for us to set out—we will be saying more about this in the next couple of months—how we will approach this, and how we will prioritise certain platforms and types of risk. It is important to say that the only way of achieving online safety is through what the Bill sets out, which is to look at the systems in place at the platforms, and not the individual pieces of content on them, which would be unmanageable.

Alex Davies-Jones

Q Thank you, Kevin. You mentioned the categorisation of platforms. A number of stakeholders, including the platforms themselves and charities, have quite rightly raised some serious concerns around the categorisation of platforms. Would you, the regulator, prefer a risk-based approach, or the categorisation as it stands within the Bill?

Richard Wronka: We completely recognise the concerns that have been raised by stakeholders, and we have been speaking to many of them ourselves, so we have first-hand experience. I think my starting point is that the Bill captures those high-risk services, which is a really important feature of it. In particular, responsibilities around the legal content apply across all services in scope. That means that, in practice, when we are regulating, we will take a risk-based approach to whom we choose to engage with, and to where we focus our effort and attention.

We recognise that some of the debate has been about the categorisation process, which is intended to pick up high-risk and high-reach services. We understand the logic behind that. Indeed, I think we would have some concerns about the workability of an approach that was purely risk-based in its categorisation. We need an approach that we can put into operation. Currently, the Bill focuses on the reach of services and their functionality. We would have some concerns about a purely risk-based approach in terms of whether it was something that we could put into practice, given the number of services in scope.

Alex Davies-Jones

Q May I bring you back to putting this into practice, and to the recategorisation of platforms in practice? If a category 2B platform, as the Bill stands, grows exponentially in size and is spreading disinformation and incredibly harmful content quite quickly, how quickly would you be able to react as a regulator to recategorise that platform and bring it into scope as a category 1 platform? How long would that process take, and what would happen in the interim?

Richard Wronka: At the moment, the category 2B service would have transparency reporting requirements. That would be helpful, because it would be one way that the nature of harmful content on that platform could be brought to our attention, and to the public’s attention. We would also be looking at approaches that we could use to monitor the whole scope of the services, to ensure that we had a good grip of who was growing quickest and where the areas of risk were. Some of that is through engaging with the platforms themselves and a whole range of stakeholders, and some of it is through more advanced data and analytical techniques—“supervision technology”, as it is known in the regulatory jargon.

On the specifics of your question, if a company was growing very quickly, the Bill gives us the ability to look at that company again, to ask it for information to support a categorisation decision, and to recategorise it if that is the right approach and if it has met the thresholds set out by the Secretary of State. One of the thresholds regards the number of users, so if a company has moved over that threshold, we look to act as quickly as possible while running a robust regulatory process.

Alex Davies-Jones

Q So while that process is under way, there is no mechanism for you to take action against the platform.

Kevin Bakhurst: May I answer this? We have some experience of this already in the video-sharing platform regime, which is much more limited in scope, and we are already regulating a number of platforms, ranging from some very big ones such as Twitch, TikTok and Snap, down to some much smaller platforms that have caused us some concerns. We think we have the tools, but part of our approach will also be to focus on high-risk and high-impact content, even if it comes through small platforms. That is what we have already done with the video-sharing platform regime. We have to be agile enough to capture that and to move resources to it. We are doing that already with the video-sharing platform regime, even though we have only been regulating it for less than a year.

The Chair

Maria Miller has indicated that she would like to ask a question, so if I may, I will bring her in.

Mrs Maria Miller (Basingstoke) (Con)

Not immediately—go on, please.

Alex Davies-Jones

Q Thank you, Chair, and thank you, Maria.

I am just trying to get to the intricacies of this, and of what would happen during the time that it would take for you to recategorise. This platform, which is disseminating harm to both children and adults, would be allowed to carry on while the recategorisation process is under way. There is no mechanism in the Bill to stop that from happening.

Richard Wronka: A really important point here is that we will be regulating that platform from the outset for illegal content and, potentially, for how it protects children on its platform, irrespective of the categorisation approach. That is really important. We will be able to take action, and take action quickly, irrespective of how the platform is categorised. Categorisation really determines whether the adult “legal but harmful” provisions apply. That is the bit that really matters in this context.

It is worth reminding ourselves what those provisions mean: they are more a transparency and accountability measure. Platforms categorised as category 1 will need to have clear terms and conditions applied to adult “legal but harmful” content, and they will need to implement those consistently. We would expect the really serious and egregious concerns to be picked up by the “illegal” part of the regime, and the protection-of-children part of the regime. The categorisation process may go on, and it may take a little time, but we will have tools to act in those situations.

Alex Davies-Jones

Q May I bring you on to the powers of the Secretary of State and the question of the regulator’s independence? The Bill will see the Secretary of State, whoever that may be, have a huge amount of personal direction over Ofcom. Do you have any other experience of being directed by a Secretary of State in this way, and what are the consequences of such an approach?

Kevin Bakhurst: We do have some experience across the various sectors that we regulate, but being directed by the Secretary of State does not happen very often. Specifically on the Bill, our strong feeling is that we think it entirely appropriate, and that the Secretary of State should be able to direct us on matters of national security and terrorist content. However, we have some concerns about the wider direction powers of the Secretary of State, and particularly the grounds on which the Secretary of State can direct public policy, and we have expressed those concerns previously.

We feel it is important that the independence of a regulator can be seen to be there and is there in practice. Legally, we feel it important that there is accountability. We have some experience of being taken to judicial review, and there must be accountability for the codes of practice that we put in place. We must be able to show why and how we have created those codes of practice, so that we can be accountable and there is absolute clarity between regulator and Government.

Mrs Miller

Q Thank you very much to the witnesses who have taken the time to be with us today. We are really grateful. You have already alluded to the fact that you have quite extensive experience in regulation, even in social media spaces. I think the Committee would be really interested in your view, based on your experience, about what is not in the Bill that should be.

Kevin Bakhurst: Richard has been leading this process, so he can give more detail on it, but suffice to say, we have been engaging closely with DCMS over the last year or so, and we appreciate the fact that it has taken on board a number of our concerns. What we felt we needed from the Bill was clarity as far as possible, and a balance between clarity and flexibility for this regime, which is a very fast-moving field. We feel, by and large, that the Bill has achieved that.

We still have concerns about one or two areas, to pick up on your question. We feel it is really important—hopefully this is something the Committee can contribute to—that the definition of “illegal content” is really clear for platforms, and particularly the area of intent of illegality, which at the moment might be quite tricky for the platforms to pick up on.

Richard Wronka: I completely agree with Kevin that the Bill as it stands gives us a good framework. I think the pre-legislative scrutiny process has been really helpful in getting us there, and I point out that it is already quite a broad and complex regime. We welcome the introduction of issues such as fraudulent advertising and the regulation of commercial pornographic providers, but I think there is a point about ensuring that the Bill does not expand too much further, because that might raise some practical and operational issues for us.

I completely agree with Kevin that clarity in the Bill regarding illegal content and what constitutes that is really important. An additional area that requires clarity is around some of the complex definitions in the Bill, such as journalistic content and democratically important content. Those are inherently tricky issues, but any extra clarity that Parliament can provide in those areas would be welcome.

Mrs Miller

Q You talk about illegal content and that Ofcom would not have a view on particular laws, but do you think there are harmful areas of content that are not currently covered by the law? I am thinking particularly about the issue of intimate image abuse, which is currently under Law Commission review, with recommendations expected very soon. Have you had any thoughts, particularly in the area of policy, about how you deal with issues that should be against the law but currently are not, given that part of your regulatory process is to determine whether companies are operating within the law?

Richard Wronka: I would start by saying that this is a fluid area. We have had a number of conversations with the Law Commission in particular and with other stakeholders, which has been really helpful. We recognise that the Bill includes four new offences, so there is already some fluidity in this space. We are aware that there are other Law Commission proposals that the Government are considering. Incitement to self-harm and flashing imagery that might trigger epilepsy are a couple of issues that come to mind there. Ultimately, where the criminal law sits is a matter for Parliament. We are a regulator: our role here is to make sure that the criminal law is reflected in the regulatory regime properly, rather than to determine or offer a view on where the criminal law should sit. Linking back to our point just a minute ago, we think it is really important that there is as much clarity as possible about how platforms can take some of those potentially quite tricky decisions about whether content meets the criminal threshold.

Mrs Miller

Q May I press a little further? The four new offences that you talked about, and others, and just the whole approach of regulation will lead more individuals to seek redress and support. You are not responsible for individuals; you are responsible for regulation, but you must have some thoughts on whether the current system of victim support will cope with the changes in the law and the new regulatory process. What might you want to see put in place to ensure that those victims are not all landing at your door, erroneously thinking that Ofcom will provide them with individual redress? Do you have any thoughts on that?

Kevin Bakhurst: One area that is very important and which is in the Bill and one of our responsibilities is to make sure there is a sufficiently robust and reactive complaints process from the platforms—one that people feel they can complain to and be heard—and an appeals process. We feel that that is in the Bill. We already receive complaints at Ofcom from people who have issues about platforms and who have gone to the platforms but do not feel their complaints have been properly dealt with or recognised. That is within the video-sharing platform regime. Those individual complaints, although we are not going to be very specific in looking at individual pieces of material per se, are very useful to alert us where there are issues around particular types of offence or harm that the platforms are not seen to be dealing with properly. It will be a really important part of the regime to make sure that platforms provide a complaints process that is easy to navigate and that people can use quite quickly and accessibly.

Richard Wronka: An additional point I would make, building on that, is that this is a really complex ecosystem. We understand that and have spent a lot of the last two or three years trying to get to grips with that complex ecosystem and building relationships with other participants in the ecosystem. It brings in law enforcement, other regulators, and organisations that support victims of crime or online abuse. We will need to find effective ways to work with those organisations. Ultimately, we are a regulator, so there is a limit to what we can do. It is important that those other organisations are able to operate effectively, but that is perhaps slightly outside our role.

Barbara Keeley (Worsley and Eccles South) (Lab)

Q To what extent do you think services should publish publicly the transparency and risk assessments that they will be providing to Ofcom?

Richard Wronka: I think our starting point here is that we think transparency is a really important principle within the regime—a fundamental principle. There are specific provisions in the Bill that speak to that, but more generally we are looking for this regime to usher in a new era of transparency across the tech sector, so that users and other participants in this process can be clearer about what platforms are doing at the moment, how effective that is and what more might be done in the future. That is something that will be a guiding principle for us as we pick up regulation.

Specifically, the Bill provides for transparency reports. Not all services in scope will need to provide transparency reports, but category 1 and 2 services will be required to produce annual transparency reports. We think that is really important. At the moment, risk assessments are not intended to be published—that is not provided for in the Bill—but the transparency reports will show the effectiveness of the systems and processes that those platforms have put in place.

Barbara Keeley

Q That was to be my next question: do you think it is an issue that category 1 services will not have to publish child risk assessments? It seems to me that it would be better if they did.

Richard Wronka: What is important for us as a regulator is that we are able to access those risk assessments; for the biggest services, the category 1 services, we would expect to do that routinely through a supervisory approach. We might even do that proactively, or where services have come to us for dialogue around those—

Barbara Keeley

Q But would it not improve transparency if they did have to publish them? Why would they not want to publish them?

Richard Wronka: Some services may wish to publish their risk assessments. There is nothing in the Bill or in our regulatory approach that would prevent that, but I do not see a requirement in the Bill to do it. Some services may have concerns about the level of confidential information in there. The important point for us is that we have access to those risk assessments.

Kevin Bakhurst: Picking up on the risk assessments, it is a tricky question because we would expect those assessments to be very comprehensive and to deal with issues such as how algorithms function, and so on. There is a balance between transparency, which, as Richard says, we will drive across the regime—to address information that can harm, or people who are trying to behave badly online or to game the system—and what the regulator needs in practical terms. I am sure the platforms will be able to talk to you more about that.

Barbara Keeley

Q May I ask some follow-up questions about resources and timing once the Bill has gone through? You said you are going to open a new digital and technology hub in Manchester, with the creation of 150 jobs. I have a couple of questions on that. Do you think that what is set out in the proposal will be enough? Will you have the resources to carry out the duties set out in the Bill? This is a follow-up point from my colleague’s question earlier.

There is also a question of timing. The reports suggested that the new hub and jobs will come into play in 2025. I am sure that everyone here wants to see the Bill taking effect sooner. Ofcom will need to do a lot of reviews and reporting in the first year after the Bill receives Royal Assent. How will that be possible if people are not in post until 2025?

Kevin Bakhurst: They are both big questions. I will take the first part and maybe Richard can take the second one about the timing. On the resourcing, it is important to say publicly that we feel strongly that, very unusually, we have had funding from Government to prepare for this regime. I know how unusual that is; I was at a meeting with the European regulators last week, and we are almost unique in that we have had funding and in the level of funding that we have had.

The funding has meant that we are already well advanced in our preparations. We have a team of around 150 people working on online safety across the organisation. A number are in Manchester, but some are in London or in our other offices around the UK. It is important to say that that funding has helped us to get off to a really strong start in recruiting people across the piece—not just policy people. Importantly, we have set up a new digital function within Ofcom and recruited a new chief technology officer, who came from Amazon Alexa, to head up that function.

The funding has allowed us to really push hard into this space, which is not easy, and to recruit some of the skills we feel we need to deliver this regime as effectively and rapidly as possible. I know that resourcing is not a matter within the Bill; it is a separate Treasury matter. Going forward though, we feel that, in the plans, we have sufficient resourcing to deliver what we are being asked to deliver. The team will probably double in size by the time we actually go live with the regime. It is a significant number of people.

Some significant new duties have been added in, such as fraudulent advertising, which we need to think carefully about. That is an important priority for us. It requires a different skillset. It was not in the original funding plan. If there are significant changes to the Bill, it is important that we remain alive to having the right people and the right number of people in place while trying to deliver with maximum efficiency. Do you want to talk about timing, Richard?

Richard Wronka: All I would add to that, Kevin, is that we are looking to front-load our recruitment so that we are ready to deliver on the Bill’s requirements as quickly as possible once it receives Royal Assent and our powers commence. That is the driving motivation for us. In many cases, that means recruiting people right now, in addition to the people we have already recruited to help with this.

Clearly there is a bit of a gating process for the Bill, so we will need a settled legislative framework and settled priority areas before we can get on with the consultation process. We will look to run that consultation process as swiftly as possible once we have those powers in place. We know that some stakeholders are very keen to see the Bill in place and others are less enthusiastic, so we need to run a robust process that will stand the test of time.

The Bill itself points us towards a phased process. Thanks to the introduction of priority illegal content in the Bill, with those priority areas, we think illegal content is the area on which we can make the quickest progress as soon as the Bill achieves Royal Assent.

The Chair

Thank you. I intend to bring in the Minister at about 10 o’clock. Kirsty Blackman, Kim Leadbeater and Dean Russell have indicated that they wish to ask questions, so let us try to keep to time.

Kirsty Blackman (Aberdeen North) (SNP)

Q I have a few questions, but I will ask them in a short way, and hopefully the witnesses can answer them in a fairly short way too. The chief executive of Ofcom told the Joint Committee on the draft Bill that the Secretary of State’s powers were extremely open ended. You have already touched on this, but do you feel that this will impede Ofcom’s independence as a regulator?

Kevin Bakhurst: There is a particular area that concerns us: the power for the Secretary of State to direct us on codes for reasons of public policy. That concern is more about practicality than independence, but clearly for the platforms, with which we have had a lot of discussions, the independence of a regulator in a regime that is essentially about content is absolutely critical, and it is a priority for us to show that we are independent.

Kirsty Blackman

Q Do you feel that the Bill will allow you to adequately regulate online gaming, which is how an awful lot of young people use the internet, in a way that will keep them safer than they currently are?

Richard Wronka: Yes, we fully anticipate that gaming services, and particularly the messaging functionality that is often integrated into those services, will be captured within the scope of the regime. We do think that the Bill, on the whole, gives us the right tools to regulate those services.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q My last question is about future-proofing the Bill. Obviously, an awful lot of things will happen in the online world that do not currently happen there, and some of those we cannot foresee. Do you think the Bill is wide enough and flexible enough to allow changes to be made so that new and emerging platforms can be regulated?

Kevin Bakhurst: Overall, we feel that it is. By and large, the balance between certainty and flexibility in the Bill is probably about right and will allow some flexibility in future, but it is very hard to predict what other harms may emerge. We will remain as flexible as possible.

Richard Wronka: There are some really important updating tools in the Bill. The ability for the Secretary of State to introduce new priority harms or offences—with the approval of Parliament, of course—is really important.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - - - Excerpts

Q Ofcom is required to produce certain codes, for example on terrorism, but others that were floated in the Green Paper are no longer in the Bill. Are you working on such codes, for example on hate crime and wider harm, and if not, what happens in the meantime? I guess that links to my concerns about the democratic importance and journalistic content provisions in the Bill, to which you have alluded. They are very vague protections and I am concerned that they could be exploited by extremists who suddenly want to identify as a journalist or a political candidate. Could you say a little about the codes and about those two particular clauses and what more you think we could do to help you with those?

Richard Wronka: I will cover the codes first. You are absolutely right that the Bill requires Ofcom to publish codes of practice, particularly on CSEA and on terror, as well as on fraudulent advertising and other areas. We are doing the work right now so that we are ready to progress with that process as soon as we get powers and duties, because it is really important that we are ready to move as quickly as possible. We will set out further detail on exactly how we plan to do that in a roadmap document that we are looking to publish before the summer break, so that will provide some of the detail.

A really important point here is that the Bill quite rightly covers a wide set of harms. We are mindful of the fact that a code that tries to cover every single harm could be counterproductive and confusing for platforms, even for those that want to comply and do the right thing. One of the balancing acts for us as we produce that code framework will be to get the right coverage for all the issues that everyone is rightly concerned about, but to do that in a way that is streamlined and efficient, so that services can apply the provisions of those codes.

Kevin Bakhurst: Shall I pick up on the second bit very quickly? I think you are right; this is one of our central concerns about the definitions. As far as possible, this should be a matter for Parliament. It is really important that Parliament has a view on this. Ultimately, the regulator will take a view based on what Parliament says. We have some experience in this area, but as Richard said, we recognise the challenge—it is extremely complex. We can see the policy intent of doing it, quite rightly, and the importance of enshrining freedom of expression as far as possible, but Parliament can help to add clarity and, as you rightly say, be aware of some of the potential loopholes. At the moment, someone could describe themselves as a citizen journalist; where does that leave us? I am not quite sure. Parliament could help to clarify that, and we would be grateful.

Dean Russell Portrait Dean Russell
- Hansard - - - Excerpts

Q Do the powers in the Bill cover enough to ensure that people will not be sent flashing images if they have photosensitive epilepsy?

Richard Wronka: This picks up the point we discussed earlier, which is that I understand that the Government are considering proposals from the Law Commission to criminalise the sending of those kinds of images. It would not be covered by the illegal content duties as things stand, but if the Government conclude that it is right to criminalise those issues, it would automatically be picked up by the Bill.

Even so, the regime is not, on the whole, going to be able to pick up every instance of harm. It is about making sure that platforms have the right systems and processes. Where there is clear harm to individuals, we would expect those processes to be robust. We know there is work going on in the industry on that particular issue to try and drive forward those processes.

Dean Russell Portrait Dean Russell
- Hansard - - - Excerpts

Q But as the Bill stands, there is a very clear point about stopping harmful content being sent to people, so I imagine that would cover it at least in that sense, would it not?

Kevin Bakhurst: This is a really important point, which Richard just tried to make. The Bill gives us a great range of tools to try and prevent harm as far as possible; I just think we need to get expectations right here. Unfortunately, this Bill will not result in no harm of any type, just because of the nature of the internet and the task that we face. We are ambitious about driving constant improvement and stopping and addressing the main harms, but it is not going to stop every particular harm. We will absolutely focus on the ones that have a significant impact, but unfortunately that is the nature of the web.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - - - Excerpts

Q Just to continue the point made by my colleague, you are right to say that Ministry of Justice colleagues are considering the flashing image offence as a separate matter. But would you agree that clause 150, on harmful communications, does criminalise and therefore place into the scope of the Bill communications intended to cause harm to a “likely audience” where such harm is

“psychological harm amounting to serious distress”?

Therefore, sending somebody a flashing image with the intention of inducing an epileptic fit would likely be caught under this new harmful communications offence in clause 150, even before a separate future offence that may be introduced.

Richard Wronka: I think we can certainly understand the argument. I think it is important that the Bill is as clear as possible. Ultimately, it is for the courts to decide whether that offence would pick up these kinds of issues that we are talking about around flashing imagery.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I would suggest that the definition in clause 150 would cover epilepsy trolling.

You mentioned that you met recently with European regulators. Briefly, because we are short of time, were there any particular messages, lessons or insights you picked up in those meetings that might be of interest to the Committee?

Kevin Bakhurst: Yes, there were a number, and liaising with European regulators and other global regulators in this space is a really important strand of our work. It is often said that this regime is a first globally. I think that is true. This is the most comprehensive regime, and it is therefore potentially quite challenging for the regulator. That is widely recognised.

The second thing I would say is that there was absolute recognition of how advanced we are in terms of the recruitment of teams, which I touched on before, because we have had the funding available to do it. There are many countries around Europe that have recruited between zero and 10 and are imminently going to take on some of these responsibilities under the Digital Services Act, so I think they are quite jealous.

The last thing is that we see continued collaboration with other regulators around the world as a really important strand, and we welcome the information-sharing powers that are in the Bill. There are some parallels, and we want to take similar approaches on areas such as transparency, where we can collaborate and work together. I think it is important—

None Portrait The Chair
- Hansard -

Order. I am afraid we have come to the end of the allotted time for questions. On behalf of the Committee, I thank our witnesses for their evidence.

Examination of Witnesses

Dame Rachel de Souza, Lynn Perry MBE and Andy Burrows gave evidence.

10:05
None Portrait The Chair
- Hansard -

We will now hear from the Children’s Commissioner, Dame Rachel de Souza; Lynn Perry, chief executive officer of Barnardo’s, who will be appearing via Zoom; and Andy Burrows, head of child safety at the National Society for the Prevention of Cruelty to Children. Could the new witnesses take their places, please?

We have until 10.50 am for this panel. Could the witnesses please introduce themselves for the record? We will take the witnesses in the room first.

Andy Burrows: I am Andy Burrows, the head of online safety policy at the NSPCC.

Dame Rachel de Souza: I am Rachel de Souza, Children’s Commissioner for England.

None Portrait The Chair
- Hansard -

And on the screen—[Interruption.] Uh-oh, it has frozen. We will have to come back to that. We will take evidence from the witnesses in the room until we have sorted out the problem with the screen.

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q Do you think there is enough in the Bill to make sure that the voices of children at risk of online harms are heard? There is a super-complaints mechanism, but do you think it goes far enough for children, and are you confident that the regime will be able to quickly respond to new and emerging harms to children? Could Andy Burrows start?

Andy Burrows: Thank you for the question. We think that more could be built into the Bill to ensure that children’s needs and voices can be fed into the regime.

One of the things that the NSPCC would particularly like to see is provision for statutory user advocacy arrangements, drawing on the examples that we see in multiple other regulated sectors, where we have a model by which the levy on the firms that will cover the costs of the direct regulation also provides for funded user advocacy arrangements that can serve as a source of expertise, setting out children’s needs and experiences.

A comparison here would be the role that Citizens Advice plays in the energy and postal markets as the user voice and champion. We think that would be really important in bolstering the regulatory settlement. That can also help to provide an early warning function—particularly in a sector that is characterised by very rapid technological and market change—to identify new and emerging harms, and bolster and support the regulator in that activity. That, for us, feels like a crucial part of this jigsaw.

Given the very welcome systemic approach of the regime, that early warning function is particularly important, because there is the potential that if harms cannot be identified quickly, we will see a lag where whole regulatory cycles are missed. User advocacy can help to plug that gap, meaning that harms are identified at an earlier stage, and then the positive design of the process, with the risk profiles and company risk assessments, means that those harms can be built into that particular cycle.

Dame Rachel de Souza: I was very pleased when the Government asked me, when I came into the role, to look at what more could be done to keep children safe online and to make sure that their voices went right through the passage of the Bill. I am committed to doing that. Obviously, as Children’s Commissioner, my role is to elevate children’s voices. I was really pleased to convene a large number of charities, internet safety organisations and violence against women and girls experts in a joint briefing to MPs to try to get children’s voices over.

I worry that the Bill does not do enough to respond to individual cases of abuse and that it needs to do more to understand issues and concerns directly from children. Children should not have to exhaust the platforms’ ineffective complaints routes, which can take days, weeks or even months. I have just conducted a survey of 2,000 children and asked them about their experiences in the past month. Of those 2,000 children, 50% had seen harmful content and 40% had tried to get content about themselves removed and had not succeeded. For me, there is something really important about listening to children and taking their complaints into account. I know you have a busy day, but that is the key point that I want to get across.

None Portrait The Chair
- Hansard -

Lynn Perry is back on the screen—welcome. Would you like to introduce yourself for the record and then answer the question? [Interruption.] Oh, she has gone again. Apparently the problem is at Lynn’s end, so we will just have to live with it; there is nothing we can do on this side.

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q Is the Bill future-proof? If you think it is not, how can we ensure that it is responsive to future risks and harms?

Andy Burrows: The systemic regime is important. That will help to ensure that the regime can be future-proofed; clearly, it is important that we are not introducing a set of proposals and then casting them in aspic. But there are ways that the Bill could be more strongly future-proofed, and that links to ensuring that the regime can effectively map on to the dynamics of the child sexual abuse problem in particular.

Let me give a couple of examples of where we think the Bill could be bolstered. One is around placing a duty on companies to consider the cross-platform nature of harm when performing their risk assessment functions, and having a broad, overarching duty to ask companies to work together to tackle the child sexual abuse threat. That is very important in terms of the current dynamics of the problem. We see, for example, very well-established grooming pathways, where abusers will look to exploit the design features of open social networks, such as on Instagram or Snapchat, before moving children and abuse on to perhaps live-streaming sites or encrypted messaging sites.

The cross-platform nature of the threat is only going to intensify in the years ahead as we start to look towards the metaverse, for example. It is clear that the metaverse will be built on the basis of being cross-platform and interdependent in nature. We can also see the potential for unintended consequences from other regulatory regimes. For example, the Digital Markets Act recently passed by the EU has provisions for interoperability. That effectively means that if I wanted to send you a message on platform A, you could receive it on platform B. There is a potential unintended consequence there that needs to be mitigated; we need to ensure that there is a responsibility to address the harm potential that could come from more interoperable services.

This is a significant area where the Bill really can be bolstered to address the current dynamics of the problem and ensure that legislation is as effective as it possibly can be. Looking to the medium to long term, it is crucial to ensure that we have arrangements that are commensurate to the changing nature of technology and the threats that will emerge from that.

Dame Rachel de Souza: A simple answer from me: of course we cannot future-proof it completely, because of the changing nature of online harms and technology. I talked to a large number of 16 to 21-year-olds about what they wished their parents had known about technology and what they had needed to keep them safe, and they listed a range of things. No. 1 was age assurance—they absolutely wanted good age assurance.

However, the list of harms and things they were coming across—cyber-flashing and all this—is very much set in time. It is really important that we deal with those things, but they are going to evolve and change. That is why we have to build in really good cross-platform work, which we have been talking about. We need these tech companies to work together to be able to stay live to the issues. We also need to make sure that we build in proper advocacy and listen to children and deal with the issues that come up, and that the Bill is flexible enough to be able to grow in that way. Any list is going to get timed out. We need to recognise that these harms are there and that they will change.

None Portrait The Chair
- Hansard -

I will bring in Kim Leadbeater and then Maria Miller and Kirsty Blackman, but I will definitely bring in the Minister at 10.45 am.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Thank you, Ms Rees, and thank you to the witnesses. Many websites host pornography without necessarily being pornographic websites, meaning that children can easily stumble across it. Does the Bill do enough to tackle pornography when it is hosted on mainstream websites?

Dame Rachel de Souza: I have argued hard to get pornographic sites brought into the Bill. That is something very positive about the Bill, and I was really pleased to see that. Why? I have surveyed more than half a million children in my Big Ask survey and spoken recently to 2,000 children specifically about this issue. They are seeing pornography, mainly on social media sites—Twitter and other sites. We know the negative effects of that, and it is a major concern.

I am pleased to see that age assurance is in the Bill. We need to challenge the social media companies—I pull them together and meet them every six months—on getting this stuff off their sites and making sure that under-age children are not on their sites seeing some of these things. You cannot go hard enough in challenging the social media companies to get pornography off their sites and away from children.

Andy Burrows: Just to add to that, I would absolutely echo that we are delighted that part 5 of the Bill, with measures around commercial pornography, has been introduced. One of our outstanding areas of concern, which applies to pornography but also more broadly, is around clause 26, the children’s access assessment, where the child safety duties will apply not to all services but to services where there is a significant number of child users or children comprise a significant part of the user base. That would seem to open the door to some small and also problematic services being out of scope. We have expressed concerns previously about whether OnlyFans, for example, which is a very significant problem as a user-generated site with adult content, could be out of scope. Those are concerns that I know the Digital, Culture, Media and Sport Committee has recognised as well. We would very much like to see clause 26 removed from the Bill, which would ensure that we have a really comprehensive package in this legislation that tackles both commercial pornography and user-generated material.

None Portrait The Chair
- Hansard -

I think Lynn Perry is back. Are you with us, Lynn? [Interruption.] No—okay. We will move on to Maria Miller.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q I have a question for the Children’s Commissioner. You talked just now about doing more on the advocacy of individual cases. I asked a question of Ofcom in the first panel about the issue of support for victims. Its response was that complaints processes will be part of what it will regulate. Do you think that will be enough to answer your concerns, or are you expecting more than simply ensuring that platforms do what they should do?

Dame Rachel de Souza: I absolutely think that we need to look at independent advocacy and go further. I do not think the Bill does enough to respond to individual cases of abuse and to understand issues and concerns directly from children. Children should not have to exhaust platforms’ ineffective complaints routes. It can take days, weeks, months. Even a few minutes or hours of a nude image being shared online can be enormously traumatising for children.

That should inform Ofcom’s policies and regulation. As we know, the risks and harms of the online world are changing constantly. It serves a useful purpose as an early warning mechanism within online safety regulation. I would like to see independent advocacy that allows a proper representation service for children. We need to hear from children directly, and I would like to see the Bill go further on this.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q Is there capacity in the sector to deliver what you are talking about?

Dame Rachel de Souza: I think we need to make capacity. There is some—the NSPCC has its Childline and, as Children’s Commissioner, I have my own advocacy service for children in care. I think this should function in that way, with direct access. So I think that we can create it.

Andy Burrows: May I come in briefly? Our proposals for user advocacy reflect the clear “polluter pays” principle that we think should apply here, to help build and scale up that capacity, but the levy that is covering the direct cost of regulation should also provide really effective user advocacy. That is really important not only to help to give victims what they need in frontline services, but in ensuring that there is a strong counterbalance to some of the largest companies in the world for our sector, which has clear ambition but self-evident constraints.

Dame Rachel de Souza: One of the concerns that has come to me from children—I am talking about hundreds of thousands of children—over the past year is that there is not strong enough advocacy for them and that their complaints are not being met. Girls in particular, following the Everyone’s Invited concerns, have tried so hard to get images down. There is this almost medieval bait-out practice of girls’ images being shared right across platforms. It is horrendous, and the tech firms are not acting quickly enough to get those down. We need proper advocacy and support for children, and I think that they would expect that of us in this groundbreaking Bill.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q There has not been a huge amount of discussion of online gaming in the context of the Bill, despite the fact that for many young people that is the way in which they interact with other people online. Do you think the Bill covers online gaming adequately? A lot of interaction in online gaming is through oral communication—voice chat messages. Do you think that it is possible to properly regulate oral communications in gaming?

Dame Rachel de Souza: Good question. I applaud the Bill for what it does cover. We are looking at a Bill that, for the first time, is going to start protecting children’s rights online, so I am really pleased to see that. We have looked a bit at gaming in the past. In terms of harms, obviously the Bill does not cover gaming in full, but it does cover the safety aspects of children’s experience.

It is always good for us to be looking further. Gaming, we know, has some extremely harmful and individualistic issues with it, particularly around money and the profile of potential grooming and safety. In terms of communications, one of the reasons that I am so concerned about encryption and communications online is that it happens through gaming. We need to make sure that those elements are really firm.

Andy Burrows: It is vitally important that the gaming sector is in scope. We know that there are high-risk gaming sites—for example, Twitch—and gaming-adjacent services such as Discord. To go back to my earlier point about the need for cross-platform provisions to apply here, in gaming we can see grooming pathways that can take on a different character from those on social networks, for example, where we might see abuse pathways where that grooming is taking place at the same time, rather than sequentially from a gaming streaming service, say, to a gaming-adjacent platform such as Discord. I think it is very important that a regulator is equipped to understand the dynamics of the harms and how they will perhaps apply differently on gaming services. That is a very strong and important argument for user advocacy.

I would say a couple of things on oral communications. One-to-one oral communications are excluded from the Bill’s scope—legitimately—but we should recognise that there is a grooming risk there, particularly when that communication is embedded in a platform of wider functionality. There is an argument for a platform to consider all aspects of its functionality within the risk assessment process. Proactive scanning is a different issue.

There is a broader challenge for the Bill, and this takes us back to the fundamental objectives and the very welcome design based around systemic risk identification and mitigation. We know that right now, in respect of oral communications and livestream communications, the industry response is not as developed in terms of detecting and disrupting harm as it is for, say, text-based chat. In keeping with the risk assessment process, it should be clear that if platforms want to offer that functionality, they should have to demonstrate through the risk assessment process that they have high-quality, effective arrangements in place to detect and disrupt harm, and that should be the price of admission. If companies cannot demonstrate that, they should not be offering their services, because there is a high risk to children.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q Do you think it would be reasonable for gaming companies in particular to have a setting whereby children or young people can choose to interact only with people in their friends list? Would that be helpful?

Andy Burrows: I think that aspect is certainly worthy of consideration, because the key objective is that platforms should be incentivised to deliver safety by design initiatives. One area in the Bill that we would like to be amended is the user empowerment mechanism. That gives adults the ability to screen out anonymous accounts, for example, but those provisions do not apply to children. Some of those design features that introduce friction to the user experience are really important to help children, and indeed parents, have greater ownership of their experience.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q Finally, could you explain breadcrumbing a little further? What does it mean and does the Bill tackle it adequately?

Andy Burrows: Child abuse breadcrumbing is a major area of concern for us. The term captures a range of techniques whereby abusers are able to use social networks to facilitate the discovery and the dissemination of child sexual abuse. The activity does not meet the criminal threshold in and of itself, but it effectively enables abusers to use online services as a shop window to advertise their sexual interest in children.

I will give a couple of fairly chilling examples of what I mean by that. There is a phenomenon called “tribute sites”. Abusers open social media accounts in the guise of well-known survivors of child sexual abuse. To all of us in this room, that would look perfectly innocuous, but if you are an abuser, the purpose of those accounts is very clear. In the first quarter of last year, those types of accounts received 6 million interactions.

Another example is Facebook groups. We have seen evidence of Facebook refusing to take down groups that have a common interest in, for example, children celebrating their 8th, 9th and 10th birthdays. That is barely disguised at all; we can all see what the purpose is. Indeed, Facebook’s algorithms can see the purpose there, because research has shown that, within a couple of hours of use of the service, the algorithms identify the common characteristic of interest, which is child sexual abuse, and then start recommending accounts in multiple other languages.

We are talking about a significant way in which abusers are able to organise abuse and migrate it to encrypted chat platforms, to the dark web, and to offender fora, where it is, by definition, much harder to catch that activity, which happens after harm has occurred—after child abuse images have been circulated. We really want breadcrumbing to be brought unambiguously into the scope of the Bill. That would close off tens of millions of interactions with accounts that go on to enable abusers to discover and disseminate material and to form offender networks.

We have had some good, constructive relationships with the Home Office in recent weeks. I know that the Home Office is keen to explore how this area can be addressed, and it is vital that it is addressed. If we are going to see the Bill deliver the objective of securing a really effective upstream response, which I think is the clear legislative ambition, this is an area where we really need to see the Bill be amended.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q You mostly talked about Facebook. Is it mostly the largest social media platforms, or are we talking about some of the smaller ones, such as Discord, which you mentioned? Would you like to see those in scope as well, or is it just the very biggest ones?

Andy Burrows: Those provisions should apply broadly, but it is a problem that we see particularly on those large sites because of the scale and the potential for algorithmic amplification.

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q I want to ask about the many tragic cases of teenagers who have died by suicide after viewing self-harm material online. Do you think coroners have sufficient powers to access digital data after the death of a child, and should parents have the right to access their children’s digital data following their death?

Dame Rachel de Souza: Baroness Kidron has done some fantastic work on this, and I really support her work. I want to tell you why. I am a former headteacher—I worked for 30 years in schools as a teacher and headteacher. Only in the last five or six years did I start seeing suicides of children and teenagers; I did not see them before. In the year just before I came to be Children’s Commissioner, there was a case of a year 11 girl from a vulnerable family who had a relationship with a boy, and it went all over the social media sites. She looked up self-harm material, went out to the woods and killed herself. She left a note that basically said, “So there. Look what you’ve done.”

It was just horrendous, having to pick up the family and the community of children around her, and seeing the long-term effects of it on her siblings. We did not see things like that before. I am fully supportive of Baroness Kidron and 5Rights campaigning on this issue. It is shocking to read about the enormous waiting and wrangling that parents must go through just to get their children’s information. It is absolutely shocking. I think that is enough from me.

Andy Burrows: I absolutely agree. One of the things we see at the NSPCC is the impact on parents and families in these situations. I think of Ian Russell, whose daughter Molly took her own life, and the extraordinarily protracted process it has taken to get companies to hand over her information. I think of the anguish and heartbreak that comes with this process. The Bill is a fantastic mechanism to be able to redress the balance in terms of children and families, and we would strongly support the amendments around giving parents access to that data, to ensure that this is not the protracted process that it currently all too often is.

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Just quickly, do coroners have sufficient powers? Should they have more powers to access digital data after the death of a child?

Andy Burrows: We can see what a protracted process it has been. There have been improvements to the process. It is currently a very lengthy process because of the mutual legal assistance treaty arrangements—MLAT, as they are known—by which injunctions have to be sought to get data from US companies. It has taken determination from some coroners to pursue cases, very often going up against challenges. It is an area where we think the arrangements could certainly be streamlined and simplified. The balance here should shift toward giving parents and families access to the data, so that the process can be gone through quickly and everything can be done to ease the heartbreak for families having to go through those incredibly traumatic situations.

Alex Davies-Jones

Q Very briefly, Dame Rachel, I will build on what you were just saying, based on your experience as a headteacher. When I make my school visits, the teachers overwhelmingly tell me how, on a daily basis, they have to deal with the fallout from an issue that has happened online or on social media. On that matter, the digital media literacy strategy is being removed from the Bill. What is your thinking on that? How important do you see a digital media literacy strategy being at the heart of whatever policy the Government try to make regarding online safety for children?

Dame Rachel de Souza: There is no silver bullet. This is now a huge societal issue and I think that some of the things that I would want to say would be about ensuring that we have in our educational arsenal, if you like, a curriculum that has a really strong digital media literacy element. To that end, the Secretary of State for Education has just asked me to review how online harms and digital literacy are taught in schools—reviewing not the curriculum, but how good the teaching is and what children think about how the subject has been taught, and obviously what parents think, too.

I would absolutely like to see the tech companies putting some significant funding into supporting education of this kind; it is exactly the kind of thing that they should be working together to provide. So we need to look at this issue from many aspects, not least education.

Obviously, in a dream world I would like really good and strong digital media literacy in the Bill, but actually it is all our responsibility. I know from my conversations with Nadhim Zahawi that he is very keen that this subject is taught through the national curriculum, and very strongly.

Kirsty Blackman

Q I have a quick question on parental digital literacy. You mentioned the panel that you put together of 16 to 21-year-olds. Do you think that today’s parents have the experience, understanding, skills and tools to keep their children properly safe online? Even if they are pretty hands-on and want to do that, do you think that they have all the tools they need to be able to do that?

Dame Rachel de Souza: It is a massive concern to parents. Parents talk to me all the time about their worries: “Do we know enough?” They have that anxiety, especially as their children turn nine or 10; they are thinking, “I don’t even know what this world out there is.” I think that our conversations with 16 to 21-year-olds were really reassuring, and we have produced a pamphlet for parents. It has had a massive number of downloads, because parents absolutely want to be educated in this subject.

What did young people tell us? They told us, “Use the age controls; talk to us about how much time we are spending online; keep communication open; and talk to us.” Talk to children when they’re young, particularly boys, who are likely to be shown pornography for the first time, even if there are parental controls, around the age of nine or 10. So have age-appropriate conversations. There was some very good advice about online experiences, such as, “Don’t worry; you’re not an expert but you can talk to us.” I mean, I did not grow up with the internet, but I managed parenting relatively well—my son is 27 now. I think this is a constant concern for parents.

I do think that the tech companies could be doing so much more to assist parents in digital media literacy, and in supporting them in how to keep their child safe. We are doing it as the Office of the Children’s Commissioner. I know that we are all trying to do it, but we want to see everyone step up on this, particularly the tech companies, to support parents on this issue.

Chris Philp

Q Can I start by thanking the NSPCC and you, Dame Rachel, and your office for the huge contribution that you have made to the Bill as it has developed? A number of changes have been made as a result of your interventions, so I would just like to start by putting on the record my thanks to both of you and both your organisations for the work that you have done so far.

Could you outline for the Committee the areas where you think the Bill, as currently drafted, contains the most important provisions to protect children?

Dame Rachel de Souza: I was really glad to see, in the rewrite of the Online Safety Bill, a specific reference to the role of age assurance to prevent children from accessing harmful content. That has come across strongly from children and young people, so I was very pleased to see that. It is not a silver bullet, but for too long children have been using entirely inappropriate services. The No. 1 recommendation from the 16 to 21-year-olds, when asked what they wish their parents had known and what we should do, was age assurance, if you are trying to protect a younger sibling or are looking at children, so I was pleased to see that. Companies cannot hope to protect children if they do not know who the children are on their platforms, so I was extremely pleased to see that.

Chris Philp

Q Sorry to interject, Dame Rachel, but do you agree that it is not just about stopping under-18s viewing pornography; it also includes stopping children under 13 accessing social media entirely, as per those companies’ purported terms and conditions, which are frequently flouted?

Dame Rachel de Souza: Absolutely. I have called together the tech companies. I have met the porn companies, and they reassured me that as long as they were all brought into the scope of this Bill, they would be quite happy as this is obviously a good thing. I brought the tech companies together to challenge them on their use of age assurance. With their artificial intelligence and technology, they know the age of children online, so they need to get those children offline. This Bill is a really good step in that direction; it will hold them to account and ensure they get children offline. That was a critically important one for me.

I was also pleased to see the holding to account of companies, which is very important. On full coverage of pornography, I was pleased to see the offence of cyber-flashing in the Bill. Again, it is particularly about age assurance.

What I would say is that nudge is not working, is it? We need this in the Bill now, and we need to get it there. In my bit of work with those 2,000 young people, we asked what they had seen in the last month, and 40% of them had not had bad images taken down. Those aspects of the Bill are key.

Andy Burrows: This is a landmark Bill, so we thank you and the Government for introducing it. We should not lose sight of the fact that, although this Bill is doing many things, first and foremost it will become a crucial part of the child protection system for decades to come, so it is a hugely important and welcome intervention in that respect.

What is so important about this Bill is that it adopts a systemic approach. It places clear duties on platforms to go through the process of identifying the reasonably foreseeable harms and requiring that reasonable steps be taken to mitigate them. That is hugely important from the point of view of ensuring that this legislation is future-proofed. I know that many companies have argued for a prescriptive checklist, and then it is job done—a simple compliance job—but a systemic approach is hugely important because it is the basis upon which companies have very clear obligations. Our engagement is very much about saying, “How can we make sure this Bill is the best it can possibly be?” But that is on the bedrock of that systemic approach, which is fundamental if we are to see a culture shift in these companies and an emphasis on safety by design—designing out problems that do not have to happen.

I have engaged with companies where child safety considerations are just not there. One company told me that grooming data is a bad headline today and tomorrow’s chip shop wrapper. A systemic approach is the key to ensuring that we start to address that balance.

Chris Philp

Q Thank you. I obviously strongly agree with those comments.

I would like to turn to one or two points that came up in questioning, and then I would like to probe a couple of points that did not. Dame Rachel mentioned advocacy and ensuring that the voice of particular groups—in this context, particularly that of children—is heard. In that context, I would like to have a look at clause 140, which relates to super-complaints. Subsection (4) says that the Secretary of State can, by regulations, nominate which organisations are able to bring super-complaints. These are complaints whereby you go to Ofcom and say that there is a particular company that is failing in its systemic duties.

Subsection (4) makes it clear that the entities nominated to be an authorised super-complainant would include

“a body representing the interests of users of regulated services”,

which would obviously include children. If an organisation such as the Office of the Children’s Commissioner or the NSPCC—I am obviously not prejudicing the future process—were designated as a super-complainant that was able to bring super-complaints to Ofcom, would that address your point about the need for proper advocacy for children?

Dame Rachel de Souza: Absolutely. I stumbled over that a bit when Maria asked me the question, but we absolutely need people who work with children, who know children and are trusted by children, and who can do that nationally in order to be the super-complainants. That is exactly how I would envisage it working.

Andy Burrows: The super-complaint mechanism is part of the well-established arrangements that we see in other sectors, so we are very pleased to see that that is included in the Bill. I think there is scope to go further and look at how the Bill could mirror the arrangements that we see in other sectors—I mentioned the energy, postal and water sectors earlier as examples—so that the statutory user advocacy arrangements for inherently vulnerable children, including children at risk of sexual abuse, mirror the arrangements that we see in those other sectors. That is hugely important as a point of principle, but it is really helpful and appropriate for ensuring that the legislation can unlock the positive regulatory outcomes that we all want to see, so I think it contributes towards really effective regulatory design.

Chris Philp

Q Thank you, Andy. I am conscious of the time, so I will be brief with my final three questions. You made a valid point about large social media platforms receiving complaints generally, but in this case from children, about inappropriate content, such as photographs of them on a social media platform that do not get taken down—the complaint gets ignored, or it takes a very long time. In clause 18, we have duties on the complaints procedures that the big social media firms will now have to follow. I presume that you would join me in urging Ofcom to ensure that how it enforces the duties in clause 18 includes ensuring that big social media firms are responsive and quick in how they deal with complaints. Children are specifically referred to in the clause—for example, in subsection (3) and elsewhere.

Dame Rachel de Souza: Yes, and I was so pleased to see that. The regulator needs to have teeth for it to have any effect—I think that is what we are saying. I want named senior managers to be held accountable for breaches of their safety duties to children, and I think that senior leaders should be liable to criminal sanctions when they do not uphold their duty of care to children.

Chris Philp

Q Good—thank you. I want to say something about gaming, because Kirsty Blackman asked about it. If messages are being sent back and forth in a gaming environment, which is probably the concern, those are in scope of the Bill, because they are user-to-user services.

I will put my last two questions together. Are you concerned about the possibility that encryption in messaging services might impede the automatic scanning for child exploitation and abuse images that takes place, and would you agree that we cannot see encryption happen at the expense of child safety? Secondly, in the context of the Molly Russell reference earlier, are you concerned about the way that algorithms can promote and essentially force-feed children very harmful content? Those are two enormous questions, and you have only two minutes to answer them, so I apologise.

Dame Rachel de Souza: I am going to say yes and yes.

Andy Burrows: I will say yes and yes as well. The point about end-to-end encryption is hugely important. Let us be clear: we are not against end-to-end encryption. Where we have concerns is about the risk profile that end-to-end encryption introduces, and that risk profile, when we are talking about it being introduced into social networking services and bundled with other sector functionality, is very high and needs to be mitigated.

About 70% of child abuse reports could be lost if Meta goes ahead. That is 28 million reports in the past six months, so it is very important that the Bill can require companies to demonstrate that if they are running services, they can acquit themselves in terms of the risk assessment processes. We really welcome the simplified child sexual exploitation warning notices in the Bill that will give Ofcom the power to intervene when companies have not demonstrated that they have been able to introduce end-to-end encryption in a safe and effective way.

One area in which we would like to see the Bill—

The Chair

Order. I am afraid that brings us to the end of the time allotted for the Committee to ask questions of this panel. On behalf of the Committee, I thank our witnesses for their evidence, and I am really sorry that we could not get Lynn Perry online. Could we move on to the last panel? Thank you very much.

Examination of Witnesses

Ben Bradley and Katy Minshall gave evidence.

10:51
The Chair

We will now hear from Ben Bradley, government relations and public policy manager at TikTok, and Katy Minshall, head of UK public policy at Twitter. We have until 11.25 for this panel of witnesses. Could the witnesses please introduce themselves for the record?

Ben Bradley: I am Ben Bradley. I am a public policy manager at TikTok, leading on the Bill from TikTok.

Katy Minshall: I am Katy Minshall. I am head of UK public policy for Twitter.

Alex Davies-Jones

Q Good morning, both. Thank you for joining us today. We have recently had it confirmed by the Minister in a written parliamentary question that NFTs—non-fungible tokens—will be included in the scope of the Bill. Concerns have been raised about how that will work in practice, and also in relation to GIFs, memes and other image-based content that is used on your platforms, Twitter specifically. Katy, how do you see that working in practice? Is the Bill workable in its current form to encapsulate all of that?

Katy Minshall: Thank you for inviting me here to talk about the Online Safety Bill. On whether the Bill is workable in its current form, on the one hand, we have long been supportive of an approach that looks at overall systems and processes, which I think would capture some of the emerging technologies that you are talking about. However, we certainly have questions about how aspects of the Bill would work in practice. To give you an example, one of the late additions to the Bill was about user verification requirements, which as I understand it means that all category 1 platforms will need to offer users the opportunity to verify themselves and, in turn, those verified users have the ability to turn off interaction from unverified users. Now, while we share the Government’s policy objective of giving users more control, we certainly have some workability questions.

Just to give you one example, let’s say this existed today, and Boris Johnson turned on the feature. In practice, that would mean one of two things. Either the feature is only applicable to users in the UK, meaning that people around the world—in France, Australia, Germany or wherever it may be—are unable to interact with Boris Johnson, and only people who are verified in the UK can reply to him, tweet at him and so on, or it means the opposite and anyone anywhere can interact with Boris Johnson except those people who have chosen not to verify their identity, perhaps even in his own constituency, who are therefore at a disadvantage in being able to engage with the Prime Minister. That is just one illustration of the sorts of workability questions we have about the Bill at present.

Alex Davies-Jones

Q You brought up the Prime Minister, so we’ll carry on down that route. One of the concerns about the Bill is the issue of protecting democratic importance. If there is an exemption for content of democratic importance, would your platforms be able to take that down?

Katy Minshall: I am sorry, do you mean—

Alex Davies-Jones

Q Would you be able to remove the content?

Katy Minshall: At present, what would be expected of companies in that scenario is not entirely clear in the Bill. There are certainly examples of content we have removed over the years for abuse and hateful conduct where the account owner that we suspended would have grounds to say, “Actually, this is content of democratic importance.” At the very least, it is worth pointing out that, in practice, it is likely to slow down our systems because we would have to build in extra steps to understand if a tweet or an account could be considered content of democratic importance, and we would therefore treat it differently.

Alex Davies-Jones

Q That brings me to my next question. Because what would be classed as content of democratic importance is so ambiguous, would your platforms even be able to detect it?

Katy Minshall: That is a really important question. At present, the Bill envisages that we would treat journalistic content differently from other types of content. I think the definition in the Bill—correct me if I get this wrong—is content for the purposes of journalism that is UK linked. That could cover huge swathes of the conversation on Twitter—links to blog posts, citizen journalists posting, front pages of news articles. The Bill envisages our having a system to separate that content from other content, and then treating that content differently. I struggle to understand how that would work in practice, especially when you layer on top the fact that so much of our enforcement is assisted by technology and algorithms. Most of the abusive content we take down is detected using algorithms; we suspend millions of spam accounts every day using automated systems. When you propose to layer something so ambiguous and complicated on top of that, it is worth considering how that might impact on the speed of enforcement across the whole of our platform.

Alex Davies-Jones

Q Thank you. Given the media carve-out and the journalism exemption in the Bill, how could you detect state actors that are quoting disinformation, or even misinformation?

Katy Minshall: At present, we label a number of accounts as Government actors or state-affiliated media and we take action on those accounts. We take down their tweets and in some cases we do not amplify their content because we have seen in current situations that some Governments are sharing harmful content. Again, I question the ambiguity in the Bill and how it would interact with our existing systems that are designed to ensure safety on Twitter.

Alex Davies-Jones

Q Thank you. Just one final question for Twitter. A query we raised with the Children’s Commissioner and the NSPCC is about pornography and children accessing it. A person needs to be 13 years old to join Twitter—to host a profile on the site—but you do host pornographic content; it is used mainly by sex workers to promote their trade. How does the proposed provision affect your model of business in allowing 13-year-olds and above to access your platform?

Katy Minshall: Until we see the full extent of the definitions and requirements, it is difficult to say exactly what approach we would take under the Bill. Regarding adult content, Twitter is not a service targeting a youth audience, and as you illustrate, we endeavour to give people the ability to express themselves as they see fit. That has to be balanced with the objective of preventing young people from inadvertently stumbling on such content.

Alex Davies-Jones

Q So you are not predominantly aimed at children? If you are an adult service, why is it that people aged 13 or above can access your platform?

Katy Minshall: We find that, in practice, the overwhelming majority of our user base are over the age of 18; both internal and external data show that. Of course young people can access Twitter. I think we have to be very careful that the Bill does not inadvertently lock children out of services they are entitled to use. I am sure we can all think of examples of people under the age of 18 who have used Twitter to campaign, for activism and to organise; there are examples of under-18s using Twitter to that effect. But as I say, predominantly we are not a service targeting a youth audience.

Alex Davies-Jones

Okay. Thank you, Chair.

Dean Russell

Q Before Christmas, I was involved in the Joint Committee that carried out pre-legislative work on the Bill. We heard platforms repeatedly state their belief that they are doing all they can to ensure safety and protect from harm. Actually, they do not even come close. My question to both platforms—and the others we are hearing from later today—is to what extent are you going to have to be dragged, kicking and screaming, to make sure these measures are put in place, or are you willing to work with Ofcom and other organisations to make sure that that is done?

Ben Bradley: Speaking for TikTok, we view ourselves as a second-generation platform. We launched in 2018, and at that time when you launched a product you had to make sure that safety was at the heart of it. I think the Secretary of State herself has said that the Bill process actually predates the launch of TikTok in the UK.

We view ourselves as an entertainment platform, and to express yourself, enjoy yourself and be entertained you have to feel safe, so I do not think we would be seen as kicking and screaming under this regime. It is something that we have supported for a long time and we are regulated by Ofcom under the video-sharing platform, or VSP, regime. What the Bill will achieve is to raise the floor of industry standards, a bit like GDPR did for data, so that for all the companies in the future—to Alex’s point, this is about the next five to 10 years—there will be a baseline of standards that everyone must comply with and expectations that you will be regulated. Also, it takes a lot of these difficult decisions about the balance between safety and expression, privacy and security out of the hands of tech companies and into the hands of a regulator that, of course, will have democratic oversight.

Katy Minshall: I do not have very much more to add. We already engage positively with Ofcom. I remember appearing before a Select Committee back in 2018 or 2019 and at that point saying that we were absolutely supportive of Ofcom taking on this role and of regulation potentially being a game changer. We are supportive of the systems and processes approach and look forward to engaging constructively in the regulation.

Dean Russell

Q In terms of the timing, once the Bill comes into law, there may be a period before it is enforced while everything is set up. Are both your platforms already gearing up to make sure you fulfil the requirements of the Bill from day one?

Katy Minshall: I am glad you asked that question. The problem with the Bill is that it depends on so many things that do not exist yet. We are looking at the Bill and thinking how we can prepare and start thinking about what is necessary, but in practice, content that is harmful to adults and harmful to children has not been set out yet. So much of the Bill depends on secondary legislation and codes of practice, and as I described earlier in answer to the question from Alex Davies-Jones, there are such real workability questions around exemptions and ID verification that I worry there would be the risk of substantial delays at the other end, which I do not think anyone wants to see.

Ben Bradley: It is the same from our perspective. We have our community guidelines and we are committed to enforcing those at the moment. A lot of the detail of the Bill will be produced in Ofcom’s codes of practice, but I think it is important that we think about operationalising the process, what it looks like in practice and whether it is workable.

Something like Katy mentioned in terms of the user empowerment duties, how prescriptive those would be and how they would work, not just for the platforms of today but for the future, is really important. For TikTok, to use a similar example on the user empowerment duties, the intent of the platform is that you discover content from all over the world. When you open the app, you are recommended content from all sorts of users, and there is no expectation that those users would be verified. If you have opted into this proposed user empowerment duty, there is a concern that it could exacerbate the risk of filter bubbles, because you would only be receiving content from users within the UK who have verified themselves, and we work very hard to make sure there is a diverse range of recommendations. I think it is a fairly easy fix. Much like elsewhere in the Bill, where Ofcom has flexibility about whether to require specific recommendations, it could have that flexibility in this case as well, considering whether this type of power works for these types of platforms.

To use the example of the metaverse, how would it work once the metaverse is up and running? The whole purpose of the metaverse is a shared environment in which users interact, and because the Bill is so prescriptive at the minute about how this user empowerment duty needs to be achieved, it is not clear, if you were verified and I were unverified and you had opted not to see my content but I moved something in the shared environment, like this glass, whether that would move for everyone. It is a small point, but it just goes to the prescriptiveness of how it is currently drafted and the importance of giving Ofcom the flexibility that it has elsewhere in the Bill, but in this section as well.

Kirsty Blackman

Q I have a few questions, starting with Twitter, in relation to young people using the platform. How do you currently make sure that under-13s do not use the platform? What actions do you take to ensure that happens? Going forward, will that change?

Katy Minshall: At present, we follow the industry standard of age self-declaration. How you manage and verify identity—whether using a real-name system or emerging technologies like blockchain or documentation—is at the heart of a range of industries, not just ours.

Technology will change and new products that we cannot even envisage today will come on to the market. In terms of what we would do in relation to the Bill, as I said, until we see the full extent of the definitions and requirements, we cannot really say what exact approach we would take.

Kirsty Blackman

Q To follow up on that, you said that there is agreement internally and externally that your service is mostly used by over-18s. Does that mean that you do not think you will have a responsibility to undertake the child safety duties?

Katy Minshall: My understanding of the Bill is that if there is a chance a young person could access your service, you would be expected to undertake the child safety duties, so my understanding is that that would be the case.

Kirsty Blackman

Q Okay. Ben, for TikTok, how do you currently ensure that under-13s are not using your service, and how is that likely to change with the Bill coming in?

Ben Bradley: We are a strictly 13-plus platform. There are basically two approaches to preventing under-age access to our platform. The first is preventing them from signing up. We are 12+ rated in the app stores, so if you have parental controls on those app stores, you cannot download the app. We also have a neutral age gate, which I think is similar to Twitter’s. We do not ask people to confirm whether they are over 13—we do not ask them to tick a box; instead we ask them to enter their date of birth. If they enter a date of birth that is under 13, they are blocked from re-entering a date of birth, so they cannot just keep trying. We do not say that it is because they are under age; we just say, “TikTok isn’t right for you right now.” That is the first step.

Secondly, we proactively surface and remove under-age users. Whenever a piece of content is reported on TikTok, for whatever reason, the moderator will look at two things: the reason why it was reported and also whether the user is under 13. They can look at a range of signals to do that. Are they wearing a school uniform? Is there a birthday cake in their biography? Do they say that they are in a certain year of school? They can use those signals.

We actually publish every quarter how many suspected under-13s we remove from our platform. I think we are currently the only company to publish that on a quarterly basis, but we think it is important to be transparent about how we are approaching this, to give a sense of the efficacy of our interventions.

On what specifically might change, that is not clear; obviously, we have to wait for further guidance from Ofcom. However, we did carry out research last year with parents and young people in five countries across Europe, including the UK, where we tested different ideas of age assurance and verification, trying to understand what they would like to see. There was not really a single answer that everyone could get behind, but there were concerns raised around data protection and privacy if you were handing over this type of information to the 50 or 60 apps that might be on your phone.

One idea, which people generally thought was a good one, was that when you first get a device and first sign into the app store, you would verify your age there, and then that app store on that device could then pass an additional token to all the apps on your phone suggesting that you are of a certain age, so that we could apply an age-appropriate experience. Obviously that would not stop us doing everything that we currently do, but I think that would be a strong signal. If that were to move forward, we would be happy to explore that.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q Both of your sites rely very heavily on algorithms for the content that is put in front of people. If you are in the top tweets feed on Twitter, you get algorithmically derived or chosen content, and TikTok is even more heavily driven by algorithms. How will this Bill affect the algorithms that you use, particularly regarding some of the content that may get more and more extreme, for example, if people are going down that route? In terms of the legal but harmful content that is likely to come through, how will the Bill affect the algorithms that you use, and is it possible to do that? Does it work?

Ben Bradley: TikTok does not take a filter bubble approach. When you first open the app, you express areas of content that you are interested in and then we recommend content. Because it is short-form, the key to TikTok’s success is sending you diverse content, which allows you to discover things that you might never have previously expressed interest in. I use the example of Nathan Evans, a postman who went on to have a No. 1 song with “Wellerman”, or even Eurovision, for example. These are things that you would not necessarily express interest in, but when they are recommended to you, you are engaged. Because it is short-form content, we cannot show you the same type of material over and over again—you would not be interested in seeing 10 30-second videos on football, for example. We intentionally try to diversify the feed to express those different types of interests.

Katy Minshall: Our algorithms down-rank harmful content. If you want to see a live example on Twitter, if you send a tweet and get loads of replies, a chunk of them is automatically hidden at the bottom in a “view more replies” section. Our algorithm works in other ways as well to down-rank content that could be violating our rules. We endeavour to amplify credible content as well. In the explore tab, which is the magnifying glass, we will typically be directing you to credible sources of information—news websites and so on.

In terms of how the Bill would affect that, my main hope is that the codes of practice go beyond a leave-up-or-take-down binary and beyond content moderation, and think about the role of algorithms. At present on Twitter, you can turn the algorithm off in the top right-hand corner of the app, via the sparkle icon. In the long term, I think what we will be aiming for is a choice among a range of algorithms that you could use on services like Twitter. I would hope that the code of practice enables that and does not preclude it as a solution to some of the legal but harmful content we may have in mind.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q Just one more question. We know that women and minorities face more abuse online than men do. Is that something that you have found in your experience, particularly Twitter? What are you doing to ensure that the intersectionality of harms is considered in the work that you are doing to either remove or downgrade content?

Katy Minshall: That is absolutely the case and it has been documented by numerous organisations and research. Social media mirrors society and society has the problems you have just described. In terms of how we ensure intersectionality in our policies and approaches, we are guided by our trust and safety council, which is a network of dozens of organisations around the world, 10 of which are here in the UK, and which represents different communities and different online harms issues. Alongside our research and engagement, the council ensures that when it comes to specific policies, we are constantly considering a range of viewpoints as we develop our safety solutions.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Thank you, Chair, and thank you to the witnesses. I share your concerns about the lack of clarity regarding the journalistic content and democratic content exemptions. Do you think those exemptions should be removed entirely, or can you suggest what we might do to make them clearer in the Bill?

Katy Minshall: At the very least, there must be tighter definitions. I am especially concerned when it comes to the news publisher exemption. The Secretary of State has indicated an amendment that would mean that services like Twitter would have to leave such content up while an appeals process is ongoing. There is no timeline given. The definition in the Bill of a news publisher is, again, fairly vague. If Ben and I were to set up a news website, nominally have some standards and an email address where people could send complaints, that would enable it to be considered a news publisher under the Bill. If we think about some of the accounts that have been suspended from social media over the years, you can absolutely see them creating a news website and saying, “I have a case to come back on,” to Twitter or TikTok or wherever it may be.

Ben Bradley: We share those concerns. There are already duties to protect freedom of expression in clause 19. Those are welcome. It is the breadth of the definition of journalistic and democratic content that is a concern for us, particularly when it comes to things like the expedited and dedicated appeals mechanism, which those people would be able to claim if their content was removed. We have already seen people like Tommy Robinson on the far right present themselves as journalists or citizen journalists. Giving them access to a dedicated and expedited appeals mechanism is an area of concern.

There are different ways you could address that, such as greater clarity in those definitions and removing the subjective elements. At the minute, the test is whether a user considers their content to be journalistic; that is not an objective criterion but a matter of the user’s belief about their own content.

Also, if you look at something like the dedicated and expedited appeals mechanism, could you hold that in reserve, so that if a platform were found to be failing in its duties to journalistic content or in its freedom of expression duties, Ofcom could say, as it can in other areas of the Bill, “Okay, we believe that you need to create this dedicated mechanism, because you have failed to protect those duties.”? That would, I think, minimise the risk of exploitation of that mechanism.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

That is really helpful, thank you. A quick question—

None Portrait The Chair
- Hansard -

I am sorry, I have to interrupt because of time. Maria Miller.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q Two hopefully quick questions. I have been listening carefully. Could you summarise the main changes you will make to your products that your users will notice make them safer, whether they are children or adults? I have heard a lot about problems, but what are the changes you will actually make? Within that, could you talk about how you will improve your complaints system, which earlier witnesses said is inadequate?

Katy Minshall: We would certainly look to engage with Ofcom positively on the requirements it sets out. I am sorry to sound repetitive, but the challenge is that the Bill depends on so many things that do not exist yet and the definitions around what we mean by content harmful to adults or to children. In practice, that makes it challenging to say to you exactly today what approaches we would take. To be clear, we would of course look to continue working with the Government and now Ofcom with the shared objective of making the online space safer for everyone.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q I want to probe you a little on that. You are saying that, because harmful content is not yet defined, you will not make any changes beyond that area. It is quite a large Bill; surely there are other things you will do differently, no?

Katy Minshall: The lesson of the past three or four years is that we cannot wait for the Bill. We at Twitter are continuing to make changes to our product and our policies to improve safety for everyone, including children.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q So the Bill is irrelevant to you.

Katy Minshall: The Bill is a really important piece of regulation, which is why I was so pleased to come today and share our perspectives. We are continuing to engage positively with Ofcom. What I am trying to say is that until we see the full extent of the requirements and definitions, it is hard to set out exactly what steps we would take with regards to the Bill.

Ben Bradley: To add to that point, it is hard to be specific about some of the specific changes we would make because a lot of the detail of the Bill defers to Ofcom guidance and the codes of practice. Obviously we all have the duties around child safety and adult safety, but the Ofcom guidance will suggest specific measures that we can take to do that, some of which we may take already, and some of which may go further than what we already do. Once we see the details of the codes, we will be able to give a clearer answer.

Broadly, from a TikTok perspective, through the design of the product and the way we approach safety, we are in a good place for when the new regime comes in, because we are already regulated by Ofcom under the VSP regime, but we would have to wait for the full detail. Beyond some of the companies that you will hear from today, this will touch 20,000 companies and will raise the floor for all the companies that will be regulated under the regime.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q But you cannot give any further detail about specific changes you will make as a result of this legislation because you have not seen the guidance and the codes.

Ben Bradley: Yes, the codes of practice will recommend specific steps that we should take to achieve our duties. Until we see the detail of those codes it is hard to be specific about some of the changes that we would make.

None Portrait The Chair
- Hansard -

Barbara, you have just a couple of minutes.

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q Can I ask about children’s risk assessments? Who in your organisation will write the children’s risk assessments, and at what level in your organisation will they be signed off?

Katy Minshall: At present, we have a range of risk assessment processes. We have a risk committee of the board. We do risk assessments when we make a change about—

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q No, I mean the children’s risk assessment you will have to do as part of what the Bill will bring in.

Katy Minshall: At present, we do not have a specific individual designated to do the children’s risk assessment. The key question is how much does Ofcom’s guidance on risk assessments—once we see it—intersect with our current processes versus changes we would need to make to our risk assessment processes?

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q Okay. At what level in the organisation do you anticipate children’s risk assessment would be signed off? Clearly, this is a very important aspect of the Bill.

Katy Minshall: I would have to go away and review the Bill. I do not know whether a specific level is set out in the Bill, but we would want to engage with the regulation and requirements set for companies such as Twitter. However, it would be expected that is what we would—

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q Do you think it should be signed off at a senior level—board level—in your organisation?

Katy Minshall: Already all the biggest decisions that we make as a company are signed off at the most senior level. We report to our chief executive, Parag Agrawal, and then to the board. As I say, there is a risk committee of the board, so I expect that we would continue to make those decisions at the highest level.

Ben Bradley: It is broadly the same from a TikTok perspective. Safety is a priority for every member of the team, regardless of whether they are in a specific trust and safety function. In terms of risk assessments, we will see from the detail of the Bill at what level they need to be signed off, but our CEO has been clear in interviews that trust and safety is a priority for him and everyone at TikTok, so it would be something to which we are all committed.

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Do you think you would be likely to sign it off at the board level—

None Portrait The Chair
- Hansard -

Sorry, I have to interrupt you there. I call the Minister.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you for coming to give evidence to the Committee. On the question about user choice around identity verification, is this not conceptually quite similar to the existing blue tick regime that Twitter operates successfully?

Katy Minshall: As I say, we share your policy objective of giving users more choice. For example, at present we are testing a tool where Twitter automatically blocks abusive accounts on your behalf. We make the distinction based on an account’s behaviour and not on whether it has verified itself in some way.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Well, I’d be grateful if you applied that to my account as quickly as possible!

I do not think that the concept would necessarily operate as you suggested at the beginning. You suggested that people might end up not seeing content posted by the Prime Minister or another public figure. The concept is that, assuming a public figure would choose to verify themselves, content that they posted would be visible to everybody because they had self-verified. The content in the other direction may or may not be, depending on whether the Prime Minister or the Leader of the Opposition chose to see all content or just verified content, but their content—if they verified themselves—would be universally visible, regardless of whatever choice anyone else exercised.

Katy Minshall: Yes, sorry if I was unclear. I totally accept that point, but it would mean that some people would be able to reply to Boris Johnson and others would not. I know we are short on time, but it is worth pointing out that in a YouGov poll in April, nearly 80% of people said that they would not choose to provide ID documents to access certain websites. The requirements that you describe are based on the assumption that lots of people will choose to do it, when in reality that might not be the case.

A public figure might think, “Actually, I really appreciate that I get retweets, likes and people replying to my tweets,” but if only a small number of users have taken the opportunity to verify themselves, that is potentially a disincentive even to use this system in the first place—and all the while we were building such a system, we could instead have been investing in or trying to develop new solutions, such as safety mode, which I described and which tries to prevent abusive users from interacting with you.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I want to move on to the next question because we only have two minutes left.

Ben, you talked about the age verification measures that TikTok currently takes. For people who do not come via an age-protected app store, it is basically self-declared. All somebody has to do is type in a date of birth. My nine-year-old children could just type in a date of birth that was four years earlier than their real date of birth, and off they would go on TikTok. Do you accept that that is wholly inadequate as a mechanism for policing the age limit of 13?

Ben Bradley: That is not the end of our age assurance system; it is just the very start. Those are the first two things that we have to prevent sign-up, but we are also proactive in surfacing and removing under-age accounts. As I said, we publish every quarter how many suspected under-13s get removed.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q If I understood your answer correctly, that is only if a particular piece of content comes to the attention of your moderators. I imagine that only 0.01% or some tiny fraction of content on TikTok comes to the attention of your moderators.

Ben Bradley: It is based on a range of signals that they have available to them. As I said, we publish a number every quarter. In the last quarter, we removed 14 million users across the globe who were suspected to be under the age of 13. That is evidence of how seriously we take the issue. We publish that information because we think it is important to be transparent about our efforts in this space, so that we can be judged accordingly.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you. Forgive me for moving on in the interests of time.

Earlier, we debated content of democratic importance and the protections that that and free speech have in the Bill. Do you agree that a requirement to have some level of consistency in the way that that is treated is important, particularly given that there are some glaring inconsistencies in the way in which social media firms treat content at the moment? For example, Donald Trump has been banned, while flagrant disinformation by the Russian regime, lying about what they are doing in Ukraine, is allowed to propagate—including the tweets that I drew to your attention a few weeks ago, Katy.

Katy Minshall: I agree that freedom of expression should be top of mind as companies develop safety and policy solutions. Public interest should always be considered when developing policies. From the perspective of the Bill, I would focus on freedom of expression for everyone, and not limit it to content that could be related to political discussions or journalistic content. As Ben said, there are already wider freedom of expression duties in the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q To be clear, those freedom of expression duties in clause 19(2) do apply to everyone.

Katy Minshall: Sorry, but I do not know the Bill in those terms, so you would have to tell me the definition.

None Portrait The Chair
- Hansard -

Order. I am afraid that that brings us to the end of the time allotted for the Committee to ask questions in this morning’s sitting. On behalf of the Committee, I thank our witnesses for their evidence. We will meet again at 2 pm in this room to hear further oral evidence.

11:26
The Chair adjourned the Committee without Question put (Standing Order No. 88).
Adjourned till this day at Two o’clock.

Online Safety Bill (Second sitting)

The Committee consisted of the following Members:
Chairs: † Sir Roger Gale, Christina Rees
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
† Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Mrs Maria (Basingstoke) (Con)
† Mishra, Navendu (Stockport) (Lab)
† Moore, Damien (Southport) (Con)
Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Russell, Dean (Watford) (Con)
Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Witnesses
Richard Earley, UK Public Policy Manager, Meta
Becky Foreman, UK Corporate Affairs Director, Microsoft
Katie O’Donovan, Director of Government Affairs and Public Policy, Google UK
Professor Clare McGlynn, Professor of Law, Durham University
Jessica Eagelton, Policy and Public Affairs Manager, Refuge
Janaya Walker, Public Affairs Manager, End Violence Against Women
Lulu Freemont, Head of Digital Regulation, techUK
Ian Stevenson, Chair, OSTIA
Adam Hildreth, CEO, Crisp
Jared Sine, Chief Business Affairs and Legal Officer, Match Group
Nima Elmi, Head of Public Policy in Europe, Bumble
Dr Rachel O’Connell, CEO TrustElevate
Rhiannon-Faye McDonald, Victim and Survivor Advocate, Marie Collins Foundation
Susie Hargreaves OBE, Chief Executive, Internet Watch Foundation
Ellen Judson, Lead Researcher at the Centre for the Analysis of Social Media, Demos
Kyle Taylor, Founder and Director, Fair Vote UK
Public Bill Committee
Tuesday 24 May 2022
(Afternoon)
[Sir Roger Gale in the Chair]
Online Safety Bill
14:00
The Committee deliberated in private.
Examination of Witnesses
Richard Earley, Becky Foreman and Katie O’Donovan gave evidence.
14:01
None Portrait The Chair
- Hansard -

Q 68 Good afternoon. We are now sitting in public and these proceedings are being broadcast. This afternoon, we will first hear oral evidence from Richard Earley, the UK public policy manager of Meta, Becky Foreman, the UK corporate affairs director at Microsoft, and Katie O’Donovan, the director of Government affairs and public policy at Google and YouTube. Ladies and gentlemen, thank you very much indeed for joining us. For the sake of the record, could I just ask you to identify yourselves?

Richard Earley: Good afternoon. My name is Richard Earley, and I work in the public policy team at Meta, leading on content issues including the Online Safety Bill.

Becky Foreman: I am Becky Foreman; I am the corporate affairs director for Microsoft UK.

Katie O’Donovan: I am Katie O’Donovan; I am director of Government affairs and public policy for Google in the UK.

None Portrait The Chair
- Hansard -

May I just ask you, for the benefit of Hansard, to try to speak up a little? The sound system is not all that it might be in this room, and the acoustics certainly are not.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - - - Excerpts

Q Thank you to our witnesses for joining us this afternoon. Quite bluntly, I will get straight into it, because what is frustrating for us, as Parliamentarians, and for our constituents, is the fact that we need this legislation in the first place. Why are you, as platforms, allowing harmful and illegal content to perpetuate on your platforms? Why do we need this legislation for you to take action? It is within your gift to act, and despite all the things I am sure you are about to tell me that you are doing to prevent this issue from happening, it is happening and we are having to legislate, so why?

None Portrait The Chair
- Hansard -

Mr Earley, I will go left to right to start with, if that is all right with you, so you have drawn the short straw.

Richard Earley: No worries, and thank you very much for giving us the opportunity to speak to you all today; I know that we do not have very much time. In short, we think this legislation is necessary because we believe that it is really important that democratically elected Members of Parliament and Government can provide input into the sorts of decisions that companies such as ours are making, every day, about how people use the internet. We do not believe that it is right for companies such as ours to be taking so many important decisions every single day.

Now, unfortunately, it is the case that social media reflects the society that we live in, so all of the problems that we see in our society are also reflected on our services. Our priority, speaking for Meta and the services we provide—Facebook, Instagram and WhatsApp—is to do everything we can to make sure our users have as positive an experience as possible on our platform. That is why we have invested more than $13 billion over the past five years in safety and security, and have more than 40,000 people working at our company on safety and security every day.

That said, I fully recognise that we have a lot more areas to work on, and we are not waiting for this Bill to come into effect to do that. We recently launched a whole range of updated tools and technologies on Instagram, for example, to protect young people, including preventing anyone under the age of 18 from being messaged by a person they are not directly connected to. We are also using new technology to identify potentially suspicious accounts to prevent young people from appearing in any search results that those people carry out. We are trying to take steps to address these problems, but I accept there is a lot more to do.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Before I bring in Becky and Katie to answer that, I just want to bring you back to something you said about social media and your platforms reflecting society like a mirror. That analogy is used time and again, but actually they are not a mirror. The platforms and the algorithms they use amplify, encourage and magnify certain types of content, so they are not a mirror of what we see in society. You do not see a balanced view of two points of an issue, for example.

You say that work is already being done to remove this content, but on Instagram, for example, which is a platform predominantly used by women, the Centre for Countering Digital Hate has exposed what they term an “epidemic of misogynistic abuse”, with 90% of misogynistic abuse being sent via direct messaging. It is being ignored by the platform even when it is being reported to the moderators. Why is that happening?

Richard Earley: First, your point about algorithms is really important, but I do not agree that they are being used to promote harmful content. In fact, in our company, we use algorithms to do the reverse of that. We try to identify content that might break our policies—the ones we write with our global network of safety experts—and then remove those posts, or if we find images or posts that we think might be close to breaking those rules, we show them lower in people’s feeds so that they have a lower likelihood of being seen. That is why, over the past two years, we have reduced the prevalence of harmful posts such as hate speech on Facebook so that now only 0.03% of views of posts on Facebook contain that kind of hate speech—we have almost halved the number. That is one type of action that we take in the public parts of social media.

When it comes to direct messages, including on Instagram, there are a range of steps that we take, including giving users additional tools to turn off any words they do not want to see in direct messages from anyone. We have recently rolled out a new feature called “restrict”, which enables you to turn off any messages or comments from people who have only recently started to follow you and have just created their accounts, for example. Those are some of the tools that we are trying to use to address that.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q So the responsibility is on the user, rather than the platform, to take action against abuse?

Richard Earley: No, the responsibility is absolutely shared by those of us who offer platforms, by those who are engaged in abuse in society, and by civil society and users more widely. We want to ensure we are doing everything we can to use the latest technology to stop abuse happening where we can and give people who use our services the power to control their experience and prevent themselves from encountering it.

None Portrait The Chair
- Hansard -

We must allow the other witnesses to participate.

Becky Foreman: Thank you for inviting me to give evidence to you today. Online safety is extremely important to Microsoft and sits right at the heart of everything we do. We have a “safety by design” policy, and responsibility for safety within our organisation sits right across the board, from engineers to operations and policy people. However, it is a complicated, difficult issue. We welcome and support the regulation that is being brought forward.

We have made a lot of investments in this area. For example, we introduced PhotoDNA more than 10 years ago, which is a tool that is used right across the sector and by non-governmental organisations to scan for child sexual abuse material and remove it from their platforms. More recently, we have introduced a grooming tool that automates the process of establishing whether a grooming conversation is taking place between an adult and a child. That can then be flagged for human review. We have made that available at no charge to the industry, and it has been licensed by a US NGO called Thorn. We take this really seriously, but it is a complicated issue and we really welcome the regulation and the opportunity to work with the Government and Ofcom on this.

Katie O’Donovan: Thank you so much for having me here today and asking us to give evidence. Thank you for your question. I have worked at Google and YouTube for about seven years and I am really proud of our progress on safety in those years. We think about it in three different ways. First, what products can we design and build to keep our users safer? Similar to Microsoft, we have developed technology that identifies new child sex abuse material and we have made that available across the industry. We have developed new policies and new ways of detecting content on YouTube, which means we have really strict community guidelines, we identify that content and we take it down. Those policies that underlie our products are really important. Finally, we work across education, both in secondary and primary schools, to help inform and educate children through our “Be Internet Legends” programme, which has reached about 4 million people.

There is definitely much more that we can do and I think the context of a regulatory environment is really important. We also welcome the Bill, and I think it is really going to be meaningful when Ofcom audits how we are meeting the requirements in the legislation—not just how platforms like ours are meeting the requirements in the Bill, but a wide spectrum of platforms that young people and adults use. That could have a really positive additive effect.

It is worth pausing and reflecting on legislation that has passed recently, as well. The age-appropriate design code, or children’s code, which the Information Commissioner’s Office now manages, has also helped us determine new ways to keep our users safe. For example, we have long had a product called SafeSearch, which you can use on search and which parents can keep a lock on; we now also turn that on by default where we use signals to identify people who we think are under 18.

We think that is getting the right balance between providing a safer environment but also enabling people to access information. We have not waited for this regulation. This regulation can help us do more, and it can also level the playing field and really make sure that everyone in the industry steps up and meets the best practice that can exist.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Thank you, both, for adding context to that. If I can bring you back to what is not being done and why we need to legislate, Richard, I come back to you. You mentioned some of the tools and systems that you have put in place so users can stop abuse from happening. Why is it that that 90% of abuse on Instagram in direct messages is being ignored by your moderators?

Richard Earley: I do not accept that figure. I believe that if you look at our quarterly transparency report, which we just released last week, you can see that we find more than 90% of all the content that we remove for breaking our policies ourselves. Whenever somebody reports something on any of our platforms, they get a response from us. I think it is really important, as we are focusing on the Bill, to understand or make the point that, for private messaging, yes, there are different harms and different risks of harm that can apply, which is why the steps that we take differ from the steps that we take in public social media.

One of the things that we have noticed in the final draft of the Bill is that the original distinction between public social media and private messaging, which was contained in the online harms White Paper and in earlier drafts of the Bill, has been lost here. Acknowledging that distinction, and helping companies recognise that there are different risks, and therefore different steps that can be taken in private messaging compared with public social media, would be a really important thing for the Committee to consider.

Alex Davies-Jones

Q Quite briefly, because I know we are short on time, exactly how many human moderators do you have working to take down disinformation and harmful illegal content on your platforms?

Richard Earley: We have around 40,000 people in total working on safety and security globally and, of those, around half directly review posts and content.

Alex Davies-Jones

Q How many of those are directly employed by you and how many are third party?

Richard Earley: I do not have that figure myself but I know it is predominantly the case that, in terms of the safety functions that we perform, it is not just looking at the pieces of content; it is also designing the technology that finds and surfaces content itself. As I said, more than 90% of the time—more than 95% in most cases—it is our technology that finds and removes content before anyone has to look at it or report it.

Alex Davies-Jones

Q On that technology, we have been told that you are not doing enough to remove harmful and illegal content in minority languages. This is a massive gap. In London alone, more than 250 languages are spoken on a regular basis. How do you explain your inaction on this? Can you really claim that your platform is safe if you are not building and investing in AI systems in a range of languages? What proactive steps are you taking to address this extreme content that is not in English?

Richard Earley: That group of 40,000 people that I mentioned operates 24 hours a day, seven days a week. They cover more than 70 languages between them, which includes the vast majority of the world’s major spoken languages. I should say that people working at Meta, working on these classifiers and reviewing content, include people with native proficiency in these languages and people who can build the technology to find and remove things too. It is not just what happens within Meta that makes a difference here, but the work we do with our external partners. We have over 850 safety partners that we work with globally, who help us understand how different terms can be used and how different issues can affect the spread of harm on our platforms. All of that goes into informing both the policies we use to protect people on our platform and the technology we build to ensure those policies are followed.

Alex Davies-Jones

Q Finally, which UK organisations that you use have quality assured any of their moderator training materials?

Richard Earley: I am sorry, could you repeat the question?

Alex Davies-Jones

The vast majority of people are third party. They are not employed directly by Meta to moderate content, so how many of the UK organisations you use have been quality assured to ensure that the training they provide in order to spot this illegal and harmful content is taken on board?

Richard Earley: I do not believe it is correct that for our company, the majority of moderators are employed by—

Alex Davies-Jones

You do not have the figures, so you cannot tell me.

Richard Earley: I haven’t, no, but I will be happy to let you know afterwards in our written submission. Everyone who is involved in reviewing content at Meta goes through an extremely lengthy training process that lasts multiple weeks, covering not just our community standards in total but also the specific area they are focusing on, such as violence and incitement. If it is hate speech, of course, there is a very important language component to that training, but in other areas—nudity or graphic violence—the language component is less important. We have published quite a lot about the work we do to make sure our moderators are as effective as possible and to continue auditing and training them. I would be really happy to share some of that information, if you want.

Alex Davies-Jones

Q But that is only for those employed directly by Meta.

Richard Earley: I will have to get back to you to confirm that, but I think it applies to everyone who reviews content for Meta, whether they are directly employed by Meta or through one of our outsourcing partners.

The Chair

Thank you very much. Don’t worry, ladies; I am sure other colleagues will have questions that they wish to pursue. Dean Russell, please.

Dean Russell (Watford) (Con)

Q Thank you, Chair. I guess this is for all three of you, but it is actually directed primarily at Richard—apologies. I do not mean to be rude—well, I am probably about to be rude.

One of the reasons why we are bringing in this Bill is that platforms such as Facebook—Meta, sorry—just have not fulfilled their moral obligations to protect children from harm. What commitment are you making within your organisation to align yourself to deliver on the requirements of the Bill?

To be frank, the track record up until now is appalling, and all I hear in these witness sessions, including before Christmas on the Joint Committee, is the big platforms talking as though they are doing a good job—as though they are all fine. They have spent billions of pounds and it is not going anywhere, so I want to know what practical measures you are going to be putting into place following this Bill coming into law.

Richard Earley: Of course, I do not accept that we have failed in our moral obligation to our users, particularly our younger users. That is the most important obligation that we have. I work with hundreds of people, and there are thousands of people at our company who spend every single day talking to individuals who have experienced abuse online, people who have lived experience of working with victims of abuse, and human rights defenders—including people in public life such as yourself—to understand the impact that the use of our platform can have, and work every day to make it better.

Dean Russell

Q But do you accept that there is a massive gap between those who you perhaps have been protecting and those who are not protected, hence the need for us to put this law in place?

Richard Earley: Again, we publish this transparency report every quarter, which is our attempt to show how we are doing at enforcing our rules. We publish how many of the posts that break our rules we take down ourselves, and also our estimates of how likely you are to find a piece of harmful content on the platform—as I mentioned, it is around three in every 10,000 for hate speech right now—but we fully recognise that you will not take our word for it. We expect confidence in that work to be earned, not just assumed.

That is why last year, we commissioned EY to carry out a fully independent audit of these systems. It published that report last week when we published our most recent transparency report and, again, I am very happy to share it with you here. The reason we have been calling for many years for pieces of legislation like this Bill to come into effect is that we think having Ofcom, the regulator—as my colleagues just said—able to look in more detail at the work we are doing, assess the work we are doing, and identify areas where we could do more is a really important part of what this Bill can do.

Dean Russell

Q I am conscious of the time, sorry. I know colleagues want to come in, but what are the practical measures? What will you be doing differently moving forward following this Bill?

Richard Earley: To start with, as I said, we are not waiting for the Bill. We are introducing new products and new changes all the time.

Dean Russell

Q Which will do what, sorry? I do not mean to be rude, but what will they be?

Richard Earley: Well, I just spoke about some of the changes we made regarding young people, including defaulting them into private accounts. We have launched additional tools making it possible for people to put in lists of words they do not want to see. Many of those changes are aligned with the core objectives of the Bill, which are about assessing early the risks of any new tools that we launch and looking all the time at how the use of technology changes and what new risks that might bring. It is then about taking proactive steps to try to reduce the risk of those harms.

Dean Russell

Q May I ask you a specific question? Will that include enabling bereaved parents to see their children’s Facebook posts and profile?

Richard Earley: This is an issue we have discussed at length with DCMS, and we have consulted a number of people. It is, of course, one of the most sensitive, delicate and difficult issues we have to deal with, and we deal with those cases very regularly. In the process that exists at present, there are, of course, coronial powers. There is a process in the UK and other countries for coroners to request information.

When it comes to access for parents to individuals’ accounts, at present we have a system for legacy contacts on some of our services, where you can nominate somebody to have access to your account after you pass away. We are looking at how that can be expanded. Unfortunately, there are an awful lot of different obligations we have to consider, not least the obligations to a person who used our services and then passed away, because their privacy rights continue after they have passed away too.

Dean Russell

Okay, so there is a compassion element. I am conscious of time, so I will stop there.

The Chair

One moment, please. I am conscious of the fact that we are going to run out of time. I am not prepared to allow witnesses to leave without feeling they have had a chance to say anything. Ms Foreman, Ms O’Donovan, is there anything you want to comment on from what you have heard so far? If you are happy, that is fine, I just want to make sure you are not being short-changed.

Becky Foreman: No.

Katie O'Donovan: No, I look forward to the next question.

Kirsty Blackman (Aberdeen North) (SNP)

Q Given the size of Facebook, a lot of our questions will be focused towards it—not that you guys do not have very large platforms, but the risks with social media are larger. You mentioned, Richard, that three in every 10,000 views are hate speech. If three in every 10,000 things I said were hate speech, I would be arrested. Do you not think that, given the incredibly high number of views there are on Facebook, there is much more you need to do to reduce the amount of hate speech?

Richard Earley: So, reducing that number—the prevalence figure, as we call it—is the goal that we set our engineers and policy teams, and it is what we are devoted to doing. On whether it is a high number, I think we are quite rare among companies of our size in providing that level of transparency about how effective our systems are, and so to compare whether the amount is high or low, you would require additional transparency from other companies. That is why we really welcome the part of the Bill that allows Ofcom to set standards for what kinds of transparency actually are meaningful for people.

We have alighted on the figure of prevalence, because we think it is the best way for you and the public to hold us to account for how we are doing. As I said, that figure of three in every 10,000 has declined from six in every 10,000 about 12 months ago. I hope the figure continues to go down, but it is not just a matter of what we do on our platform. It is about how all of us in society function and the regulations you will all be creating to help support the work we do.

Kirsty Blackman

Q I would like to follow up with a question about responding to complaints. The complaints process is incredibly important. Reports need to be made and Facebook needs to respond to those reports. The Centre for Countering Digital Hate said that it put in 100 complaints and that 51 did not get any response from Facebook. It seems as though there is a systemic issue with a lack of response to complaints.

Richard Earley: I do not know the details of that methodological study. What I can tell you is that every time anyone reports something on Facebook or Instagram, they get a response into their support inbox. We do not put the response directly into your Messenger inbox or IG Direct inbox, because very often when people report something, they do not want to be reminded of what they have seen among messages from their friends and family. Unfortunately, sometimes people do not know about the support inbox and so they miss the response. That could be what happened there, but every time somebody reports something on one of our platforms, they get a response.

Kirsty Blackman

Q Does the response just say, “Thanks for your report”?

Richard Earley: No. I want to be very constructive here. I should say that some of the concerns that are raised around this date from several years ago. I will accept that five or 10 years ago, the experience on our platforms was not this comprehensive, but in the last few years, we have really increased the transparency we give to people. When you submit something and report it for a particular violation, we give you a response that explains the action we took. If we removed it, we would explain what piece of our community standards it broke. It also gives you a link to see that section of our policy so you can understand it.

That is one way we have tried to increase the transparency we give to users. I think there is a lot more we could be doing. I could talk about some of the additional transparency steps we are taking around the way that our algorithms recommend content to people. Those are, again, all welcome parts of the Bill that we look forward to discussing further with Ofcom.

Kirsty Blackman

Q One of the things that has been recommended by a number of charities is increasing cross-platform and cross-company work to identify and take action on emerging threats. Do you think there would be the level of trust necessary for cross-platform co-operation with your competitors in the light of reports that, for example, Facebook employed somebody to put out negative things about TikTok in the US? Do you think that cross-platform working will work in that environment?

Richard Earley: Yes; I think it is already working, in fact. Others on the panel mentioned a few areas in which we have been collaborating in terms of open-sourcing some of the technology we have produced. A few years ago, we produced a grooming classifier—a technology that allows people to spot potentially inappropriate interactions between adults and children—and we open-sourced that and enabled it to be used and improved on by anyone else who is building a social media network.

A number of other areas, such as PhotoDNA, have already been mentioned. An obvious one is the Global Internet Forum to Counter Terrorism, which is a forum for sharing examples of known terrorist content so that those examples can be removed from across the internet. All those areas have been priorities for us in the past. A valuable piece of the Bill is that Ofcom—from what I can see from the reports that it has been asked to make—will do a lot of work to understand where there are further opportunities for collaboration among companies. We will be very keen to continue being involved in that.

Kirsty Blackman

Q I have a question for Katie on the algorithms that produce suggestions when you begin to type. It has been raised with me and in the evidence that we have received that when you begin to type, you might get a negative suggestion. If somebody types in, “Jews are”, the algorithm might come up with some negative suggestions. What has Google done about that?

Katie O'Donovan: We are very clear that we want the auto-suggestion, as we call it, to be a helpful tool that helps you find the information that you are looking for quicker—that is the core rationale behind the search—but we really do not want it to perpetuate hate speech or harm for protected individuals or wider groups in society. We have changed the way that we use that auto-complete, and it will not auto-complete to harmful suggestions. That is a live process that we review and keep updated. Sometimes terminology, vernacular or slang change, or there is a topical focus on a particular group of people, so we keep it under review. But by our policy and implementation, those auto-suggestions should very much not be happening on Google search.

Kirsty Blackman

Q Would it be technically possible for all of the protected characteristics, for example, to have no auto-complete prompts come up?

Katie O'Donovan: That is an excellent question on where you do not want protections and safety to minimise user or individual impact. If you wanted a protected characteristic for Jewish people, for example, we see that as really important, and we should remove the ability for hate speech. If you wanted to do that for a Jewish cookbook, Jewish culture or Jewish history, and we removed everything, you would really minimise the amount of content that people could access.

The Bill is totally vital and will be incredibly significant on UK internet access, but that is where it is really important to get the balance and nuance right. Asking an algorithm to do something quite bluntly might look at first glance like it will really improve safety, but when you dig into it, you end up with the available information being much less sophisticated, less impactful and less full, which I think nobody really wants—either for the user or for those protected groups.

Kirsty Blackman

Q Would it not be easier to define all the protected characteristics and have a list of associated words than to define every possible instance of hate speech in relation to each?

Katie O'Donovan: The way we do it at the moment is through algorithmic learning. That is the most efficient way to do it because we have millions of different search terms, a large number of which we see for the very first time every single day on Google search. We rarely define things with static terms. We use our search rater guidelines—a guide of about 200 pages—to determine how those algorithms work and make sure that we have a dynamic ability to restrict them.

That means that you do not achieve perfection, and there will be changes and new topical uses that we perhaps did not anticipate—we make sure that we have enough data incoming to adjust to that. That is the most efficient way of doing it, and making sure that it has the nuance to stop the bad autocomplete but give access to the great content that we want people to get to.

The Chair

Thank you very much. Ms Foreman, do you want to add anything to that? You do not have to.

Becky Foreman: I do not have anything to add.

Barbara Keeley (Worsley and Eccles South) (Lab)

Q I want to come back to transparency, which we touched on with my colleague Alex Davies-Jones earlier. Clearly, it is very important, and I think we could take a big step forward with the Bill. I want to ask you about child risk assessments, and whether they should be available publicly. I also want to ask about reports on the measures that you will have to take, as platforms, to manage the risks and mitigate the impact of harm. Harm is occurring at the moment—for example, content that causes harm is being left up. We heard earlier from the NSPCC that Facebook would not take down birthday groups for eight, nine and 10-year-old children, when it is known what purpose those birthday groups were serving for those young children. I guess my question on transparency is, “Can’t you do much better, and should there be public access to reports on the level of harm?”

Richard Earley: There are quite a few different questions there, and I will try to address them as briefly as I can. On the point about harmful Facebook groups, if a Facebook group is dedicated to breaking any of our rules, we can remove that group, even if no harmful content has been posted in it. I understand that was raised in the context of breadcrumbing, so trying to infer harmful intent from innocuous content. We have teams trying to understand how bad actors circumvent our rules, and to prevent them from doing that. That is a core part of our work, and a core part of what the Bill needs to incentivise us to do. That is why we have rules in place to remove groups that are dedicated to breaking our rules, even if no harmful content is actually posted in them.

On the question you asked about transparency, the Bill does an admirable job of trying to balance different types of transparency. There are some kinds of transparency that we believe are meaningful and valid to give to users. I gave the example a moment ago of explaining why a piece of content was removed and which of our community standards it broke. There is other transparency that we think is best given in a more general sense. We have our transparency report, as I said, where we give the figures for how much content we remove, how much of it we find ourselves—

Barbara Keeley

Q I am not talking here about general figures for what you have removed. I am talking about giving real access to the data on the risks of harm and the measures to mitigate harm. You could make those reports available to academics—we could find a way of doing that—and that would be very valuable. Surely what we want to do is to generate communities, including academics and people who have the aim of improving things, but you need to give them access to the data. You are the only ones who have access to the data, so it will just be you and Ofcom. A greater community out there who can help to improve things will not have that access.

Richard Earley: I completely agree. Apologies for hogging more time, but I think you have hit on an important point there, which is about sharing information with researchers. Last year, we gave data to support the publishing of more than 400 independent research projects, carried out along the lines you have described here. Just yesterday, we announced an expansion of what is called our Facebook open research tool, which expands academics’ ability to access data about advertising.

Barbara Keeley

Q My question is, will you publish the risk assessment and the measures you are taking to mitigate?

Richard Earley: Going back to how the Bill works, when it comes to—

Barbara Keeley

No, I am not just asking about the Bill. Will you do that?

Richard Earley: We have not seen the Ofcom guidance on what those risk assessments should contain yet, so it is not possible to say. I think more transparency should always be the goal. If we can publish more information, we will do so.

Barbara Keeley

Q It would be good to have that goal. Can I come to you, Katie O’Donovan?

Katie O'Donovan: To begin with, I would pick up on the importance of transparency. We at Google and YouTube publish many reports on a quarterly or annual basis to help understand the actions we are taking. That ranges from everything on YouTube, where we publish by country the content we have taken down, why we have taken it down, how it was detected and the number of appeals. That is incredibly important information. It is good for researchers and others to have access to that.

We also do things around ads that we have removed and legal requests from different foreign Governments, which again has real validity. I think it is really important that Ofcom will have access to how we work through this—

Barbara Keeley

Q I was not just asking about Ofcom; I was wanting to go further than that and have wider access.

Katie O'Donovan: I do not want to gloss over the Ofcom point; I want to dwell on it for a second. In anticipation of this Bill, we were able to have conversations with Ofcom about how we work, the risks that we see and how our systems detect that. Hopefully, that is very helpful for Ofcom to understand how it will audit and regulate us, but it also informs how we need to think and improve our systems. I do think that is important.

We make a huge amount of training data available at Google. We publish a lot of shared APIs to help people understand what our data is doing. We are very open to publishing and working with academics.

It is difficult to give a broad statement without knowing the detail of what that data is. One thing I would say—it always sounds a bit glib when people in my position say this—is that, in some cases, we do need to be limited in explaining exactly how our systems work to detect bad content. On YouTube, you have very clear community guidelines, which we know we have to publish, because people have a right to know what content is allowed and what is not, but we will find people who go right up to the line of that content very deliberately and carefully—they understand that, almost from a legal perspective. When it comes to fraudulent services and our ads, we have also seen people pivot the way that they attempt to defraud us. There need to be some safe spaces to share that information. Ofcom is helpful for that too.

The Chair

Okay. Kim Leadbeater, one very quick question. We must move on—I am sorry.

Kim Leadbeater (Batley and Spen) (Lab)

Q Okay, I will try to be very quick. The draft Bill contained a proposed new media literacy duty. That seems to have now disappeared. What are your digital media literacy strategies?

Becky Foreman: We have a range of strategies. One thing I would point to is research that we conduct every year and have done for a number of years called the digital civility index. It is a set of research that speaks to teens and adults in a number of countries around the world to understand what harms they are concerned about online and to ascertain whether those harms are increasing or decreasing and how they vary between different geographies. That is one way in which we are trying to make more data and information available to the general public about the type of harms they might come across online and whether they are increasing or decreasing.

Richard Earley: We have a range of different organisations that we work with in the UK and internationally. One that I would like to draw attention to is the Economist Educational Foundation’s Burnet News Club. We have supported them to increase their funding, with the aim of reaching 10% of all state schools with a really immersive and impressive programme that enables young people to understand digital literacy, digital numeracy and the media. We are also members of the media literacy taskforce of the Department for Digital, Culture, Media and Sport at the moment, which has been working to build on the strategy that the Government published.

Overall, there is a really important role for us as platforms to play here. We regularly commission and start new programmes in this space. What is also really important is to have more guidance from Government and civil society organisations that we work with on what is effective, so that we can know where we can put our resources and boost the greatest work.

Katie O'Donovan: Thank you for the question. It is really important. We were disappointed to see the literacy focus lost in the Bill.

We really take the issue seriously. We know there is an absolute responsibility for us when it comes to product, and an absolute responsibility when it comes to policy. Even within the safest products and with the most impressive and on-it parents, people can be exposed to content in ways that are surprising and shocking. That is why you need this holistic approach. We have long invested in a programme that we run with the non-governmental organisation Parent Zone called “Be internet legends”. When we developed that, we did it with the PSHE Association to make sure it was totally compliant with the national curriculum. We regularly review that to check that it is actually making a difference. We did some recent research with MORI and got some really good results back.

We used to deliver that programme face to face in schools up and down the country. Obviously, the pandemic stopped that. We went online and while we did not enjoy it quite as much, we were able to reach real scale and it was really effective. Along with doing the assemblies, which are now back in person, we deliver a pack for teachers so they can also take that up at scale. We run similar programmes through YouTube with teenagers. It is absolutely incumbent on us to do more, but it must be part of the debate, because if you rely just on technological solutions, you will end up reducing access to lawful information, with some of the harms still being prevalent and people not having the skills to navigate them.

The Chair

I am sorry, but I must move on. Minister, I am afraid you only have five minutes.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Q Welcome to the Committee’s proceedings and thank you for joining us this afternoon. I would like to start on the question of the algorithmic promotion of content. Last week, I met with the Facebook whistleblower, Frances Haugen, who spoke in detail about what she had found when working for Facebook, so I will start with you, Richard. On the question of transparency, which other Members of the Committee have touched on, would you have any objection to sharing all the information you hold internally with trusted researchers?

Richard Earley: What information are you referring to?

Chris Philp

Data, in particular on the operation of algorithmic promotion of particular kinds of content.

Richard Earley: We already do things like that through the direct opportunity that anyone has to see why a single post has been chosen for them in their feed. You can click on the three dots next to any post and see that. For researcher access and support, as I mentioned, we have contributed to the publishing of more than 400 reports over the last year, and we want to do more of that. In fact, the Bill requires Ofcom to conduct a report on how to unlock those sorts of barriers, which we think should be done as soon as possible. Yes, in general we support that sort of research.

I would like to say one thing, though. I have worked at Facebook—now Meta—for almost five years, and nobody at Facebook has any obligation, any moral incentive, to do anything other than provide people with the best, most positive experience on our platform, because we know that if we do not give people a positive experience, through algorithms or anything else, they will leave our platform and will not use it. They tell us that and they do it, and the advertisers who pay for our services do not want to see that harmful content on our platforms either. All of our incentives are aligned with yours, which are to ensure that our users have a safe and positive experience on our platforms.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Yet the algorithms that select particular content for promotion are optimised for user engagement—views, likes and shares—because that increases user stickiness and keeps them on the site for longer. The evidence seems to suggest that, despite what people say in response to the surveys you have just referenced, what they actually interact with the most—or what a particular proportion of the population chooses to interact with the most—is content that would be considered in some way extreme, divisive, and so on, and that the algorithms, which are optimised for user engagement, notice that and therefore uprank that content. Do you accept that your algorithms are optimised for user engagement?

Richard Earley: I am afraid to say that that is not correct. We have multiple algorithms on our services. Many of them, in fact, do the opposite of what you have just described: they identify posts that might be violent, misleading or harmful and reduce the prevalence of them within our feed products, our recommendation services and other parts of the service.

We optimise the algorithm that shows people things for something called meaningful social interaction. That is not just pure engagement; in fact, its focus—we made a large change to our algorithms in 2018 to focus on this—is on the kinds of activities online that research shows are correlated with positive wellbeing outcomes. Joining a group in your local area or deciding to go to an event that was started by one of your friends—that is what our algorithms are designed to promote. In fact, when we made that switch in 2018, we saw a decrease of more than 50 million hours of Facebook use every day as a result of that change. That is not the action of a company that is just focused on maximising engagement; it is a company that is focused on giving our users a positive experience on our platform.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q You have alluded to some elements of the algorithmic landscape, but do you accept that the dominant feature of the algorithm that determines which content is most promoted is based on user engagement, and that the things you have described are essentially second-order modifications to that?

Richard Earley: No, because as I just said, when we set the algorithm this instruction to focus on social interaction it actually decreased the amount of time people spent on our platform.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q It might have decreased it, but the meaningful social interaction score is based not exclusively, as you said, but principally on user engagement, isn’t it?

Richard Earley: As I said, it is about ensuring that people who spend time on our platform come away feeling that they have had a positive experience.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q That does not quite answer the question.

Richard Earley: I think that a really valuable part of the Bill that we are here to discuss is the fact that Ofcom will be required, and we in our risk assessments will be required, to consider the impact on the experience of our users of multiple different algorithms, of which we have hundreds. We build those algorithms to ensure that we reduce the prevalence of harmful content and give people the power to connect with those around them and build community. That is what we look forward to demonstrating to Ofcom when this legislation is in place.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Yes, but in her testimony to, I think, the Joint Committee and the US Senate, in a document that she released to The Wall Street Journal, and in our conversation last week, Frances Haugen suggested that the culture inside Facebook, now Meta, is that measures that tend to reduce user engagement do not get a very sympathetic hearing internally. However, I think we are about to run out of time. I have one other question, which I will direct, again, to Richard. Forgive me, Katie and Becky, but it is probably most relevant for Meta.

None Portrait The Chair
- Hansard -

Q Just one moment, please. Is there anything that the other witnesses need to say about this before we move on? It will have to be very brief.

Katie O'Donovan: I welcome the opportunity to address the Committee. It is so important that this Bill has parliamentary scrutiny. It is a Bill that the DCMS has spent a lot of time on, getting it right and looking at the systems and the frameworks. However, it will lead to a fundamentally different internet for UK users versus the rest of the world. It is one of the most complicated Bills we are seeing anywhere in the world. I realise that it is very important to have scrutiny of us as platforms to determine what we are doing, but I think it is really important to also look at the substance of the Bill. If we have time, I would welcome the chance to give a little feedback on the substance of the Bill too.

Becky Foreman: I would add that the Committee spent a lot of time talking to Meta, who are obviously a big focus for the Bill, but it is important to remember that there are numerous other networks and services that potentially will be caught by the Bill and that are very different from Meta. It is important to remember that.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

While the Bill is proportionate in its measures, it is not designed to impose undue burdens on companies that are not high risk. I have one more question for Richard. I think Katie was saying that she wanted to make a statement?

None Portrait The Chair
- Hansard -

We are out of time. I am sorry about this; I regard it as woefully unsatisfactory. We have three witnesses here, a lot of questions that need to be answered, and not enough time to do it. However, we have a raft of witnesses coming in for the rest of the day, so I am going to have to draw a line under this now. I am very grateful to you for taking the trouble to come—the Committee is indebted to you. You must have the opportunity to make your case. Would you be kind enough to put any comments that you wish to make in writing so that the Committee can have them? Feel free to go as broad as you would like, because I feel very strongly that you have been short-changed this afternoon. We are indebted to you. Thank you very much indeed.

Richard Earley: We will certainly do that and look forward to providing comments in writing.

Examination of Witnesses

Professor Clare McGlynn, Jessica Eagelton and Janaya Walker gave evidence.

14:48
None Portrait The Chair
- Hansard -

Good afternoon. We now hear oral evidence from Professor Clare McGlynn, professor of law at Durham University, Jessica Eagelton, policy and public affairs manager at Refuge, and Janaya Walker, public affairs manager at End Violence Against Women. Ladies, thank you very much for taking the trouble to join us this afternoon. We look forward to hearing from you.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Thank you, Sir Roger, and thank you to the witnesses for joining us. We hear a lot about the negative experiences online of women, particularly women of colour. If violence against women and girls is not mentioned directly in the Bill, if misogyny is not made a priority harm, and if the violence against women and girls code of practice is not adopted in the Bill, what will that mean for the experience of women and girls?

Janaya Walker: Thank you for the opportunity to speak today. As you have addressed there, the real consensus among violence against women and girls organisations is for VAWG to be named in the Bill. The concern is that without that, the requirements that are placed on providers of regulated services will be very narrowly tied to the priority illegal content in schedule 7, as well as other illegal content.

We are very clear that violence against women and girls is part of a continuum in which there is a really broad manifestation of behaviour; some reaches a criminal threshold, but there is other behaviour that is important to be understood as part of the wider context. Much of the abuse that women and girls face cannot be understood by only looking through a criminal lens. We have to think about the relationship between the sender and the recipient—if it is an ex-partner, for example—the severity of the abuse they have experienced, the previous history and also the reach of the content. The worry is that the outcome of the Bill will be a missed opportunity in terms of addressing something that the Government have repeatedly committed to as a priority.

As you mentioned, we have worked with Refuge, Clare McGlynn, the NSPCC and 5Rights, bringing together our expertise to produce this full code of practice, which we think the Bill should be amended to include. The code of practice would introduce a cross-cutting duty that tries to mitigate this kind of pocketing of violence against women and girls into those three categories, to ensure that it is addressed really comprehensively.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q To what extent do you think that the provisions on anonymity will assist in reducing online violence against women and girls? Will the provisions currently in the Bill make a difference?

Janaya Walker: I think it will be limited. For the End Violence Against Women Coalition, our priority above all else is having a systems-based approach. Prevention really needs to be at the heart of the Bill. We need to think about the choices that platforms make in the design and operation of their services in order to prevent violence against women and girls in the first instance.

Anonymity has a place in the sense of providing users with agency, particularly in a context where a person is in danger and they could take that step in order to mitigate harm. There is a worry, though, when we look at things through an intersectional lens—thinking about how violence against women and girls intersects with other forms of harm, such as racism and homophobia. Lots of marginalised and minoritised people rely very heavily on being able to participate online anonymously, so we do not want to create a two-tier system whereby some people’s safety is contingent on them being a verified user, which is one option available. We would like the focus to be much more on prevention in the first instance.

None Portrait The Chair
- Hansard -

Professor McGlynn and Ms Eagelton, you must feel free to come in if you wish to.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q My final question is probably directed at you, Professor McGlynn. Although we welcome the new communications offence of cyber-flashing, one of the criticisms is that it will not actually make a difference because of the onus on proving intent to cause harm, rather than the sender providing consent to receive the material. How do you respond to that?

Professor Clare McGlynn: I think it is great that the Government have recognised the harms of cyber-flashing and put that into the Bill. In the last couple of weeks we have had the case of Gaia Pope, a teenager who went missing and died—an inquest is currently taking place in Dorset. The case has raised the issue of the harms of cyber-flashing, because in the days before she went missing she was sent indecent images that triggered post-traumatic stress disorder from a previous rape. On the day she went missing, her aunt was trying to report that to the police, and one of the police officers was reported as saying that she was “taking the piss”.

What I think that case highlights, interestingly, is that this girl was triggered by receiving these images, and it triggered a lot of adverse consequences. We do not know why that man sent her those images, and I guess my question would be: does it actually matter why he sent them? Unfortunately, the Bill says that why he sent them does matter, despite the harm it caused, because it would only be a criminal offence if it could be proved that he sent them with the intention of causing distress or for sexual gratification and being reckless about causing distress.

That has two main consequences. First, it is not comprehensive, so it does not cover all cases of cyber-flashing. The real risk is that a woman, having seen the headlines and heard the rhetoric about cyber-flashing being criminalised, might go to report it to the police but will then be told, “Actually, your case of cyber-flashing isn’t criminal. Sorry.” That might just undermine women’s confidence in the criminal justice system even further.

Secondly, this threshold of having to prove the intention to cause distress is an evidential threshold, so even if you think, as might well be the case, that he sent the image to cause distress, you need the evidence to prove it. We know from the offence of non-consensual sending of sexual images that it is that threshold that limits prosecutions, but we are repeating that mistake here with this offence. So I think a consent-based, comprehensive, straightforward offence would send a stronger message and be a better message from which education could then take place.

None Portrait The Chair
- Hansard -

You are nodding, Ms Eagelton.

Jessica Eagelton: I agree with Professor McGlynn. Thinking about the broader landscape and intimate image abuse as well, I think there are some significant gaps. There is quite a piecemeal approach at the moment and issues that we are seeing in terms of enforcing measures on domestic abuse as well.

Maria Miller Portrait Mrs Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

Q Thank you to all the panellists; it is incredibly helpful to have you here. The strength of the Bill will really be underpinned by the strength of the criminal law that underpins it, and schedule 7 lists offences that relate to sexual images, including revenge pornography, as priority offences. Can the witnesses say whether they think the law is sufficient to protect women from having their intimate pictures shared without their consent, or indeed whether the Bill will do anything to prevent the making and sharing of deepfake images? What would you like to see?

Professor Clare McGlynn: You make a very good point about how, in essence, criminal offences are now going to play a key part in the obligations of platforms under this Bill. In general, historically, the criminal law has not been a friend to women and girls. The criminal law was not written, designed or interpreted with women’s harms in mind. That means that you have a very piecemeal, confusing, out-of-date criminal law, particularly as regards online abuse, yet that is the basis on which we have to go forward. That is an unfortunate place for us to be, but I think we can strengthen it.

We could strengthen schedule 7 by, for example, including trafficking offences. There are tens of thousands of cases of trafficking, as we know from yourselves and whistleblowers, that platforms could be doing so much more about, but that is not a priority offence. The Obscene Publications Act distribution of unlawful images offence is not included. That means that incest porn, for example, is not a priority offence; it could be if we put obscene publications in that provision. Cyber-flashing, which again companies could take a lot of steps to act against, is not listed as a priority offence. Blackmail—sexual extortion, which has risen exponentially during the pandemic—again is not listed as a priority offence.

Deepfake pornography is a rising phenomenon. It is not an offence in English law to distribute deepfake pornography at the moment. That could be a very straightforward, simple change in the Bill. Only a few words are needed. It is very straightforward to make that a criminal offence, thanks to Scots law, where it is actually an offence to distribute altered images. The way the Bill is structured means the platforms will have to go by the highest standard, so in relation to deepfake porn, it would be interpreted as a priority harm—assuming that schedule 7 is actually altered to include all the Scottish offences, and the Northern Irish ones, which are absent at the moment.

The deepfake example points to a wider problem with the criminal law on online abuse: the laws vary considerably across the jurisdictions. There are very different laws on down-blousing, deepfake porn, intimate image abuse, extreme pornography, across all the different jurisdictions, so among the hundreds of lawyers the platforms are appointing, I hope they are appointing some Scots criminal lawyers, because that is where the highest standard tends to be.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q Would the other panellists like to comment on this?

Jessica Eagelton: I think something that will particularly help in this instance is having that broad code of practice; that is a really important addition that must be made to the Bill. Refuge is the largest specialist provider of gender-based violence services in the country. We have a specialist tech abuse team who specialise in technology-facilitated domestic abuse, and what they have seen is that, pretty consistently, survivors are being let down by the platforms. They wait weeks and weeks for responses—months sometimes—if they get a response at all, and the reporting systems are just not up to scratch.

I think it will help to have the broad code of practice that Janaya mentioned. We collaborated with others to produce a workable example of what that could look like, for Ofcom to hopefully take as a starting point if it is mandated in the Bill. That sets out steps to improve the victim journey through content reporting, for example. Hopefully, via the code of practice, a victim of deepfakes and other forms of intimate image abuse would be able to have a more streamlined, better response from platforms.

I would also like to say, just touching on the point about schedule 7, that from the point of view of domestic abuse, there is another significant gap in that: controlling and coercive behaviour is not listed, but it should be. Controlling and coercive behaviour is one of the most common forms of domestic abuse. It carries serious risk; it is one of the key aggravating factors for domestic homicide, and we are seeing countless examples of that online, so we think that is another gap in schedule 7.

None Portrait The Chair
- Hansard -

Ms Walker?

Janaya Walker: Some of these discussions almost reiterate what I was saying earlier about the problematic nature of this, in that so much of what companies are going to be directed to do will be tied only to the specific schedule 7 offences. There have been lots of discussions about how you respond to some harms that reach a threshold of criminality and others that do not, but that really contrasts with the best practice approach to addressing violence against women and girls, which is really trying to understand the context and all of the ways that it manifests. There is a real worry among violence against women and girls organisations about the minimal response to content that is harmful to adults and children, but will not require taking such a rigorous approach.

Having the definition of violence against women and girls on the face of the Bill allows us to retain those expectations on providers as technology changes and new forms of abuse emerge, because the definition is there. It is VAWG as a whole that we are expecting the companies to address, rather than a changing list of offences that may or may not be captured in criminal law.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q Why is it important that we have this? Is this a big thing? What are you guys actually seeing here?

Jessica Eagelton: I can respond to that in terms of what we are seeing as a provider. Technology-facilitated domestic abuse is an increasing form of domestic abuse: technology is providing perpetrators with increasing ways to abuse and harass survivors. What we are seeing on social media is constant abuse, harassment, intimate image abuse, monitoring and hacking of accounts, but when it comes to the responses we are getting from platforms at the moment, while I acknowledge that there is some good practice, the majority experience of survivors is that platforms are not responding sufficiently to the tech abuse they are experiencing.

Our concern is that the Bill could be a really good opportunity for survivors of domestic abuse to have greater protections online that would mean that they are not forced to come offline. At the moment, some of the options being given to survivors are to block the perpetrator—which in some cases has a minimal impact when they can easily set up new fake accounts—or to come offline completely. First, that is not a solution to that person being able to maintain contact, stay online and take part in public debate. But secondly, it can actually escalate risk in some cases, because a perpetrator could resort to in-person forms of abuse. If we do not make some of these changes—I am thinking in particular about mandating a VAWG code of practice, and looking at schedule 7 and including controlling and coercive behaviour—the Bill is going to be a missed opportunity. Women and survivors have been waiting long enough, and we need to take this opportunity.

Janaya Walker: If I could add to that, as Jessica has highlighted, there is the direct harm to survivors in terms of the really distressing experience of being exposed to these forms of harm, or the harm they experience offline being exacerbated online, but this is also about indirect harm. We need to think about the ways in which the choices that companies are making are having an impact on the extent to which violence against women and girls is allowed to flourish.

As Jessica said, it impacts our ability to participate in online discourse, because we often see a mirroring online of what happens offline, in the sense that the onus is often on women to take responsibility for keeping themselves safe. That is the status quo we see offline, in terms of the decisions we make about what we are told to wear or where we should go as a response to violence against women and girls. Similarly, online, the onus is often on us to come offline or put our profiles on private, to take all those actions, or to follow up with complaints to various different companies that are not taking action. There is also something about the wider impact on society as a whole by not addressing this within the Bill.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q How does the proposed code of practice—or, I suppose, how could the Bill—tackle intersectionality of harms?

Janaya Walker: This is a really important question. We often highlight the fact that, as I have said, violence against women and girls often intersects with other forms of discrimination. For example, we know from research that EVAW conducted with Glitch during the pandemic that black and minoritised women and non-binary people experience a higher proportion of abuse. Similarly, research done by Amnesty International shows that black women experience harassment at a rate 84% higher than that experienced by their white counterparts. It is a real focal point. When we think about the abuse experienced, we see the ways that people’s identities are impacted and how structural discrimination emerges online.

What we have done with the code of practice is try to introduce requirements for the companies to think about things through that lens, so having an overarching human rights and equalities framework and having the Equality Act protected characteristics named as a minimum. We see in the Bill quite vague language when it comes to intersectionality; it talks about people being members of a certain group. We do not have confidence that these companies, which are not famed for their diversity, will interpret that in a way that we regard as robust—thinking very clearly about protected characteristics, human rights and equalities legislation. The vagueness in the Bill is quite concerning. The code of practice is an attempt to be more directive on what we want to see and how to think through issues in a way that considers all survivors, all women and girls.

Professor Clare McGlynn: I wholly agree. The code of practice is one way by which we can explain in detail those sorts of intersecting harms and what companies and platforms should do, but I think it is vital that we also write it into the Bill. For example, on the definitions around certain characteristics and certain groups, in previous iterations reference was made to protected characteristics. I know certain groups can go wider than that, but naming those protected characteristics is really important, so that they are front and centre and the platforms know that that is exactly what they have to cover. That will cover all the bases and ensure that that happens.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I have a quite specific question on something that is a bit tangential.

None Portrait The Chair
- Hansard -

Last one, please.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q If someone has consented to take part in pornography and they later change their mind and would like it to be taken down, do you think they should have the right to ask a porn website, for example, to take it down?

Professor Clare McGlynn: That is quite challenging not only for pornography platforms but for sex workers, in that if you could participate in pornography but at any time thereafter withdraw your consent, it is difficult to understand how a pornography company and the sex worker would be able to make a significant amount of money. The company would be reluctant to invest because it might have to withdraw the material at any time. In my view, that is quite a challenge. I would not go down that route, because what it highlights is that the industry can be exploitative and that is where the concern comes from. I think there are other ways to deal with an exploitative porn industry and other ways to ensure that the material online has the full consent of participants. You could put some of those provisions into the Bill—for example, making the porn companies verify the age and consent of those who are participating in the videos for them to be uploaded. I think that is a better way to deal with that, and it would ensure that sex workers themselves can still contract to perform in porn and sustain their way of life.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Thank you very much—this is extremely interesting and helpful. You have covered a lot of ground already, but I wonder whether there is anything specific you think the Bill should be doing more about, to protect girls—under-18s or under-16s—in particular?

Janaya Walker: A lot of what we have discussed in terms of naming violence against women and girls on the face of the Bill includes children. We know that four in five offences of sexual communications with a child involved girls, and a lot of child abuse material is targeted at girls specifically. The Bill as a whole takes a very gender-neutral approach, which we do not think is helpful; in fact, we think it is quite harmful to trying to reduce the harm that girls face online.

This goes against the approach taken in the Home Office violence against women and girls strategy and its domestic abuse plan, as well as the gold-standard treaties the UK has signed up to, such as the Istanbul convention, which we signed and have recently committed to ratifying. The convention states explicitly that domestic laws, including on violence against women and girls online, need to take a very gendered approach. Currently, it is almost implied, with references to specific characteristics. We think that in addressing the abuse that girls, specifically, experience, we need to name girls. To clarify, the words “women”, “girls”, “gender” and “sex” do not appear in the Bill, and that is a problem.

Jessica Eagelton: May I add a point that is slightly broader than your question? Another thing that the Bill does not do at the moment is provide for specialist victim support for girls who are experiencing online abuse. There has been some discussion about taking a “polluter pays” approach; where platforms are not compliant with the duties, for example, a percentage of the funds that go to the regulator could go towards victim support services, such as the revenge porn helpline and Refuge’s tech abuse team, that provide support to victims of abuse later on.

Professor Clare McGlynn: I can speak to pornography. Do you want to cover that separately, or shall I do that now?

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

That is fine.

Professor Clare McGlynn: I know that there was a discussion this morning about age assurance, which obviously targets children’s access to pornography. I would emphasise that age assurance is not a panacea for the problems with pornography. We are so worried about age assurance only because of the content that is available online. The pornography industry is quite happy with age verification measures. It is a win-win for them: they get public credibility by saying they will adopt it; they can monetise it, because they are going to get more data—especially if they are encouraged to develop age verification measures, which of course they have been; that really is putting the fox in charge of the henhouse—and they know that it will be easily evaded.

One of the most recent surveys of young people in the UK was of 16 and 17-year-olds: 50% of them had used a VPN, which avoids age verification controls, and 25% more knew about that, so 75% of those older children knew how to evade age assurance. This is why the companies are quite happy—they are going to make money. It will stop some people stumbling across it, but it will not stop most older children accessing pornography. We need to focus on the content, and when we do that, we have to go beyond age assurance.

You have just heard Google talking about how it takes safety very seriously. Rape porn and incest porn are one click away on Google. They are freely and easily accessible. There are swathes of that material on Google. Twitter is hiding in plain sight, too. I know that you had a discussion about Twitter this morning. I, like many, thought, “Yes, I know there is porn on Twitter,” but I must confess that until doing some prep over the last few weeks, I did not know the nature of that porn. For example, “Kidnapped in the wood”; “Daddy’s little girl comes home from school; let’s now cheer her up”; “Raped behind the bin”—this is the material that is on Twitter. We know there is a problem with Pornhub, but this is what is on Twitter as well.

As the Minister mentioned this morning, Twitter says you have to be 13, and you have to be 18 to try to access much of this content, but you just put in whatever date of birth is necessary—it is that easy—and you can get all this material. It is freely and easily accessible. Those companies are hiding in plain sight in that sense. The age verification and age assurance provisions, and the safety duties, need to be toughened up.

To an extent, I think this will come down to the regulator. Is the regulator going to accept Google’s SafeSearch as satisfying the safety duties? I am not convinced, because of the easy accessibility of the rape and incest porn I have just talked about. I emphasise that incest porn is not classed as extreme pornography, so it is not a priority offence, but there are swathes of that material on Pornhub as well. In one of the studies that I did, we found that one in eight titles on the mainstream pornography sites described sexually violent material, and the incest material was the highest category in that. There is a lot of that around.

Barbara Keeley

Q We are talking here about pornography when it is hosted on mainstream websites, as opposed to pornographic websites. Could I ask you to confirm what more, specifically, you think the Bill should do to tackle pornography on mainstream websites, as you have just been describing with Twitter? What should the Bill be doing here?

Professor Clare McGlynn: In many ways, it is going to be up to the regulator. Is the regulator going to deem that things such as SafeSearch, or Twitter’s current rules about sensitive information—which rely on the host to identify their material as sensitive—satisfy their obligations to minimise and mitigate the risk? That is, in essence, what it will all come down to.

Are they going to take the terms and conditions of Twitter, for example, at face value? Twitter’s terms and conditions do say that they do not want sexually violent material on there, and they even say that it is because they know it glorifies violence against women and girls, but this material is there and does not appear to get swiftly and easily taken down. Even when you try to block it—I tried to block some cartoon child sexual abuse images, which are easily available on there; you do not have to search for them very hard, it literally comes up when you search for porn—it brings you up five or six other options in case you want to report them as well, so you are viewing them as well. Just on the cartoon child sexual abuse images, before anyone asks, they are very clever, because they are just under the radar of what is actually a prohibited offence.

It is not necessarily that there is more that the Bill itself could do, although the code of practice would ensure that they have to think about these things more. They have to report on their transparency and their risk assessments: for example, what type of content are they taking down? Who is making the reports, and how many are they upholding? But it is then on the regulator as to what they are going to accept as acceptable, frankly.

Barbara Keeley

Do any other panellists want to add to that?

Janaya Walker: Just to draw together the questions about pornography and the question you asked about children, I wanted to highlight one of the things that came up earlier, which was the importance of media literacy. We share the view that that has been rolled back from earlier versions of the draft Bill.

There has also been a shift, in that the emphasis of the draft Bill was also talking about the impact of harm. That is really important when we are talking about violence against women and girls, and what is happening in the context of schools and relationship and sex education. Where some of these things like non-consensual image sharing take place, the Bill as currently drafted talks about media literacy and safe use of the service, rather than the impact of such material and really trying to point to the collective responsibility that everyone has as good digital citizens—in the language of Glitch—in terms of talking about online violence against women and girls. That is an area in which the Bill could be strengthened from the way it is currently drafted.

Jessica Eagelton: I completely agree with the media literacy point. In general, we see very low awareness of what tech abuse is. We surveyed some survivors and did some research last year—a public survey—and almost half of survivors told no one about the abuse they experienced online at the hands of their partner or former partner, and many of the survivors we interviewed did not understand what it was until they had come to Refuge and we had provided them with support. There is an aspect of that to the broader media literacy point as well: increasing awareness of what is and is not unacceptable behaviour online, and encouraging members of the public to report that and call it out when they see it.

Barbara Keeley

Q Thank you. Can I ask for a bit more detail on a question that you touched on earlier with my colleague Kirsty Blackman? It is to Professor McGlynn, really. I think you included in your written evidence to the Committee a point about using age and consent verification for pornography sites for people featured in the content of the site—not the age verification assurance checks on the sites, but for the content. Could I just draw out from you whether that is feasible, and would it be retrospective for all videos, or just new ones? How would that work?

Professor Clare McGlynn: Inevitably, it would have to work from any time that that requirement was put in place, in reality. That measure is being discussed in the Canadian Parliament at the moment—you might know that Pornhub’s parent company, MindGeek, is based in Canada, which is why they are doing a lot of work in that regard. The provision was also put forward by the European Parliament in its debates on the Digital Services Act. Of course, any of these measures are possible; we could put it into the Bill that that will be a requirement.

Another way of doing it, of course, would be for the regulator to say that one of the ways in which Pornhub, for example—or XVideos or xHamster—should ensure that they are fulfilling their safety duties is by ensuring the age and consent of those for whom videos are uploaded. The flipside of that is that we could also introduce an offence for uploading a video and falsely representing that the person in the video had given their consent to that. That would mirror offences in the Fraud Act 2006.

The idea is really about introducing some element of friction so that there is a break before images are uploaded. For example, with intimate image abuse, which we have already talked about, the revenge porn helpline reports that for over half of the cases of such abuse that it deals with, the images go on to porn websites. So those aspects are really important. It is not just about all porn videos; it is also about trying to reduce the distribution of non-consensual videos.

Nick Fletcher (Don Valley) (Con)

Q I think that it would have been better to hear from you three before we heard from the platforms this morning. Unfortunately, you have opened my eyes to a few things that I wish I did not have to know about—I think we all feel the same.

I am concerned about VPNs. Will the Bill stop anyone accessing through VPNs? Is there anything we can do about that? I googled “VPNs” to find out what they were, and apparently there is a genuine need for them when using public networks, because it is safer. Costa Coffee suggests that people do so, for example. I do not know how we could work that.

You have obviously educated me, and probably some of my colleagues, about some of the sites that are available. I do not mix in circles where I would be exposed to that, but obviously children and young people do and there is no filter. If I did know about those things, I would probably not speak to my colleagues about it, because that would probably not be a good thing to do, but younger people might think it is quite funny to talk about. Do you think there is an education piece there for schools and parents? Should these platforms be saying to them, “Look, this is out there, even though you might not have heard of it—some MPs have not heard of it.” We ought to be doing something to protect children by telling parents what to look out for. Could there be something in the Bill to force them to do that? Do you think that would be a good idea? There is an awful lot there to answer—sorry.

Professor Clare McGlynn: On VPNs, I guess it is like so much technology: obviously it can be used for good, but it can also be used to evade regulations. My understanding is that individuals will be able to use a VPN to avoid age verification. On that point, I emphasise that in recent years Pornhub, at the same time as it was talking to the Government about developing age verification, was developing its own VPN app. At the same time it was saying, “Of course we will comply with your age verification rules.”

Don’t get me wrong: the age assurance provisions are important, because they will stop people stumbling across material, which is particularly important for the very youngest. In reality, 75% know about VPNs now, but once it becomes more widely known that this is how to evade it, I expect that all younger people will know how to do so. I do not think there is anything else you can do in the Bill, because you are not going to outlaw VPNs, for the reasons you identified—they are actually really important in some ways.

That is why the focus needs to be on content, because that is what we are actually concerned about. When you talk about media literacy and understanding, you are absolutely right, because we need to do more to educate all people, including young people—it does not just stop at age 18—about the nature of the pornography and the impact it can have. I guess that goes to the point about media literacy as well. It does also go to the point about fully and expertly resourcing sex and relationships education in school. Pornhub has its own sex education arm, but it is not the sex education arm that I think many of us would want to be encouraging. We need to be doing more in that regard.

Nick Fletcher

Q This might sound like a silly question. Can we not just put age verification on VPN sites, so that you can only have VPN access if you have gone through age verification? Do you understand what I am saying?

Professor Clare McGlynn: I do. We are beginning to reach the limits of my technical knowledge.

Nick Fletcher

You have gone beyond mine anyway.

Professor Clare McGlynn: You might be able to do that through regulations on your phone. If you have a phone that is age-protected, you might not be able to download a particular VPN app, perhaps. Maybe you could do that, but people would find ways to evade that requirement as well. We have to tackle the content. That is why you need to tackle Google and Twitter as well as the likes of Pornhub.

Nick Fletcher

Can we have them back in, Sir Roger?

The Chair

Minister?

Chris Philp

Q Thank you, Sir Roger, and thank you to the witnesses for coming in and giving very clear, helpful and powerful evidence to the Committee this afternoon. On the question of age verification or age assurance that we have just spoken about, clause 11(14) of the Bill sets a standard in the legislation that will be translated into the codes of practice by Ofcom. It says that, for the purposes of the subsection before on whether or not children can access a particular set of content, a platform is

“only entitled to conclude that it is not possible for children to access a service…if there are systems or processes in place…that achieve the result that children are not normally able to access the service”.

Ofcom will then interpret in codes of practice what that means practically. Professor McGlynn, do you think that standard set out there—

“the result that children are not normally able to access the service or that part of it”

—is sufficiently high to address the concerns we have been discussing in the last few minutes?

Professor Clare McGlynn: At the moment, the wording with regard to age assurance in part 5—the pornography providers—is slightly different, compared with the other safety duties. That is one technicality that could be amended. As for whether the provision you just talked about is sufficient, in truth I think it comes down, in the end, to exactly what is required, and of course we do not yet know what the nature of the age verification or age assurance requirements will actually be and what that will actually mean.

I do not know what that will actually mean for something like Twitter. What will they have to do to change it? In principle, that terminology is possibly sufficient, but it kind of depends in practice what it actually means in terms of those codes of practice. We do not yet know what it means, because all we have in the Bill is about age assurance or age verification.

Chris Philp

Q Yes, you are quite right that the Ofcom codes of practice will be important. As far as I can see, the difference between clauses 68 and 11(14) is that one uses the word “access” and the other uses the word “encounter”. Is that your analysis of the difference as well?

Professor Clare McGlynn: My understanding as well is that those terms are, at the moment, being interpreted slightly differently in terms of the requirements that people will be under. I am just making a point about it probably being easier to harmonise those terms.

Chris Philp

Q Thank you very much. I wanted to ask you a different question—one that has not come up so far in this session but has been raised quite frequently in the media. It concerns freedom of speech. This is probably for Professor McGlynn again. I am asking you this in your capacity as a professor of law. Some commentators have suggested that the Bill will have an adverse impact on freedom of speech. I do not agree with that. I have written an article in The Times today making that case, but what is your expert legal analysis of that question?

Professor Clare McGlynn: I read your piece in The Times this morning, which was a robust defence of the legislation, in that it said that it is no threat to freedom of speech, but I hope you read my quote tweet, in which I emphasised that there is a strong case to be made for regulation to free the speech of many others, including women and girls and other marginalised people. For example, the current lack of regulation means that women’s freedom of speech is restricted because we fear going online because of the abuse we might encounter. Regulation frees speech, while your Bill does not unduly limit freedom of speech.

Chris Philp

Q Okay, I take your second point, but did you agree with the point that the Bill as crafted does not restrict what you would ordinarily consider to be free speech?

Professor Clare McGlynn: There are many ways in which speech is regulated. The social media companies already make choices about what speech is online and offline. There are strengths in the Bill, such as the ability to challenge when material is taken offline, because that can impact on women and girls as well. They might want to put forward a story about their experiences of abuse, for example. If that gets taken down, they will want to raise a complaint and have it swiftly dealt with, not just left in an inbox.

There are lots of ways in which speech is regulated, and the idea of having a binary choice between free speech and no free speech is inappropriate. Free speech is always regulated, and it is about how we choose to regulate it. I would keep making the point that the speech of women and girls and other marginalised people is minimised at the moment, so we need regulation to free it. The House of Lords and various other reports about free speech and regulation, for example, around extreme pornography, talk about regulation as being human-rights-enhancing. That is the approach we need to take.

The Chair

Thank you very much indeed. Once again, I am afraid I have to draw the session to a close, and once again we have probably not covered all the ground we would have liked. Professor McGlynn, Ms Walker, Ms Eagelton, thank you very much indeed. As always, if you have further thoughts or comments, please put them in writing and let us know. We are indebted to you.

Examination of Witnesses

Lulu Freemont, Ian Stevenson and Adam Hildreth gave evidence.

15:32
The Chair

We will now hear oral evidence from Lulu Freemont, head of digital regulation at techUK; Ian Stevenson, the chairman of OSTIA; and Adam Hildreth, chief executive officer of Crisp, who is appearing by Zoom—and it works. Thank you all for joining us. I will not waste further time by asking you to identify yourselves, because I have effectively done that for you. Without further ado, I call Alex Davies-Jones.

Alex Davies-Jones

Q Thank you, Sir Roger; thank you, witnesses. We want the UK to become a world leader in tech start-ups. We want those employment opportunities for the future. Does this legislation, as it currently stands, threaten that ability?

Lulu Freemont: Hi everybody. Thank you so much for inviting techUK to give evidence today. Just to give a small intro to techUK, so that you know the perspective I am coming from, we are the trade body for the tech sector. We have roughly 850 tech companies in our membership, the majority of which are small and medium-sized enterprises. We are really focused on how this regime will work for the 25,000 tech companies that are set to be in scope, and our approach is really on the implementation and how the Bill can deliver on the objectives.

Thank you so much for the question. There are some definite risks when we think about smaller businesses and the Online Safety Bill. Today, we have heard a lot of the names that come up with regard to tech companies; they are the larger companies. However, this will be a regime that impacts thousands of different tech companies, with different functionalities and different roles within the ecosystem, all of which contribute to the economy in their own way.

There are specific areas to be addressed in the Bill, where there are some threats to innovation and investment by smaller businesses. First, greater clarity is needed. In order for this regime to be workable for smaller businesses, they need clarity on guidelines and on definitions, and they also need to be confident that the systems and processes that they put in place will be sustainable—in other words, the right ones.

Certain parts of the regime risk not having enough clarity. The first thing that I will point to is around the definitions of harm. We would very much welcome having some definitions of harmful content, or even categories of harmful content, in primary legislation. It might then be for Ofcom to determine how those definitions are interpreted within the codes, but having things to work off and types of harmful content for smaller businesses to start thinking about would be useful; obviously, that will be towards children, given that they are likely to be category 2.

The second risk for smaller businesses is really around the powers of the Secretary of State. I think there is a real concern. The Secretary of State will have some technical powers, which are pretty much normal; they are what you would expect in any form of regulation. However, the Online Safety Bill goes a bit further than that, introducing some amendment powers. So, the Secretary of State can modify codes of practice to align with public policy. In addition to that, there are provisions to allow the Secretary of State to set thresholds between the categories of companies.

Smaller businesses want to start forming a strong relationship with Ofcom and putting systems and processes in place that they can feel confident in. If they do not have that level of confidence and if the regime could be changed at any point, they might not be able to progress with those systems and processes, and when it comes to kind of pushing them out of the market, they might not be able to keep up with some of the larger companies that have been very much referenced in every conversation.

So, we need to think about proportionality, and we need to think about Ofcom’s independence and the kind of relationship that it can form with smaller businesses. We also need to think about balance. This regime is looking to strike a balance between safety, free speech and innovation in the UK’s digital economy. Let us just ensure that we provide enough clarity for businesses so that they can get going and have confidence in what they are doing.

Alex Davies-Jones

Q Thank you, Lulu. Adam and Ian, if either of you want to come in at any point, please just indicate that and I will bring you in.

The Chair

May I just apologise before we go any further, because I got you both the wrong way round? I am sorry. It is Mr Stevenson who is online and it is Adam Hildreth who is here in body and person.

Adam Hildreth: I think we have evolved as a world actually, when it comes to online safety. I think that if you went back five or 10 years, safety would have come after your people had developed their app, their platform or whatever they were creating from a tech perspective. I think we are now in a world where safety, in various forms, has to be there by default. And moving on to your point, we have to understand what that means for different sizes of businesses. The risk assessment word or phrase for me is the critical part there, which is putting blocks in front of people who are innovating and creating entrepreneurial businesses that make the online world a better place. Putting those blocks in without them understanding whether they can compete or not in an open and fair market is where we do not want to be.

So, getting to the point where it is very easy to understand is important—a bit like where we got to in other areas, such as data protection and where we went with the GDPR. In the end, it became simplified; I will not use the word “simplified” ever again in relation to GDPR, but it did become simplified from where it started. It is really important for anyone developing any type of tech platform that the Online Safety Bill will affect that they understand exactly what they do and do not have to put in place; otherwise, they will be taken out just by not having a legal understanding of what is required.

The other point to add, though, is that there is a whole other side to online safety, which is the online safety tech industry. There are tons of companies in the UK and worldwide that are developing innovative technologies that solve these problems. So, there is a positive as well as an understanding of how the Bill needs to be created and publicised, so that people understand what the boundaries are, if you are a UK business.

The Chair

Mr Stevenson, you are nodding. Do you want to come in?

Ian Stevenson: I agree with the contributions from both Adam and Lulu. For me, one of the strengths of the Bill in terms of the opportunity for innovators is that so much is left to Ofcom to provide codes of practice and so on in the future, but simultaneously that is its weakness in the short term. In the absence of those codes of practice and definitions of exactly where the boundaries between merely undesirable and actually harmful and actionable might lie, the situation is very difficult. It is very difficult for companies like my own and the other members of the Online Safety Tech Industry Association, who are trying to produce technology to support safer experiences online, to know exactly what that technology should do until we know which harms are in scope and exactly what the thresholds are and what the definitions of those harms are. Similarly, it is very hard for anybody building a service to know what technologies, processes and procedures they will need until they have considerably more detailed information than they have at the moment.

I agree that there are certain benefits to having more of that in the Bill, especially when it comes to the harms, but in terms of the aspiration and of what I hear is the objective of the Bill—creating safer online experiences—we really need to understand when we are going to have much more clarity and detail from Ofcom and any other relevant party about exactly what is going to be seen as best practice and acceptable practice, so that people can put in place those measures on their sites and companies in the Online Safety Tech Industry Association can build the tools to help support putting those measures in place.

Alex Davies-Jones

Q Thank you all. Lulu, you mentioned concerns about the Secretary of State’s powers and Ofcom’s independence. Other concerns expressed about Ofcom include its ability to carry out this regulation. It is being hailed as the saviour of the internet by some people. Twenty-five thousand tech companies in the UK will be under these Ofcom regulations, but questions have been asked about its technical and administrative capacity to do this. Just today, there is an online safety regulator funding policy adviser role being advertised by the Department for Digital, Culture, Media and Sport. Part of the key roles and responsibilities are:

“The successful post holder will play a key role in online safety as the policy advisor on Funding for the Online Safety Regulator.”

Basically, their job is to raise money for Ofcom. Does that suggest concerns about the role of Ofcom going forward, its funding, and its resource and capacity to support those 25,000 platforms?

Lulu Freemont: It is a very interesting question. We really support Ofcom in this role. We think that it has a very good track record with other industries that are also in techUK’s membership, such as broadcasters. It has done a very good job at implementing proportionate regulation. We know that it has been increasing its capacity for some time now, and we feel confident that it is working with us as the trade and with a range of other experts to try to understand some of the detail that it will have to understand to regulate.

One of the biggest challenges—we have had this conversation with Ofcom as well—is to understand the functionalities of tech services. The same functionality might be used in a different context, and that functionality could be branded as very high risk in one context but very low risk in another. We are having those conversations now. It is very important that they are being had now, and we would very much welcome Ofcom publishing drafts. We know that is its intention, but it should bring everything forward in terms of all the gaps in this regulation that are left to Ofcom’s codes, guidance and various other documentation.

Adam Hildreth: One of the challenges that I hear a lot, and that we hear a lot at Crisp in our work, is that people think that the Bill will almost eradicate all harmful content everywhere. The challenge that we have with content is that every time we create a new technology or mechanism that defeats harmful or illegal content, the people who are creating it—they are referred to in lots of ways, but bad actors, ultimately—create another mechanism to do it. It is very unlikely that we will ever get to a situation in which it is eradicated from every platform forever—though I hope we do.

What is even harder for a regulator is to be investigating why a piece of content is on a platform. If we get to a position where people are saying, “I saw this bit of content; it was on a platform,” that will be a really dangerous place to be, because the funding requirement for any regulator will go off the charts—think about how much content we consume. I would much prefer to be in a situation where we think about the processes and procedures that a platform puts in place and making them appropriate, ensuring that if features are aimed at children, they do a risk assessment so that they understand how those features are being used and how they could affect children in particular—or they might have a much more diverse user group, whereby harm is much less likely.

So, risk assessments and, as Ian mentioned, technologies, processes and procedures—that is the bit that a regulator can do well. If your risk assessment is good and your technology, process and procedures are as good as they can be based on a risk assessment, that almost should mean that you are doing the best job you possibly can to stop that content appearing, but you are not eradicating it. It really worries me that we are in a position whereby people are going to expect that they will never see content on a platform again, even though billions of pieces of potentially harmful content could have been removed from those platforms.

Alex Davies-Jones

Q On that point, you mentioned that it is hard to predict the future and to regulate on the basis of what is already there. We have waited a long time for the Bill, and in that time we have had new platforms and new emerging technology appear. How confident are you that the Bill allows for future-proofing, in order that we can react to anything new that might crop up on the internet?

Adam Hildreth: I helped personally in 2000 and 2001, when online grooming did not even exist as a law, so I have been involved in this an awful long time, waiting for laws to exist. I do not think we will ever be in a situation in which they are future-proofed if we keep putting every possibility into law. There needs to be some principles there. There are new features launched every day, and assessments need to be made about who they pose a risk to and the level of risk. In the same way as you would do in all kinds of industries, someone should do an assessment from a health and safety perspective. From that, you then say, “Can we even launch it at all? Is it feasible? Actually, we can, because we can take this amount of risk.” Once they understand those risk assessments, technology providers can go further and develop technology that can combat this.

If we can get to the point where it is more about process and the expectations around people who are creating any types of online environments, apps or technologies, it will be future-proofed. If we start trying to determine exact pieces of content, what will happen is that someone will work out a way around it tomorrow, and that content will not be included in the Bill, or it will take too long to get through and suddenly, the whole principle of why we are here and why we are having this discussion will go out the window. That is what we have faced every day since 1998: every time the technology works out how to combat a new risk—whether that is to children, adults, the economy or society—someone comes along and works out a way around the technology or around the rules and regulations. It needs to move quickly; that will future-proof it.

The Chair

I have four Members plus the Minister to get in, so please be brief. I call Dean Russell.

Dean Russell

Q Thank you, Sir Roger. My question builds on the future-proofing. Obviously, the big focus now is the metaverse and a virtual reality world. My question has two parts. First, is the Bill helping already by encouraging the new start-ups in that space to put safety first? Secondly, do you agree that a Joint Committee of the Houses of Parliament that continued to look at the Act and its evolution over the long term once it had been passed would be beneficial? I will come to you first, Lulu.

Lulu Freemont: On future-proofing, one of the real strengths of the Bill is the approach: it is striving to rely on systems and processes, to be flexible and to adapt to future technologies. If the Bill sticks to that approach, it will have the potential to be future-proof. Some points in the Bill raise a slight concern about the future-proofing of the regulation. There is a risk that mandating specific technologies—I know that is one of Ofcom’s powers under the Bill—would put a bit of a timestamp on the regulation, because those technologies will likely become outdated at some point. Ensuring that the regulation remains flexible enough to build on the levels of risk that individual companies have, and on the technologies that work for the development and innovation of those individual companies, will be a really important feature, so we do have some concerns around the mandating of specific technologies in the Bill.

On the point about setting up a committee, one of the things for which techUK has called for a really long time is an independent committee that could think about the current definitions of harm and keep them under review. As companies put in place systems and processes that might mitigate levels of risk of harm, will those levels of harm still be harmful? We need to constantly evolve the regime so that it is true to the harms and risks that are present today, and to evaluate it against human rights implications. Having some sort of democratically led body to think about those definitional points and evaluate them as times change and harm reduces through this regime would be very welcome.

Adam Hildreth: To add to that, are people starting to think differently? Yes, they definitely are. That ultimately, for me, is the purpose of the Bill. It is to get people to start thinking about putting safety as a core principle of what they do as an overall business—not just in the development of their products, but as the overall business. I think that will change things.

A lot of the innovation that comes means that safety is not there as the principal guiding aspect, so businesses do need some help. Once they understand how a particular feature can be exploited, or how it impacts certain demographics or particular age groups—children being one of them—they will look for solutions. A lot of the time, they have no idea before they create this amazing new metaverse, or this new metaverse game, that it could actually be a container for harmful content or new types of harm. I think this is about getting people to think. The risk assessment side is critical, for me—making sure they go through that process or can bring on experts to do that.

Ian Stevenson: I would split the future-proofing question into two parts. There is a part where this Bill will provide Ofcom with a set of powers, and the question will be: does Ofcom have the capacity and agility to keep up with the rate of change in the tech world? Assuming it does, it will be able to act fairly quickly. There is always a risk, however, that once a code of conduct gets issued, it becomes very difficult to update that code of conduct in a responsive way.

There is then a second piece, which is: are the organisations that are in scope of regulation, and the powers that Ofcom has, sufficient as things change? That is where the idea of a long-term committee to keep an eye on this is extremely helpful. That would be most successful if it did not compromise Ofcom’s independence by digging deeply into individual codes of conduct or recommendations, but rather focused on whether Ofcom has the powers and capacity that it needs to regulate as new types of company, platform and technology come along.

Dean Russell

Thank you.

Kirsty Blackman

Q My first question is for Lulu. Do small tech companies have enough staff with technical expertise to be able to fulfil their obligations under the Bill?

Lulu Freemont: It is a great question. One of the biggest challenges is capacity. We hear quite a lot from the smaller tech businesses within our membership that they will have to divert their staff away from existing work to comply with the regime. They do not have compliance teams, and they probably do not have legal counsel. Even at this stage, to try to understand the Bill as it is currently drafted—there are lots of gaps—they are coming to us and saying, “What does this mean in practice?” They do not have the answers, or the capability to identify that. Attendant regulatory costs—thinking about the staff that you have and the cost, and making sure the regulation is proportionate to the need to divert away from business development or whatever work you might be doing in your business—are really fundamental.

Another real risk, and something in the Bill that smaller businesses are quite concerned about, is the potential proposal to extend the senior management liability provisions. We can understand them being in there to enable the regulators to do their job—information requests—but if there is any extension into individual pieces of content, coupled with a real lack of definitions, those businesses might find themselves in the position of restricting access to their services, removing too much content or feeling like they cannot comply with the regime in a proportionate way. That is obviously a very extreme case study. It will be Ofcom’s role to make sure that those businesses are being proportionate and understand the provisions, but the senior management liability does have a real, chilling impact on the smaller businesses within our membership.

Adam Hildreth: One of the challenges that we have seen over the last few years is that you can have a business that is small in revenue but has a huge global user base, with millions of users, so it is not really a small business; it just has not got to the point where it is getting advertisers and getting users to pay for it. I have a challenge on the definition of a small to medium-sized business. Absolutely, for start-ups with four people in a room—or perhaps even still just two—that do not have legal counsel or anything else, we need to make it simple for those types of businesses to ingest and understand what the principles are and what is expected of them. Hopefully they will be able to do quite a lot early on.

The real challenge comes when someone labels themselves as a small business but they have millions of users across the globe—and sometimes actually quite a lot of people working for them. Some of the biggest tech businesses in the world that we all use had tens of people working for them at one point in time, when they had millions of users. That is the challenge, because there is an expectation for the big-tier providers to be spending an awful lot of money, when the small companies are actually directly competing with them. There is a challenge to understanding the definition of a small business and whether that is revenue-focused, employee-focused or about how many users it has—there may be other metrics.

Ian Stevenson: One of the key questions is how much staffing this will actually take. Every business in the UK that processes data is subject to GDPR from day one. Few of them have a dedicated data protection officer from day one; it is a role or responsibility that gets taken on by somebody within the organisation, or maybe somebody on the board who has some knowledge. That is facilitated by the fact that there are a really clear set of requirements there, and there are a lot of services you can buy and consume that help you deliver compliance. If we can get to a point where we have codes of practice that make very clear recommendations, then even small organisations that perhaps do not have that many staff to divert should be able to achieve some of the basic requirements of online safety by buying in the services and expertise that they need. We have seen with GDPR that many of those services are affordable to small business.

If we can get the clarity of what is required right, then the staff burden does not have to be that great, but we should all remember that the purpose of the Bill is to stop some of the egregiously bad things that happen to people as a result of harmful content, harmful behaviours and harmful contact online. Those things have a cost in the same way that implementing data privacy has a cost. To come back to Lulu’s point, it has to be proportionate to the business.

Mrs Miller

Q Adam, you said a few moments ago that companies are starting to put safety at the core of what they do, which will be welcome to us all—maybe it should have happened a lot earlier. I know you have worked a lot in that area. Regulators and company owners will have to depend on an ethical culture in their organisations if they are going to abide by the new regulations, because they cannot micromanage and regulators cannot micromanage. Will the Bill do enough to drive that ethical culture? If not, what more could it do or could the industry do? I would be really interested in everybody’s answer to this one, but I will start with Adam.

Adam Hildreth: What we are seeing from the people that are getting really good at this and that really understand it is that they are treating this as a proper risk assessment, at a very serious level, across the globe. When we are talking about tier 1s, they are global businesses. When they do it really well, they understand risk and how they are going to roll out systems, technology, processes and people in order to address that. That can take time. Yes, they understand the risk, who it is impacting and what they are going to do about it, but they still need to train people and develop processes and maybe buy or build technology to do it.

We are starting to see that work being done really well. It is done almost in the same way that you would risk assess anything else: corporate travel, health and safety in the workplace—anything. It should really become one of those pillars. All those areas I have just gone through are regulated. Once you have regulation there, it justifies why someone is doing a risk assessment, and you will get businesses and corporates going through that risk assessment process. We are seeing others that do not do the same level of risk assessment and they do not have that same buy-in.

Mrs Miller

Q Lulu, how do you drive a culture change?

Lulu Freemont: techUK’s membership is really broad. We have cyber and defence companies in our membership, and large platforms and telcos. We speak on behalf of the sector. We would say that there is a real commitment to safety and security.

To bring it back to regulation, the risk-based approach is very much the right one—one that we think has the potential to really deliver—but we have to think about the tech ecosystem and its diversity. Lots of techUK members are on the business-to-business side and are thinking about the role that they play in supporting the infrastructure for many of the platforms to operate. They are not entirely clear that they are exempt in the Bill. We understand that it is a very clear policy intention to exempt those businesses, but they do not have the level of legal clarity that they need to understand their role as access facilities within the tech ecosystem.

That is just one example of a part of the sector that you would not expect to be part of this culture change or regulation but which is being caught in it slightly as an unintended consequence of legal differences or misinterpretations. Coming from that wide-sector perspective, we think that we need clarity on those issues to understand the different functionalities, and each platform and service will be different in their approach to this stuff.

Mrs Miller

Q Ian, how do you drive a culture change in the sector?

Ian Stevenson: I think you have to look at the change you are trying to effect. For many people in the sector, there is a lack of awareness about what happens when the need to consider safety in building features is not put first. Even when you realise how many bad things can happen online, if you do not know what to do about it, you tend not to be able to do anything about it.

If we want to change culture—it is the same for individual organisations as for the sector as a whole—we have to educate people on what the problem is and give them the tools to feel empowered to do something about it. If you educate and empower people, you remove the barrier to change. In some places, an extremely ethical people-centric and safety-focused culture very naturally emerges, but in others, less so. That is precisely where making it a first-class citizen in terms of risk assessment for boards and management becomes so important. When people see management caring about things, that gets pushed out through the organisations.

Kim Leadbeater

Q In your view, what needs to be added or taken away from the Bill to help it achieve the Government’s aim of making the UK

“the safest place in the world to be online”?

Lulu Freemont: First, I want to outline that there are some strong parts in the Bill that the sector really supports. I think the majority of stakeholders would agree that the objectives are the right ones. The Bill tries to strike a balance between safety, free speech and encouraging innovation and investment in the UK’s digital economy. The approach—risk-based, systems-led and proportionate—is the right one for the 25,000 companies that are in scope. As it does not focus on individual pieces of content, it has the potential to be future-proof and to achieve longer-term outcomes.

The second area in the Bill that we think is strong is the prioritisation of illegal content. We very much welcome the clear definitions of illegal content on the face of the Bill, which are incredibly useful for businesses as they start to think about preparing for their risk assessment on illegal content. We really support Ofcom as the appropriate regulator.

There are some parts of the Bill that need specific focus and, potentially, amendments, to enable it to deliver on those objectives without unintended consequences. I have already mentioned a few of those areas. The first is defining harmful content in primary legislation. We can leave it to codes to identify the interpretations around that, but we need definitions of harmful content so that businesses can start to understand what they need to do.

Secondly, we need clarity that businesses will not be required to monitor every piece of content as a result of the Bill. General monitoring is prohibited in other regions, and we have concerns that the Online Safety Bill is drifting away from those norms. The challenges of general monitoring are well known: it encroaches on individual rights and could result in the over-removal of content. Again, we do not think that the intention is to require companies of all sizes to look at every piece of content on their site, but it might be one of the unintended consequences, so we would like an explicit prohibition of general monitoring on the face of the Bill.

We would like to remove the far-reaching amendment powers of the Secretary of State. We understand the need for technical powers, which are best practised within regulation, but taking those further so that the Secretary of State can amend the regime in such an extreme way to align with public policy is of real concern, particularly to smaller businesses looking to confidently put in place systems and processes. We would like some consideration of keeping senior management liability as it is. Extending that further is only going to increase the chilling impact that it is having and the environment it is creating within UK investment. The final area, which I have just spoken about, is clarifying the scope. The business-to-business companies in our membership need clarity that they are not in scope and for that intention to be made clear on the face of the Bill.

We really support the Bill. We think it has the potential to deliver. There are just a few key areas that need to be changed or amended slightly to provide businesses with clarity and reassurances that the policy intentions are being delivered on.

Adam Hildreth: To add to that—Lulu has covered absolutely everything, and I agree—the critical bit is not monitoring individual pieces of content. Once you have done your risk assessment and put in place your systems, processes, people and technology, that is what people are signing up for. They are not signing up for this end assessment where, because you find that one piece of harmful content exists, or maybe many, you have failed to abide by what you are really signing up to.

That is the worry from my perspective: that people do a full risk assessment, implement all the systems, put in place all the people, technology and processes that they need, do the best job they can and have understood what investment they are putting in, and someone comes along and makes a report to a regulator—Ofcom, in this sense—and says, “I found this piece of content there.” That may expose weaknesses, but the very best risk assessments are ongoing ones anyway, where you do not just put it away in a filing cabinet somewhere and say, “That’s done.” The definitions of online harms and harmful content change on a daily basis, even for the biggest social media platforms; they change all the time. There was talk earlier about child sexual abuse material that appears as cartoons, which would not necessarily be defined by certain legislation as illegal. Hopefully the legislation will catch up, but that is where that risk assessment needs to be made again, and policies may need to be changed and everything else. I just hope we do not get to the point where the individual monitoring of content, or content misses, is the goal of the Bill—that the approach taken to online safety is this overall one.

The Chair

Thank you. I call the Minister.

Chris Philp

Q Thank you, Sir Roger, and thank you very much indeed for joining us for this afternoon’s session. Adam, we almost met you in Leeds last October or November, but I think you were off with covid at the time.

Adam Hildreth: I had covid at the time, yes.

Chris Philp

Covid struck. I would like to ask Adam and Ian in particular about the opportunities provided by emerging and new technology to deliver the Bill’s objectives. I would like you both to give examples of where you think new tech can help deliver these safety duties. I ask you to comment particularly on what it might do on, first, age assurance—which we debated in our last session—and secondly, scanning for child sexual abuse images in an end-to-end encrypted environment. Adam, do you want to go first?

Adam Hildreth: Well, if Ian goes first, the second question would be great for him to answer, because we worked on it together.

Chris Philp

Fair enough. Ian?

Ian Stevenson: Yes, absolutely. The key thing to recognise is that there is a huge and growing cohort of companies, around the world but especially in the UK, that are working on technologies precisely to try to support those kinds of safety measures. Some of those have been supported directly by the UK Government, through the safety tech challenge fund, to explore what can be done around end-to-end encrypted messaging. I cannot speak for all the participants, but I know that many of them are members of the safety tech industry association.

Between us, we have demonstrated a number of different approaches. My own company, Cyacomb, demonstrated technology that could block known child abuse within encrypted messaging environments without compromising the privacy of users’ messages and communications. Other companies in the UK, including DragonflAI and Yoti, demonstrated solutions based on detecting nudity and looking at the ages of the people in those images, which are again hugely valuable in this space. Until we know exactly what the regulation is going to demand, we cannot say exactly what the right technology to solve it is.

However, I think that the fact that that challenge alone produced five different solutions looking at the problem from different angles shows just how vibrant the innovation ecosystem can be. My background in technology is long and mixed, but I have seen a number of sectors emerge—including cyber-security and fintech—where, once the foundations for change have been created, the ability of innovators to come up with answers to difficult questions is enormous. The capacity to do that is enormous.

There are a couple of potential barriers to that. The strength of the regulation is that it is future-proof. However, until we start answering the question, “What do we need to do and when? What will platforms need to do and when will they need to do it?” we do not really create in the commercial market the innovation drivers for the technical solutions that will deliver this. We do not create the drivers for investment. It is really important to be as specific as we can about what needs to be done and when.

The other potential barrier is regulation. We have already had a comment about how there should be a prohibition of general monitoring. We have seen what has happened in the EU recently over concerns about safety technologies that are somehow looking at traffic on services. We need to be really clear that, while safety technologies must protect privacy, there needs to be a mechanism so that companies can understand when they can deploy safety technologies. At the moment there are situations where we talk to potential customers for safety technologies and they are unclear as to whether it would be proportionate to deploy those under, for example, data protection law. There are areas, even within the safety tech challenge fund work on end-to-end encrypted messaging, where it was unclear whether some of the technologies—however brilliant they were at preventing child abuse in those encrypted environments—would be deployable under current data protection and privacy of electronic communications regulations.

There are questions there. We need to make sure that when the Online Safety Bill comes through, it makes clear what is required and how it fits together with other regulations to enable that. Innovators can do almost anything if you give them time and space. They need the certainty of knowing what is required, and an environment where solutions can be deployed and delivered.

Chris Philp

Q Ian, thank you very much. I am encouraged by your optimism about what innovation can ultimately deliver. Adam, let me turn to you.

Adam Hildreth: I agree with Ian that the level of innovation is amazing. If we start talking about age verification and end-to-end encryptions, for me—I am going to say that same risk assessment phrase again—it absolutely depends on the type of service, who is using the service and who is exploiting the service, as to which safety technologies should be employed. I think it is dangerous to say, “We are demanding this type of technology or this specific technology to be deployed in this type of instance,” because that removes the responsibility from the people who are creating it.

Chris Philp

Q Sorry to interject, but to be clear, the Bill does not do that. The Bill specifies the objectives, but it is tech agnostic. The manner of delivering those is, of course, not specified, either in the Bill or by Ofcom.

Adam Hildreth: Absolutely. Sorry, I was saying that I agree with how it has been worded. We know what is available, but technology changes all the time and solutions change all the time—we can do things in really innovative ways. However, the risk assessment has to bring together freedom of speech versus the types at risk of abuse. Is it children who are at risk, and if so, what are they at risk from? That changes the space massively when compared with some adult gaming communities, where what is harmful to them is very different from what harms other audiences. That should dictate for them what system and technology is deployed. Once we understand what best of breed looks like for those types of companies, we should know what good is.

Chris Philp

Q Thank you, Adam. We only have one minute left, so what is your prediction for the potential possibilities that emerging tech presents to deal with the issues of age assurance, which are difficult, and CSEA scanning, given end-to-end encrypted environments?

Adam Hildreth: The technology is there. It exists and it is absolutely deployable in the environments that need it. I am sure Ian would agree; we have seen it and done a lot of testing on it. The technology exists in the environments that need it.

Chris Philp

Q Including inside the end-to-end encrypted environment, rather than just at the device level? Quite a few of the safety challenge solutions that Ian mentioned are at the device level; they are not inside the encryption.

Adam Hildreth: There are ways that can work. Again, it brings in freedom of expression, global businesses and some other areas, so it is more about regulation and consumer concerns about the security of data, rather than whether technological solutions are available.

The Chair

Ms Freemont, Mr Hildreth and Mr Stevenson, thank you all very much indeed. We have run out of time. As ever, if you have any further observations that you wish to make, please put them in writing and let the Committee have them; we shall welcome them. Thank you for your time this afternoon. We are very grateful to you.

Examination of Witnesses

Jared Sine, Nima Elmi and Dr Rachel O’Connell gave evidence.

16:16
The Chair

We are now going to hear from Jared Sine, who is the chief business affairs and legal officer at Match Group, and Nima Elmi, the head of public policy in Europe at Bumble, who is appearing by Zoom. Thank you for joining us. I hope you can hear us all right. Wave if you can.

Nima Elmi indicated assent.

The Chair

We also have Dr Rachel O’Connell, who is the CEO of TrustElevate. Good afternoon.

Barbara Keeley

Q Does the Bill differentiate enough between services that have different business models? If not, what do you think are the consequences of the lack of differentiation, and where could more differentiation be introduced? Shall we start with you, Jared Sine?

Jared Sine: Sure—thank you for the question. Business models play a pretty distinct role in the incentives of the companies. When we talk to people about Match Group and online dating, we try to point out a couple of really important things that differentiate what we do in the dating space from what many technology companies are doing in the social media space. One of those things is how we generate our revenue. The overwhelming majority of it is subscription-based, so we are focused not on time on platform or time on device, but on whether you are having a great experience, because if you are, you are going to come back and pay again, or you are going to continue your subscription with us. That is a really big differentiator, in terms of the business model and where incentives lie, because we want to make sure they have a great experience.

Secondly, we know we are helping people meet in real life. Again, if people are to have a great experience on our platforms, they are going to have to feel safe on them, so that becomes a really big focus for us.

Finally, we are more of a one-to-one platform, so people are not generally communicating to large groups, so that protects us from a lot of the other issues you see on some of these larger platforms. Ultimately, what that means is that, for our business to be successful, we really have to focus on safety. We have to make sure users come, have a good, safe experience, and we have to have tools for them to use and put in place to empower themselves so that they can be safe and have a great experience. Otherwise, they will not come back and tell their friends.

The last thing about our platforms is that ultimately, if they are successful, our users leave them because they are engaged in a relationship, get married or just decide they are done with dating all together—that happens on occasion, too. Ultimately, our goal is to make sure that people have that experience, so safety becomes a core part of what we do. Other platforms are more focused on eyeballs, advertising sales and attention—if it bleeds, it leads—but those things are just not part of the equation for us.

Barbara Keeley

Q And do you think the Bill differentiates enough? If not, what more could be done in it?

Jared Sine: We are very encouraged by the Bill. We think it allows for different codes of conduct or policy, as it relates to the various different types of businesses, based on the business models. That is exciting for us because we think that ultimately those things need to be taken into account. What are the drivers and the incentives in place for those businesses? Let us make sure that we have regulations in place that address those needs, based on the approaches of the businesses.

The Chair

Nima, would you like go next?

Nima Elmi: Thank you very much for inviting me along to this discussion. Building on what Jared said, currently the Bill is not very clear in terms of references to categorisations of services. It clusters together a number of very disparate platforms that have different platform designs, business models and corporate aims. Similarly to Match Group, our platform is focused much more on one-to-one communications and subscription-based business models. There is an important need for the Bill to acknowledge these different types of platforms and how they engage with users, and to ensure appropriate guidance from Ofcom on how they should be categorised, rather than clustering together a rather significant number of companies that have very different business aims in this space.

The Chair

Dr O’Connell, would you like to answer?

Dr Rachel O'Connell: Absolutely. I think those are really good points that you guys have raised. I would urge a little bit of caution around that though, because I think about Yellow, which was the Tinder for teens, which has been rebranded as Yubo. It transgresses: it is a social media platform; it enables livestreaming of teens to connect with each other; it is ultimately for dating. So there is a huge amount of risk. It is not a subscription-based service.

I get the industry drive to say, “Let’s differentiate and let’s have clarity”, but in a Bill, essentially the principles are supposed to be there. Then it is for the regulator, in my view, to say, at a granular level, that when you conduct a risk impact assessment, you understand whether the company has a subscription-based business model, so the risk is lower, and also if there is age checking to make sure those users are 18-plus. However, you must also consider that there are teen dating sites, which would definitely fall under the scope of this Bill and the provisions that it is trying to make to protect kids and to reduce the risk of harm.

While I think there is a need for clarity, I would urge caution. For the Bill to have some longevity, being that specific about the categorisations will have some potential unintended consequences, particularly as it relates to children and young people.

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q The next question is really about age verification, which you have touched on, so let us start with you, Dr O’Connell. What do you think the Bill should contain to enable age verification or the age assurance needed to protect children online?

Dr Rachel O'Connell: There is a mention of age assurance in the Bill. There is an opportunity to clarify that a little further, and also to bring age verification services under the remit of the Bill, given the services they provide, to make sure that they are mitigating risk. There was a very clear outline by Elizabeth Denham when we were negotiating the Digital Economy Act in relation to age verification and adult content sites; she was very specific when she came to Committee and said it should be a third party conducting the checks. If you want to preserve privacy and security, it should be a third-party provider that runs the checks, rather than companies saying, “You know what? We’ll track everybody for the purposes of age verification.”

There needs to be a clear delineation, which clause 50 currently does not provide. I would recommend that that be looked at again and that some digital identity experts be brought into that discussion, so that there is a full appreciation of the issues. Currently, there is a lot of latitude for companies to develop their own services in-house for age verification, without, I think, a proper risk assessment of what that might mean for end users in terms of eroding their privacy.

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q TikTok were talking to us earlier about their age verification. If companies do it themselves rather than it being a third party, where does that fall down?

Dr Rachel O'Connell: That means you have to track and analyse people’s activities, and you are garnering a huge amount of data. If you are then handling people under the age of 13, under the Data Protection Act you must obtain parental consent prior to processing their data, so by definition you have to gather data from parents. I have been working in this space for 25 years. I remember, in 2008, when the Attorneys General brought all the companies together to consider age verification as part of the Internet Safety Technical Task Force, the arguments of industry—I was in industry at the time—were that it would be overly burdensome and a privacy risk. Looking back through history, industry has said that it does not want to do that. Now, there is an incentive to do it in-house, because you do not have to pay a third party to do it, but what are the consequences for the erosion of privacy and so on?

I urge people to think carefully about that, in particular when it comes to children. It would require tracking children’s activities over time. We do not want our kids growing up in a surveillance society where they are being monitored like that from the get-go. The advantage of a third-party provider is that they can have a zero data model. They can run the checks without holding the data, so you are not creating a data lake. The parent or child provides information that can be hashed on the device and checked against data sources that are hashed, which means there is no knowledge. It is a zero data model.

The information resides on the user’s device, which is pretty cool. The checks are done, but there is no exposure and no potential for man-in-the-middle attacks. The company then gets a token that says, “This person is over 18”, or “This person is below 12; we have verified parental responsibility, and that verified parent has given consent.” You are dealing with tokens that do not contain any personal information, which is a far better approach than companies developing things in-house.

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q I think the TikTok example was looking at materials and videos and seeing whether they mention school or birthdays as a way of verifying age. As you say, that does involve scanning the child’s data.

None Portrait The Chair
- Hansard -

Q Can I see if Ms Elmi wants to come in? She tends to get left out on a limb, on the screen. Are you okay down there? Do you need to come in on this, or are you happy?

Nima Elmi: Yes, I am. I have nothing to add.

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q Jared Sine, did you have anything to add?

Jared Sine: Sure. I would add a couple of thoughts. We run our own age verification scans, which we do through the traditional age gate but also through a number of other checks.

Again, online dating platforms are a little different. We warn our users upfront that, as they are going to be meeting people in real life, there is a fine balance between safety and privacy, and we tend to lean a little more towards safety. We announce to our users that we are going to run message scans to make sure there is no inappropriate behaviour. In fact, among the tools we have rolled out are “Are you sure?” and “Does this bother you?”, through which our AI looks at the message a user is planning to send and, if it is an inappropriate message, a flag will pop up that says, “Are you sure you want to send this?” Then, if they go ahead and send it, the person receiving it at the other end will get a pop-up that says, “This may not be something you want to see. Go ahead and click here if you want to.” If they open it, they then get another pop-up that asks, “Does this bother you?” and, if it does, they can report the user immediately.

We think that is an important step to keep our platform safe. We make sure our users know that it is happening, so it is not under the table. However, we think there has to be a balance between safety and privacy, especially when we have users who are meeting in person. We have actually demonstrated on our platforms that this reduces harassment and behaviour that would otherwise be untoward or that you would not want on the platform.

We think that we have to be careful not to tie the hands of industry to be able to come up with technological solutions and advances that can work side by side with third-party tools and solutions. We have third-party ID verification tools that we use. If we identify or believe a user is under the age of 18, we push them through an ID verification process.

The other thing to remember, particularly as it relates to online dating, is that companies such as ours and Bumble have done the right thing by saying “18-plus only on our platforms”. There is no law that says that an online dating platform has to be 18-plus, but we think it is the right thing to do. I am a father of five kids; I would not want kids on my platform. We are very vigilant in taking steps to make sure we are using the latest and greatest tools available to try to make sure that our platforms are safe.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q Rachel, we have, in you, what we are told is a leading, pre-eminent authority on the issue of age verification, so we are listening very carefully to what you say. I am thinking about the evidence we had earlier today, which said that it is reasonably straightforward for a large majority of young people to subvert age verification through the use of VPNs. You have been advocating third-party verification. How could we also deal with this issue of subverting the process through the use of VPNs?

Dr Rachel O'Connell: I am the author of the technical standard PAS 1296, an age-checking code of practice, which is becoming a global standard at the moment. We worked a lot with privacy, security and identity experts. It should have taken nine months, but it took a bit longer; a lot of thought went into it. Those systems were developed, as I just described, to ensure a zero data, zero knowledge kind of model. They enable those verifications to take place and reduce the data requirement. There is a distinction between monitoring your systems for age verification purposes, as was said earlier, and abuse management; they are very different. You have to have abuse management systems. It is like saying that if you have a nightclub, you have to have bouncers at the door. You cannot let people into the venue and only afterwards spot bad behaviour; you have to check at the door that they are the appropriate age to get into the venue.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q Can they not just hop on a VPN and bypass the whole system anyway?

Dr Rachel O'Connell: I think you guys will be aware of the DCMS programme of work about the verification of children last year. As part of that, there was a piece of research that asked children what they would think about age verification. The predominant thing that came across from young children is that they are really tired of having to deal with weirdos and pervs. It is an everyday occurrence for them.

To just deviate slightly to the business model, my PhD is in forensics and tracking paedophile activity on the internet way back in the ’90s. At that time, guys would have to look for kids. Nowadays, on TikTok and various livestream platforms, the algorithms recognise that an individual—a man, for example—is very interested in looking at content produced by kids. The algorithms see that a couple of times and go, “You don’t have to look anymore. We are going to seamlessly connect you with kids who livestream. We are also going to connect you with other men that like looking at this stuff.”

If you are on these livestream sites at 3 o’clock in the morning, you can see these kids who are having sleepovers or something. They put their phone down to record whatever the latest TikTok dance is, and they think that they are broadcasting to other kids. You would assume that, but what they then hear is the little pops of love hearts coming on to the screen and guys’ voices saying, “Hey sweetie, you look really cute. Lick your lips. Spread your legs.” You know where I am going with this.

The Online Safety Bill should look at the systems and processes that underpin these platforms, because there is gamification of kids. Kids want to become influencers—maybe become really famous. They see the views counter and think, “Wow, there are 200 people looking at us.” Those people are often men, who will co-ordinate their activities at the back. They will push the boys a little bit further, and if a girl is on her own, they will see. If the child does not respond to the request, they will drop off. The kid will think, “Oh my God. Well, maybe I should do it this one time.”

What we have seen is a quadrupling of child sexual abuse material online that has been termed “self-generated”, because the individual offender has not actually produced it. From a psychological perspective, it is a really bad name, but that is a separate topic. Imagine if that was your kid who had been coerced into something that had then been labelled as “self-generated”. The business models that underpin those processes that happen online are certainly something that should really be within scope.

We do not spend enough time thinking about the implications of the use of recommendation engines and so on. I think the idea of the VPN is a bit of a red herring. Children want safety. They do not want to have to deal with this sort of stuff online. There are other elements. If you were a child and felt that you might be a little bit fat, you could go on YouTube and see whether you could diet or something. The algorithms will pick that up also. There is a tsunami of dieting and thinspiration stuff. There is psychological harm to children as a result of the systems and processes that these companies operate.

There was research into age verification solutions and trials run with BT. Basically, the feedback from both parents and children was, “Why doesn’t this exist already?”. If you go into your native EE app where it says, “Manage my family” and put in your first name, last name and mobile number and your child’s first name, last name and date of birth, it is then verified that you are their parent. When the child goes on Instagram or TikTok, they put in their first and last name. The only additional data point is the parent’s mobile number. The parent gets a notification and they say yes or no to access.

There are solutions out there. As others have mentioned, the young people want them and the parents want them. Will people try to work around them? That can happen, but if it is a parent-initiated process or a child-initiated process, you have the means to know the age bands of the users. From a business perspective, it makes a lot of sense because you can have a granular approach to the offerings you give to each of your customers in different age bands.

Nima Elmi: Just to add to what Rachel has said, I think she has articulated extremely well the complexities of the issues around not only age verification, but business models. Ultimately, this is such a complex matter that it requires continued consultation across industry, experts and civil society to identify pragmatic recommendations for industry when it comes to not only verifying the age of their users, but thinking about the nuanced differences between platforms, purposes, functionality and business models, and what that means.

In the context of the work we do here at Bumble, we are clear about our guidelines requiring people to be 18-plus to download our products from app stores, as well as ensuring that we have robust moderation processes to identify and remove under-18s from our platforms. There is an opportunity here for the Bill to go further in providing clarity and guidance on the issue of accessibility of children to services.

Many others have said over the course of today’s evidence that there needs to be a bit more colour put into definitions, particularly when certain sections of the Bill refer to what constitutes a “significant number of users” for determining child accessibility to platforms. Given that age verification or assurance is a complex area in and of itself, and given the nuance between how a social media platform may engage with it versus a dating or social networking platform, I think that more guidance is very much needed and a much more nuanced approach would be welcome.

None Portrait The Chair
- Hansard -

I have three Members and the Minister to get in before 5 o’clock, so I urge brief questions and answers please.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q Is it technically possible—I do not need to know how—to verify the age of children who are under 16, for example?

Dr Rachel O'Connell: Yes.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q So technology exists out there for that to happen.

Dr Rachel O'Connell: Yes.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q Once we have the verification of those ages, do you think it would be possible or desirable to limit children’s interactions to only with other children? Is that the direction you were going in?

Dr Rachel O'Connell: I will give an example. If you go to an amusement park, kids who are below four feet, for example, cannot get on the adult rides; the equivalent would be that they should not be on an 18-plus dating site. The service can be configured at a granular level so that kids can interact with kids in the same age group or a little bit older, but they can also interact with family. You can create circles of trust among verified people.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q For a game like Roblox, which is aimed at kids—it is a kids platform—if you had the age verification and it worked, you could have a situation where a 13-year-old on Roblox could interact only with children who are between 12 and 14. Does the technology exist to make that work?

Dr Rachel O'Connell: You could do. Then if you were using it in esports or there was a competition, you could broaden it out. The service can set the parameters, and you can involve the parents in making decisions around what age bands their child can play with. Also, kids are really into esports and that is their future, so there are different circumstances and contexts that the technology could enable.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q Finally, do you think it would be desirable for Ofcom to consider a system with more consistency in parental controls, so that parents can always ensure that their children cannot talk to anybody outside their circle? Would that be helpful?

Dr Rachel O'Connell: There is a history of parental controls, and only 36% of parents use them. Ofcom research consistently says that it is 70%, but in reality, it is lower. With age verification, parents are relieved of the need to watch everything; it is the platform that is providing the digital playground. In the same way, when you go on swings and slides, there is bouncy tarmac because you know the kids are going to use them. It is like creating that health and safety environment in a digital playground.

When parents receive a notification that their child wants to access something, there could be a colour-coded nutrition-style thing for social media, livestreaming and so on, and the parents could make an informed choice. It is then up to the platform to maintain that digital playground and run those kinds of detection systems to see if there are any bad actors in there. That is better than parental controls because the parent is consenting and it is the responsibility of the platform to create the safer environment. It is not the responsibility of the parent to look over the child’s shoulder 24/7 when they are online.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q The age verification stuff is really interesting, so thank you to our witnesses. On violence against women and girls, clauses 150 to 155 set out three new communications offences. Do you think those offences will protect women from receiving offensive comments, trolling and threats online? What will the Bill mean for changing the way you manage those risks on your platforms?

Jared Sine: I do not know the specific provisions but I am familiar with the general concept of them. Any time you put something in law, it can either be criminalised or have enforcement behind it, and I think that helps. Ultimately, it will be up to the platforms to come up with innovative technologies or systems such as “Are You Sure?” and “Does This Bother You?” which say that although the law says x, we are going to go beyond that to find tools and systems that make it happen on our platform. Although I think it is clearly a benefit to have those types of provisions in law, it will really come down to the platforms taking those extra steps in the future. We work with our own advisory council, which includes the founder of the #MeToo movement, REIGN and others, who advise us on how to make platforms safer for those things. That is where the real bread gets buttered, so to speak.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Do you think that is consistent across your industry? It sounds like you are taking a very proactive approach to it.

Jared Sine: We are proactive about it, and I know our colleagues and friends over at Bumble are proactive about it as well. Our heads of trust and safety both came from the same company—Uber—before coming to us, so I know that they compare notes quite regularly. Because of the way the legislation is set up, there can be codes of conduct applying specifically to online dating, and to the extent that that technology exists, you need to deploy it.

None Portrait The Chair
- Hansard -

Shall we ask our friends at Bumble if they would like to come in?

Nima Elmi: It is a great question. There are three points that I want to address, and I will try to be brief. First, Bumble is very much a uniquely female-founded and female-led tech company that adopts a safety-by-design ethos. It is baked within our DNA. The majority of our board are women, and they are public figures who, unfortunately, have to some extent experienced online harms targeting women.

We believe it is incredibly important that the Bill acknowledges that women are disproportionately impacted by online harms. Some studies have found that women are 27 times more likely than men to suffer online harassment and online harms. Currently, the Bill does not acknowledge or reference gender or women at all, so a lot more can be done, and we have submitted some recommendations.

Not every company in our industry or across the tech sector is female-founded and female-led, and they prioritise the harms that they want to tackle on their platforms very differently—that is important. Our systems-based approach, which bakes in safety-by-design principles, puts women at the centre of how our products are designed and used. We deploy corrective action and safety tools to make sure that our female members feel not only safe but empowered on our platforms. When it comes to managing risk, it is central to us to ensure that women feel safe on our products and services. We are here advocating for the fact that it should not just be our products that are safe for women—it should be the internet as a whole. In our view, the Bill does not currently go far enough to make sure that that happens.

We welcome the inclusion of the new communications offences in clauses 150 to 155, and also welcome the offence of cyber-flashing, whose inclusion we have been advocating publicly for several months. However, in both instances, and particularly with cyber-flashing, the Bill does not go far enough: as Professor McGlynn has highlighted, the offence should be grounded in consent rather than in the motivation of the perpetrator.

Essentially, there are a number of inclusions that are a step in the right direction, but we would welcome significant changes to the Bill, predominantly through including a safety duty for women, to ensure that all platforms are consistent in their approach and prioritise how their female users engage with their services, so that they feel protected, and to ensure that determining those features is not predicated on the composition of the board or who the founder is.

None Portrait The Chair
- Hansard -

Right. For once, we seem to have run out of questions. Minister, do you wish to contribute?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Everything I was going to ask has already been asked by my colleagues, so I will not duplicate that.

None Portrait The Chair
- Hansard -

Q In that case, given that we have the time, rather than doing what I normally do and inviting you to make any further submissions in writing, if there are any further comments that you would like to make about the Bill, the floor is yours. Let us start with Mr Sine.

Jared Sine: I would just make one brief comment. I think it has been mentioned by everyone here. Everyone has a role to play. Clearly, the Government have a role in proposing and pushing forward the legislation. The platforms that have the content have an obligation and a responsibility to try to make sure that their users are safe. One of the things that Dr O’Connell mentioned is age verification and trying to make sure that we keep young kids off platforms where they should not be.

I think there is a big role to play for the big tech platforms—the Apples and Googles—who distribute our apps. Over the years, we have said again and again to both of those companies, “We have age-gated our apps at 18, yet you will allow a user you know is 15, 14, 16—whatever it is—to download that app. That person has entered that information and yet you still allow that app to be downloaded.” We have begged and pleaded with them to stop and they will not stop. I am not sure that that can be included in the Bill, but if it could be, it would be powerful.

If Apple and Google could not distribute any of our apps—Hinge, Match, Tinder—to anyone under the age of 18, that solves it right there. It is the same methodology that has been used at clubs with bouncers—you have a bouncer at the door who makes sure you are 21 before you go in and have a drink. It should be the same thing with these technology platforms. If they are going to distribute and have these app stores, the store should then have rules that show age-gated apps—“This is for 17-plus or 18-plus”—and should also enforce that. It is very unfortunate that our calls on this front have gone unanswered. If the Bill could be modified to include that, it would really help to address the issue.

Dr Rachel O'Connell: Absolutely. I 100% support that. There is a tendency for people to say, “It is very complex. We need a huge amount of further consultation.” I started my PhD in 1996; this stuff has been going on for all that time. In 2008, there was a huge push by the Attorneys General, which I mentioned already, that brought all of the industry together. That was 2008 and we are now in 2022; the Internet Safety Strategy Green Paper was in 2017. We know what the risks are; we understand what they are. We understand the systems and processes that facilitate them. We understand what needs to be done to mitigate those risks and harms. Let’s keep on the track that we are on.

Regarding industry’s concerns, a lot of them will be ironed out when companies are required to conduct risk assessments and impact assessments. Those assessments might ask: what are the age bands of your users? What are the risks associated with the product features that you are making available? What behaviour modification techniques are you using, such as endless scroll and loot boxes that get kids completely addicted, and are they appropriate for those ages? Then you surface the decision making within the business that results in harms, and also the mitigations.

I urge you to keep going on this; do not be deterred from it. Keep the timeframe within which it comes into law fairly tight, because there are children out there who are suffering. As for the harassment, I have experienced it myself; it is horrible.

Those would be my final words.

None Portrait The Chair
- Hansard -

Thank you. Finally, Nima Elmi, please.

Nima Elmi: Thank you again for your time. I want to re-emphasise a couple of points, since we have a few minutes.

First, on the point around gendered harms, I think it is important for the Committee to really think about whether this is an opportunity for the Bill to acknowledge that women experience online harms at a significantly higher rate than men. That would also help to future-proof the Bill, as new forms of online harm are, unfortunately, usually felt by women first. I know that Maria Miller, for example, has been doing extensive work on the issue of AI nudification tools, which, in the current framing of the Bill, would not be captured.

We would certainly urge that there is a greater focus in the Bill on gendered harms, whether that is through a specific safety duty, acknowledgement as a category within risk assessment, a designated code of practice—which I know Clare McGlynn, Refuge and EVAW have also advocated for—or acknowledgement of gender-based violence in transparency reporting.

Right now, moderation on technology platforms is very much grounded in the prioritisation of issues based on the leadership and user base of each platform. This is an opportunity for the Government and Parliament to set a standard that ensures consistency across the board while acknowledging the nuanced differences between the platforms, their business models and their end goals. I would really like to emphasise that point.

The second point I want to emphasise, on cyber-flashing in particular, is that we have an opportunity to bake in the societal standards that we want to hold people accountable to, both offline and online. Where an offence captured by the Bill does not create a threshold at which you will see prosecutions and a change in behaviour—for example, the current formulation of the cyber-flashing offence, which is grounded in the perpetrator’s motivation rather than in consent—it will have little impact in changing the hearts and minds of individuals and stopping that behaviour, because the threshold will be so high.

We would definitely encourage the Committee to reflect on the pragmatic ways in which the Bill can be refined. In particular, I want to emphasise that it will be important to acknowledge that online harms are sadly very much experienced by women—both emerging forms and existing forms of harms. I welcome this opportunity to share this feedback with the Committee.

None Portrait The Chair
- Hansard -

Ms Elmi, Dr O’Connell and Mr Sine, thank you all very much indeed; the Committee is indebted to you. Thank you so much.

Examination of Witnesses

Rhiannon-Faye McDonald and Susie Hargreaves OBE gave evidence.

16:55
None Portrait The Chair
- Hansard -

We will now hear from Rhiannon-Faye McDonald, victim and survivor advocate at the Marie Collins Foundation, and Susie Hargreaves, chief executive at the Internet Watch Foundation. Thank you for joining us this afternoon; first question, please.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Thank you both for joining us this afternoon. One of the key objectives of the legislation is to ensure that a high level of protection for children and adults is in place. In your view, does the Bill in its current form achieve that?

Susie Hargreaves: Thank you very much for inviting me today. I think the Bill is working in the right direction. Obviously, the area that we at the IWF are concerned with is child sexual abuse online, and from our point of view, the Bill does need to make a few changes in order to put those full protections in place for children.

In particular, we have drafted an amendment to put co-designation on the face of the Bill. When it comes to child sexual abuse, we do not think that contracting out is an acceptable approach, because we are talking about the most egregious form of illegal material—we are talking about children—and we need to ensure that Ofcom is not just working in a collaborative way, but is working with experts in the field. What is really important for us at the moment is that there is nothing in the Bill to ensure that the good work that has been happening over 25 years in this country, where the IWF is held up as a world leader, is recognised, and that that expertise is assured on the face of the Bill. We would like to see that amendment in particular adopted, because the Bill needs to ensure that there are systems and processes in place for dealing with illegal material. The IWF already works with internet companies to ensure they take up technical services.

There needs to be a strong integration with law enforcement—again, that is already in place with the memorandum of understanding between CPS, the National Police Chiefs’ Council and the IWF. We also need clarity about the relationship with Ofcom so that child sexual abuse, which is such a terrible situation and such a terrible crime, is not just pushed into the big pot with other harms. We would like to see those specific changes.

Rhiannon-Faye McDonald: Generally, we think the Bill is providing a higher standard of care for children, but there is one thing in particular that I would like to raise. Like the IWF, the Marie Collins Foundation specialises in child sexual abuse online, specifically the recovery of people who have been affected by child sexual abuse.

The concern I would like to raise is around the contextual CSA issue. I know this has been raised before, and I am aware that the Obscene Publications Act 1959 has been brought into the list of priority offences. I am concerned that that might not cover all contextual elements of child sexual abuse: for example, where images are carefully edited and uploaded to evade content moderation, or where there are networks of offenders who are able to gain new members, share information with each other, and lead other people to third-party sites where illegal content is held. Those things might not necessarily be caught by the illegal content provisions; I understand that they will be dealt with through the “legal but harmful” measures.

My concern is that the “legal but harmful” measures do not need to be implemented by every company, only those that are likely to be accessed by children. There are companies that can legitimately say that the majority of their user base is not children, and therefore would not have to deal with that, but that provides a space for this contextual CSA to happen. While those platforms may not be accessed by children as much as other platforms, it still provides a place for this to happen—the harm can still occur, even if children do not come across it as much as they would elsewhere.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q On that point, one of the concerns that has been raised by other stakeholders is about the categorisation of platforms—for example, category 1 and category 2B services have different duties placed on them, with Ofcom as the regulator. Would you rather see a risk-based approach to platforms, rather than categorisation? What are your thoughts on that?

Susie Hargreaves: We certainly support the concept of a risk-based approach. We host very little child sexual abuse content in the UK, with the majority of the content we see being hosted on smaller platforms in the Netherlands and other countries. It is really important that we take a risk-based approach, which might be in relation to where the content is—obviously, we are dealing with illegal content—or in relation to where children are. Having a balance there is really important.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q A final question from me. We heard concerns from children’s charities and the Children’s Commissioner that the Bill does not account for breadcrumbing—the cross-platform grooming that happens on platforms. What more could the Bill do to address that, and do you see it as an omission and a risk?

Susie Hargreaves: I think we probably have a slightly different line from that of some of the other charities you heard from this morning, because we think it is very tricky and nuanced. What we are trying to do at the moment is define what it actually means and how we would have to deal with it, and we are working very closely with the Home Office to go through some of those quite intense discussions. At the moment, “harmful” versus “illegal” is not clearly defined in law, and it could potentially overwhelm certain organisations if we focus on the higher-level harms and the illegal material. We think anything that protects children is essential and needs to be in the Bill, but we need to have those conversations and to do some more work on what that means in reality. We are more interested in the discussions at the moment about the nuance of the issue, which needs to be mapped out properly.

One of the things that we are very keen on in the Bill as a whole is that there should be a principles-based approach, because we are dealing with new harms all the time. For example, until 2012 we had not seen self-generated content, which now accounts for 75% of the content we remove. So we need constantly to change and adapt to new threats as they come online, and we should not make the Bill too prescriptive.

None Portrait The Chair
- Hansard -

Ms McDonald?

Rhiannon-Faye McDonald: I was just thinking of what I could add to what Susie has said. My understanding is that it is difficult to deal with cross-platform abuse because of the limited ability to share information between different platforms—for example, where a platform has identified an issue or an offender but has not shared that information with other platforms, on which someone may continue the abuse. I am not an expert in tech and cannot present you with a solution to that, but I feel that sharing intelligence would be an important part of the solution.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q What risks do end-to-end encrypted platforms pose to children, and how should the Bill seek to mitigate those risks specifically?

Susie Hargreaves: We are very clear that end-to-end encryption should be within scope, as you have heard from other speakers today. Obviously, the huge threat on the horizon is the end-to-end encryption on Messenger, which would result in the loss of millions of images of child sexual abuse. In common with previous speakers, we believe that the technology is there. We need not to demonise end-to-end encryption, which in itself is not bad; what we need to do is ensure that children do not suffer as a consequence. We must have mitigations and safety mechanisms in place so that we do not lose these child sexual abuse images, because that means that we will not be able to find and support those children.

Alongside all the child protection charities, we are looking to ensure that protections equivalent to the current ones are in place in the future. We do not accept that the internet industry cannot put them in place. We know from experts such as Dr Hany Farid, who created PhotoDNA, that those mechanisms and protections exist, and we need to ensure that they are put in place so that children do not suffer as a consequence of the introduction of end-to-end encryption. Rhiannon has her own experiences as a survivor, so I am sure she would agree with that.

Rhiannon-Faye McDonald: I absolutely would. I feel very strongly about this issue, which has been concerning me for quite some time. I do not want to share too much, but I am a victim of online grooming and child sex abuse. There were images and videos involved, and I do not know where they are and who has seen them. I will never know that. I will never have any control over it. It is horrifying. Even though my abuse happened 19 years ago, I still walk down the street wondering whether somebody has seen those images and recognises me from them. It has a lifelong impact on the child, and it impacts on recovery. I feel very strongly that if end-to-end encryption is implemented on platforms, there must be safeguards in place to ensure we can continue to find and remove these images, because I know how important that is to the subject of those images.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q So what needs to change in the Bill to make sure that happens? I am not clear.

Susie Hargreaves: We just want to make sure that the ability to scan in an end-to-end encrypted environment is included in the Bill in some way.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q The ability to scan is there right now—we have got that—so you are just trying to make sure we are standing still, basically. Am I correct in my understanding?

Susie Hargreaves: I think with technology you can never stand still. We do not know what is coming down the line. We have to deal with the here and now, but we also need to be prepared to deal with whatever comes down the line. The answer, “Okay, we will just get people to report,” is not a good enough replacement for the ability to scan for images.

When the privacy directive was introduced in Europe and Facebook stopped scanning for a short period, we lost millions of images. What we know is that we must continue to have those safety mechanisms in place. We need to work collectively to do that, because it is not acceptable to lose millions of images of child sexual abuse and create a forum where people can safely share them without any repercussions, as Rhiannon says. One survivor we talked to in this space said that one of her images had been recirculated 70,000 times. The ability to have a hash of a unique image, go out and find those duplicates and make sure they are removed means that people are not re-victimised on a daily basis. That is essential.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q Focusing on thinking about how to prevent grooming behaviour, does the Bill have enough in place to protect children from conversations that they may have with adults, or from facing grooming behaviour online?

Rhiannon-Faye McDonald: There is one specific point that I would like to raise about this. I am concerned about private communications. We know that many offenders identify and target children on more open platforms, and then very quickly move them to more private platforms to continue the grooming and abuse. We were very pleased to see that private communications were brought into scope. However, there is a difficulty with the code of practice: when that is drafted, Ofcom is not going to be able to require proactive tools to be used to identify grooming. Those include things like PhotoDNA and image-based and text-based classifiers.

So although we have tools that we can use currently, which can identify conversations where grooming is happening, we are not going to be using those immediately on private platforms, on private communications where the majority of grooming is going to happen. That means there will be a delay while Ofcom establishes that there is a significant problem with grooming on the platform and then issues a notice requiring those tools to be used.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q You mentioned the reporting mechanisms that are in place, Susie. Yes, they are not the only tool, and should not be the only tool—many more things should be happening—but are the reporting mechanisms that will be in place, once the Bill has come in and is being embedded, sufficient, or do the requirements for platforms to have reporting mechanisms need to be improved as well?

Susie Hargreaves: An awful lot of work has already gone into this over the past few years. We have been working closely with Departments on the draft code of practice. We think that, as it stands, it is in pretty good shape. We need to work more closely with Ofcom as those codes are developed—us and other experts in the field. Again, it needs to be very much not too directive, in the sense that we do not want to limit people, and to be adaptable for when technology changes in the future. It is looking in the right shape, but of course we will all be part of the consultation and of the development of those practices as they go. It requires people to scan their networks, to check for child sexual abuse and—I guess for the first time, the main thing—to report on it. It is going to be a regulated thing. In itself, that is a huge development, which we very much welcome.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q I have one last question. Rhiannon, a suggestion was made earlier by Dr Rachel O’Connell about age verification and only allowing children to interact with other children whose age is verified within a certain area. Do you think that would help to prevent online grooming?

Rhiannon-Faye McDonald: It is very difficult. While I feel strongly about protecting children from encountering perpetrators, I also recognise that children need to have freedoms and the ability to use the internet in the ways that they like. I think if that was implemented and it was 100% certain that no adult could pose as a 13-year-old and therefore interact with actual 13-year-olds, that would help, but I think it is tricky.

Susie Hargreaves: One of the things we need to be clear about, particularly where we see children groomed—we are seeing younger and younger children—is that we will not ever sort this just with technology; the education piece is huge. We are now seeing children as young as three in self-generated content, and we are seeing children in bedrooms and domestic settings being tricked, coerced and encouraged into engaging in very serious sexual activities, often using pornographic language. Actually, a whole education piece needs to happen. We can put filters and different technology in place, but remember that the IWF acts after the event—by the time we see this, the crime has been committed, the image has been shared and the child has already been abused. We need to bump up the education side, because parents, carers, teachers and children themselves have to be able to understand the dangers of being online and be supported to build their resilience online. They are definitely not to be blamed for things that happen online. Rhiannon’s own story shows how quickly it can happen and how vulnerable children are at the moment.

Rhiannon-Faye McDonald: For those of you who don’t know, it happened very quickly to me, within the space of 24 hours, from the start of the conversation to the perpetrator coming to my bedroom and sexually assaulting me. I have heard other instances where it has happened much more quickly than that. It can escalate extremely quickly.

Just to add to Susie’s point about education, I strongly believe that education plays a huge part in this. However, we must be very careful in how we educate children, so that the focus is not on how to keep themselves safe, because that puts the responsibility on them, which in turn increases the feelings of responsibility when things do go wrong. That increased feeling of responsibility makes it less likely that they will disclose that something has happened to them, because they feel that they will be blamed. It will decrease the chance that children will tell us that something has happened.

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q Just to follow up on a couple of things, mainly with Susie Hargreaves. You mentioned reporting mechanisms and said that reporting will be a step forward. However, the Joint Committee on the draft Bill recommended that the highest-risk services should have to report quarterly data to Ofcom on the results of their child sexual exploitation and abuse removal systems. What difference would access to that kind of data make to your work?

Susie Hargreaves: We already work with the internet industry. They currently take our services and we work closely with them on things such as engineering support. They also pay for our hotline, which is how we find child sexual abuse. However, the difference it would make is that we hope then to be able to undertake work where we are directly working with them to understand the level of their reports and data within their organisations.

At the moment, we do not receive that information from them. It is very much that we work on behalf of the public and they take our services. However, if we were suddenly able to work directly with them—have information about the scale of the issue within their own organisations and work more directly on that—then that would help to feed into our work. It is a very iterative process; we are constantly developing the technology to deal with the current threats.

It would also help us by giving us more intelligence and by allowing us to share that information, on an aggregated basis, more widely. It would certainly also help us to understand that they are definitely tackling the problem. We do believe that they are tackling the problem, because it is not in their business interests not to, but it just gives a level of accountability and transparency that does not exist at the moment.

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q You also said earlier that there was nothing in the Bill on co-designation—nothing to recognise the Internet Watch Foundation’s 25 years of experience. Do you still expect to be co-designated as a regulator by Ofcom, and if so, what do you expect your role to be?

Susie Hargreaves: At the moment, there is nothing on the face of the Bill on co-designation. We do think that child sexual abuse is different from other types of harm, and when you think about the huge number of harms, and the scale and complexity of the Bill, Ofcom has so much to work with.

We have been working with Ofcom for the past year to look at exactly what our role would be. However, because we are the country’s experts on dealing with child sexual abuse material, because we have the relationships with the companies, and because we are an internationally renowned organisation, we are able to have that trusted relationship and then undertake a number of functions for Ofcom. We could help to undertake specific investigations, help update the code, or provide that interface between Ofcom and the companies where we undertake that work on their behalf.

We very much feel that we should be doing that. It is not about being self-serving, but about recognising the track record of the organisation and the fact that the relationships and technology are in place. We are already experts in this area, so we are able to work directly with those companies because we already work with them and they trust us. Basically, we have a memorandum of understanding with the CPS and the National Police Chiefs’ Council that protects our staff from prosecution, but the companies all work with us on a voluntary basis. They already work with us, they trust our data, and we have that unique relationship with them.

We are able to provide that service to take the pressure off Ofcom because we are the experts in the field. We would like that clarified because we want this to be right for children from day one—you cannot get it wrong when dealing with child sexual abuse. We must not undo or undermine the work that has happened over the last 25 years.

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q Just to be clear, is there uncertainty somewhere in there? I am just trying to comprehend.

Susie Hargreaves: There is uncertainty, because we do not know exactly what our relationship with Ofcom is going to be. We are having discussions and getting on very well, but we do not know anything about what the relationship will be or what the criteria and timetable for the relationship are. We have been working on this for nearly five years. We have analysts who work every single day looking at child sexual abuse; we have 70 members of staff, and about half of them look at child sexual abuse every day. They are dealing with some of the worst material imaginable, they are already in a highly stressful situation and they have clear welfare needs; uncertainty does not help. What we are looking for is certainty and clarity that child sexual abuse is so important that it is included on the face of the Bill, and that should include co-designation.

None Portrait The Chair
- Hansard -

Thank you. One question from Kim Leadbeater.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Thank you for your very powerful testimony, Rhiannon. I appreciate that could not have been easy. Going back to the digital literacy piece, it feels like we were talking about digital literacy in the Bill when it started coming through, and that has been removed now. How important do you think it is that we have a digital literacy strategy, and that we hold social media providers in particular to having a strategy on digital education for young people?

Rhiannon-Faye McDonald: It is incredibly important that we have this education piece. Like Susie said, we cannot rely on technology or any single part of this to solve child sexual abuse, and we cannot rely on the police to arrest their way out of the problem. Education really is the key. That is education in all areas—educating the child in an appropriate way and educating parents. We hold parenting workshops. Parents are terrified; they do not know what to do, what platforms are doing what, or what to do when things go wrong. They do not even know how to talk to children about the issue; it is embarrassing for them and they cannot bring it up. Educating parents is a huge thing. Companies have a big responsibility there. They should have key strategies in place on how they are going to improve education.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Can I start by thanking both Rhiannon-Faye and Susie for coming and giving evidence, and for all the work they are doing in this area? I know it has been done over many years in both cases.

I would like to pick up on a point that has arisen in the discussion so far—the point that Susie raised about the risks posed by Meta introducing end-to-end encryption, particularly on the Facebook Messenger service. You have referenced the fact that huge numbers of child sexual exploitation images are identified by scanning those communications, leading to the arrests of thousands of paedophiles each year. You also referenced the fact that when this was temporarily turned off in Europe owing to the privacy laws there—briefly, thankfully—there was a huge loss of information. We will come on to the Bill in a minute, but as technology stands now, if Meta did proceed with end-to-end encryption, would that scanning ability be lost?

Susie Hargreaves: Yes. It would not affect the Internet Watch Foundation, but it would affect the National Center for Missing & Exploited Children (NCMEC). Facebook, as a US company, has a responsibility to do mandatory reporting to NCMEC, which will be brought in with the Bill in this country. Those millions of images would be lost, as of today, if they brought end-to-end encryption in now.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Why would it not affect the Internet Watch Foundation?

Susie Hargreaves: Because they are scanning Facebook—sorry, I am just trying to unpack the way it works. It will affect us, actually. Basically, when we provide our hash list to Facebook, it uses that to scan Messenger, but the actual images that are found—the matches—are not reported to us; they are reported into NCMEC. Facebook does take our hash list. For those of you who do not know about hashing, it is a list of digital fingerprints of unique images of child sexual abuse. We currently have about 1.3 million unique images of child sexual abuse. Facebook does use our hash list, so yes it does affect us, because it would still take our hash list to use on other platforms, but it would not use it on Messenger. The actual matches would go into NCMEC. We do not know how many matches it gets against our hash list, because they go into NCMEC.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q But its ability to check images going across Messenger against your list would effectively terminate.

Susie Hargreaves: Yes, sorry—I was unclear about that. Yes, it would on Messenger.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Clearly the Bill cannot compel the creation of technology that does not exist yet. It is hoped that there will be technology—we heard evidence earlier suggesting that it is very close to existing—that allows scanning in an end-to-end encrypted environment. Do you have any update on that that you can give the Committee? If there is no such technology, how do you think the Bill should address that? Effectively there would be a forced choice between end-to-end encryption and scanning for CSEA content.

Susie Hargreaves: As I said before, it is essential that we do not demonise end-to-end encryption. It is really important. There are lots of reasons why, from a security and privacy point of view, people want to be able to use end-to-end encryption.

In terms of whether the technology is there, we all know that there are things on the horizon. As Ian said in the previous session, the technology is there and is about to be tried out. I cannot give any update at this meeting, but in terms of what we would do if end-to-end encryption is introduced and there is no ability to scan, we could look at on-device scanning, which I believe you mentioned before, Minister.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Yes.

Susie Hargreaves: That is an option. That could be a backstop position. I think that, at the moment, we should stand our ground on this and say, “No, we need to ensure that we have some form of scanning in place if end-to-end encryption is introduced.”

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q For complete clarity, do you agree that the use of end-to-end encryption cannot be allowed at the expense of child safety?

Susie Hargreaves: I agree 100%.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Good. Thank you.

None Portrait The Chair
- Hansard -

Thank you very much indeed, Ms McDonald and Ms Hargreaves. We are most grateful to you; thank you for your help.

Examination of Witnesses

Ellen Judson and Kyle Taylor gave evidence.

17:29
None Portrait The Chair
- Hansard -

Finally this afternoon, we will hear from Ellen Judson, who is the lead researcher at the Centre for the Analysis of Social Media at Demos, and Kyle Taylor, who is the founder and director of Fair Vote. Thank you for joining us this afternoon.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Thank you both for joining us, and for waiting until the end of a very long day. We appreciate it.

There is a wide exemption in the Bill for the media and for journalistic content. Are you concerned that that is open to abuse?

Kyle Taylor: Oh, absolutely. There are aspects of the Bill that are extremely worrying from an online safety perspective: the media exemption, the speech of democratic importance exemption, and the fact that a majority of paid ads are out of scope. We know that a majority of harmful content originates from or is amplified by entities that meet one of those exemptions. What that means is that the objective of the Bill, which is to make the online world safer, might not actually be possible, because platforms, at least at present, are able to take some action on this content through their current terms and conditions, but this will say explicitly that they cannot act.

One real-world example is the white supremacist terror attack just last week in Buffalo, in the United States. The “great replacement” theory that inspired the terrorist was pushed by Tucker Carlson of Fox News, who would meet the media exemption; by right-wing blogs, which were set up by people who claim to be journalists and so would meet the journalistic standards exemption; by the third-ranking House Republican, who would meet the democratic importance exemption; and it was even run as paid ads by those candidates. In that one example, you would not be able to capture a majority of the way that harm spreads online.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Is there a way in which the exemptions could be limited to ensure that the extremists you have mentioned cannot take advantage of them?

Ellen Judson: I think there are several options. The primary option, as we would see it, is that the exemptions are removed altogether, on the basis that if the Bill is really promoting a systems-based approach rather than focusing on individual small categories of content, then platforms should be required to address their systems and processes whenever those lead to an increased risk of harm. If that leads to demotion of media content that meets those harmful thresholds, that would seem appropriate within that response.

If the exemptions are not to be removed, they could be improved. Certainly, with regard to the media exemption specifically, I think the thresholds for who qualifies as a recognised news publisher could be raised to make it more difficult for bad actors and extremists, as Kyle mentioned, simply to set up a website, add a complaints policy, have an editorial code of conduct and then say that they are a news publisher. That could involve linking to existing publishers that are already registered with existing regulators, but I think there are various ways that could be strengthened.

On the democratic importance and journalism exemptions, I think the issue is that the definitions are very broad and vague; they could easily be interpreted in any way. Either they could be interpreted very narrowly, in which case they might not have much of an impact on how platforms treat freedom of expression, as I think they were intended to do; or they could be interpreted very broadly, and then anyone who thinks or who can claim to think that their content is democratically important or journalistic, even if it is clearly abusive and breaches the platform’s terms and conditions, would be able to claim that.

One option put forward by the Joint Committee is to introduce a public interest exemption, so that platforms would have to think about how they are treating content that is in the public interest. That would at least remove some of the concerns. The easiest way for platforms to interpret what is democratically important speech and what is journalistic speech is based on who the user is: are they a politician or political candidate, or are they a journalist? That risks them privileging certain people’s forms of speech over that of everyday users, even if that speech is in fact politically relevant. I think that having something that moves the threshold further away from focusing on who a user is as a proxy for whether their speech is likely to deserve extra protection would be a good start.

Kyle Taylor: It is basically just saying that content can somehow become less harmful depending on who says it. A systems-based approach is user-neutral, so its only metric is: does this potentially cause harm at scale? It does not matter who is saying it; it is simply a harm-based approach and a system solution. If you have exemptions, exceptions and exclusions, a system will not function. It suggests that a normal punter with six followers saying that the election was stolen is somehow more harmful than the President of the United States saying that an election is stolen. That is just the reality of how online systems work and how privileged and powerful users are more likely to cause harm.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q You are creating a two-tier internet, effectively, between the normal user and those who are exempt, which large swathes of people will be because it is so ambiguous. One of the other concerns that have been raised is the fact that the comments sections on newspaper websites are exempt from the Bill. Do you see an issue with that?

Ellen Judson: There is certainly an issue as that is often where we see a lot of abuse and harm, such that if that same content were replicated on a social media platform, it would almost certainly be within the scope of the Bill. There is a question, which is for Ofcom to consider in its risk profiles and risk registers, about where content at scale has the potential to cause the most harm. The reach of a small news outlet’s comments section would be much less than the reach of Donald Trump’s Twitter account, for instance. Certainly, if the risk assessments are done and comments sections of news websites have similar reach and scale and could cause significant harm, I think it would be reasonable for the regulator to consider that.

Kyle Taylor: It is also that they are publicly available. I can speak from personal experience. Just last week, there was a piece about me. The comments section simultaneously said that I should be at Nuremberg 2.0 because I was a Nazi, but also that I should be in a gas chamber. Hate perpetuates in a comments section just as it does on a social media platform. The idea that it is somehow less harmful because it is here and not there is inconsistent and incoherent with the regime where the clue is in the name: the Online Safety Bill. We are trying to make the online world safer.

On media, I would add that we have to think about how easy it is, based on the criteria in the Bill, to become exempt as a media entity. We can think about that domestically, but what happens when a company is only meant to enforce its terms and conditions in that country, but can broadcast to the world? The UK could become the world’s disinformation laundromat because you can come here, meet the media exemption and then blast content to other places in the world. I do not think that is something that we are hoping to achieve through this Bill. We want to be the safest place in the world to go online and to set a global benchmark for what good regulation looks like.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q I suppose, yes. Under the current media carve-out, how do you see platforms being able to detect state actors that are quoting misinformation or perpetuating disinformation on their platforms?

Ellen Judson: I think it is a real challenge with the media exemptions, because it is a recognised tactic of state-based actors, state-aligned actors and non-state actors to use media platforms as ways to disseminate disinformation. If you can make a big enough story out of something, it gets into the media and that perpetuates the campaign of abuse, harassment and disinformation. If there are protections in place, it will not take disinformation actors very long to work out that if there are ways that they can get stories into the press, they are effectively covered.

In terms of platform enforceability, if platforms are asked to, for instance, look at their systems of amplification and what metrics they use to recommend or promote content to users, and to do that from a risk-based perspective and based on harm except when they are talking about media, it all becomes a bit fuzzy what a platform would actually be expected to do in terms of curating those sorts of content.

Kyle Taylor: As an example, Russia Today, until its broadcast licence was revoked about three months ago, would have qualified for the media exemption. Disinformation from Russia Today is not new; it has been spreading disinformation for years and years, and would have qualified for the media exemption until very recently.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q So as a result of these exemptions, the Bill as it stands could make the internet less safe than it currently is.

Kyle Taylor: The Bill as it stands could absolutely make the internet less safe than it currently is.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q You have done a really good job of explaining the concerns about journalistic content. Thinking about the rest of the Bill for a moment, do you think the balance between requiring the removal of content and the prioritisation of content is right? Do you think it will be different from how things are now? Do you think there is a better way it could be done in the Bill?

Ellen Judson: The focus at the moment is too heavily on content. There is a sort of tacit equation of content removal—sometimes content deprioritisation, but primarily content removal—as the way to protect users from harm, and as the threat to freedom of expression. That is where the tension comes in with how to manage both those things at once. What we would want from a Bill that was taking more of a systems approach is thinking: where are platforms making decisions about how they are designing their services, and how they are operating their services at all levels? Content moderation policy is certainly included, but it goes back to questions of how a recommendation algorithm is designed and trained, who is involved in that process, and how human moderators are trained and supported. It is also about what functionality users are given and what behaviour is incentivised and encouraged. There is a lot of mitigation that platforms can put in place that does not talk about directly affecting user content.

I think we should have risk assessments that focus on the risks of harms to users, as opposed to the risk of users encountering harmful content. Obviously there is a relationship, but one piece of content may have very different effects when it is encountered by different users. It may cause a lot of harm to one user, whereas it may not cause a lot of harm to another. We know that when certain kinds of content are scaled and amplified, and certain kinds of behaviour are encouraged or incentivised, we see harms at a scale that the Bill is trying to tackle. That is a concern for us. We want more of a focus on some things that are mentioned in the Bill—business models, platform algorithms, platform designs and systems and processes. They often take a backseat to the issues of content identification and removal.

Kyle Taylor: I will use the algorithm as an example, because this word flies around a lot when we talk about social media. An algorithm is a calculation that is learning from people’s behaviour. If society is racist, an algorithm will be racist. If society is white, an algorithm will be white. You can train an algorithm to do different things, but you have to remember that these companies are for-profit businesses that sell ad space. The only thing they are optimising for in an algorithm is engagement.

What we can do, as Ellen said, through a system is force optimisation around certain things, or drive algorithms away from certain types of content, but again, an algorithm is user-neutral. An algorithm does not care what user is saying what; it is just “What are people clicking on?”, regardless of what it is or who said it. An approach to safety has to follow the same methodology and say, “We are user-neutral. We are focused entirely on propensity to cause harm.”

The second piece is all the mitigation measures you can take once a post is up. There has been a real binary of “Leave it up” and “Take it down”, but there is a whole range of stuff—the most common word used is “friction”—to talk about what you can do with content once it is in the system. You have to say to yourself, “Okay, we absolutely must have free speech protections that exceed the platform’s current policies, because they are not implemented equally.” At the same time, you can preserve someone’s free expression by demonetising content to reduce the incentive of the company to push that content or user through its system. That is a way of achieving both a reduction in harm and the preservation of free expression.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

May I just ask one more question, Chair?

None Portrait The Chair
- Hansard -

Briefly, because there are two other Members and the Minister wishing to ask questions.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q Thanks. On the propensity to cause harm, we heard earlier that a company might create a great new feature and put it out, but then there is a period—a lag, if you like—before they realise the harm that is being caused. Do you trust that companies would have the ability to understand in advance of doing something what harm it may cause, and adequately to assess that?

Ellen Judson: I think there are a lot of things that companies could be doing. Some of these things are in research that they probably are conducting. As we have seen from the Facebook files, companies are conducting that sort of research, but we are not privy to the results. I think there are a couple of things we want to see. First, we want companies to have to be more transparent about what kind of testing they have done, or, if not testing, about who they have consulted when designing these products. Are they consulting human rights experts? Are they consulting people who are affected by identity-based harm, or are they just consulting their shareholders? Even that would be a step in the right direction, and that is why it is really important.

We feel that there need to be stronger provisions in the Bill for independent researcher and civil society access to data. Companies will be able to do certain amounts of things, and regulators will have certain powers to investigate and do their own research, but it requires the added efforts of civil society properly to hold companies to account for the effects of certain changes they have made—and also to help them in identifying what the effects of those changes to design have been. I think that is really crucial.

None Portrait The Chair
- Hansard -

We are playing “Beat the clock”. I am going to ask for brief answers and brief questions, please. I will take one question from Kim Leadbeater and one from Barbara Keeley.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Gosh, right. I think we are clear that your view is that these two exceptions could potentially do more harm than good. The ideal scenario from your perspective would be to remove them, but again, the challenge is how we balance the freedom of speech issue with protecting the rights of people online who are vulnerable to abuse and harassment. How would you respond to those who say that the Bill risks setting an unwitting precedent for non-democratic countries that would seek to restrict the freedom of expression of their citizens?

Ellen Judson: There is absolutely a risk of over-moderation, and of the Bill incentivising over-moderation, particularly because of the very heavy content focus. Even with illegal content, there is a very broad range of content that companies are expected proactively to monitor for, even when the technical systems to identify that content reliably at scale are perhaps not in place. I absolutely understand and share the concern about over-moderation.

Our response would be that we should look to strengthen the freedom of expression duties currently in the Bill. At the moment, there is a quite vague duty to have regard to the importance of freedom of expression, but it is not at all clear what that would actually mean, and what would be expected from the platforms. One change we would want would be for rights—including freedom of expression and privacy—to be included in the online safety objectives, and to establish that part of the purpose of this regime is to ensure that services are being designed to protect and promote human rights, including freedom of expression. We think that would be a way to bring freedom of expression much more into the centre of the regime and the focus of the Bill, without having to have those add-on exemptions after the fact.

Kyle Taylor: And it creates a level playing field—it says, “These rules apply to everyone equally.”

On the second point, authoritarian—absolutely—but the other area that is really important is fragile democracies. For example, if you look at Hungary, just last week Viktor Orbán said, “You know what you need? Your own media.” If we are setting a standard that says it is totally fine to exempt people in politics and media, then for those fragile democracies that control most aspects of information sharing, we are explicitly saying that it is okay to privilege them over others. That is a very dangerous precedent to set when we have the opportunity to set best global standards here with the Bill.

None Portrait The Chair
- Hansard -

Barbara Keeley?

Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q I have a really simple question. You have touched on the balance between free speech rights and the rights of people who are experiencing harassment, but does the Bill do enough to protect human rights?

Ellen Judson: At the moment, no. The rights that are discussed in the Bill at the minute are quite limited: primarily, it is about freedom of expression and privacy, and the way that protections around privacy have been drafted is less strong than for those around freedom of expression. Picking up on the question about setting precedents, if we have a Bill that is likely to lead to more content moderation and things like age verification and user identity verification, and if we do not have strong protections for privacy and anonymity online, we are absolutely setting a bad precedent. We would want to see much more integration with existing human rights legislation in the Bill.

Kyle Taylor: All I would add is that if you look at the exception for content of democratic importance, and the idea of “active political issue”, right now, conversion therapy for trans people—that has been described by UN experts as torture—is an active political issue. Currently, the human rights of trans people are effectively set aside because we are actively debating their lives. That is another example of how minority and marginalised people can be negatively impacted by this Bill if it is not more human rights-centred.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Let me start with this concept—this suggestion, this claim—that there is special protection for politicians and journalists. I will come to clause 50, which is the recognised news publisher exemption, in a moment, but I think you are referring to clauses 15 and 16. If we turn to those clauses and read them carefully, they do not specifically protect politicians and journalists, but “content of democratic importance” and “journalistic content”. It is about protecting the nature of the content, not the person who is speaking it. Would you accept that?

Ellen Judson: I accept that that is what the Bill currently says. Our point was thinking about how it will be implemented in practice. If platforms are expected to prove to a regulator that they are taking certain steps to protect content of democratic importance—in the explanatory notes, that is content related to Government policy and political parties—and they are expected to prove that they are taking a special consideration of journalistic content, the most straightforward way for them to do that will be in relation to journalists and politicians. Given that it is such a broad category and definition, that seems to be the most likely effect of the regime.

Kyle Taylor: It is potentially—

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Sorry, Kyle, do come in in a second, but I just want to come back on that point.

Is it not true that a member of the public or anyone debating a legitimate political topic would also benefit from these measures? It is likely that MPs would benefit near automatically, but a member of the public might equally benefit if the topic they are talking about is of democratic or journalistic importance.

Ellen Judson: Our concern is that defining what is a legitimate political debate is itself already privileging. As you said, an MP is very likely automatically to benefit.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Well, it is likely; I would not say it is guaranteed.

Ellen Judson: A member of the public may be discussing something—for example, an active political debate that is not about the United Kingdom, which I believe would be out of scope of that protection. They would be engaged in political discussion and exercising freedom of expression, and if they were not doing so in a way that met the threshold for action based on harm, their speech should also come under those protections.

Kyle Taylor: I would add that the way in which you have described it would be so broad as effectively to be meaningless in the context of the Bill, and that instead we should be looking for universal free expression protections in that part of the Bill, and removing this provision. Because what is not, in a liberal democracy, speech of democratic importance? Really, that is everything. When does it reach the threshold where it is an active political debate? Is it when enough people speak about it or enough politicians bring it up? It is so subjective and so broad as effectively to mean that everything could qualify. Again, this is not taking a harms-based approach to online safety, because the question is not “Who is saying it?” or “In what context?”; the question is, “Does this have the propensity to cause harm at scale?”

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q The harms are covered elsewhere in the Bill. This is saying what you have to take into account. In fact, at the very beginning of your remarks, Kyle, you said that some of the stuff in the US a week or two ago might have been allowed to stand under these provisions, but the provision does not provide an absolute protection; it simply says that the provider has to take it into account. It is a balancing exercise. Other parts of the Bill say, “You’ve got to look at the harm on a systemic basis.” This is saying, “You’ve got to take into account whether the content is of democratic or journalistic importance.” You made a point a second ago about general protection on free speech, which is in clause 19(2).

Kyle Taylor: Can I respond to that?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Yes, sure.

Kyle Taylor: My point is that if there is a provision in the Bill about freedom of expression, it should be robust enough that this protection does not have to be in the Bill. To me, this is saying, “Actually, our free expression bit isn’t strong enough, so we’re going to reiterate it here in a very specific context, using very select language”. That may mean that platforms decide not to act for fear of reprisal, as opposed to pursuing online safety. I suggest strengthening the freedom of expression section so that it hits all the points that the Government intend to hit, and removing those qualifiers that create loopholes and uncertainty for a regime that, if it is systems-based, does not have loopholes.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I understand the point you are making, logically. Someone mentioned the human rights element earlier. Of course, article 10 of the European convention on human rights expresses the right to freedom of speech. The case law deriving from that ECHR article provides an enhanced level of protection for freedom of the press, relative to other forms of expression, so there is some established case law which makes that point. You were talking about human rights earlier, weren’t you?

Ellen Judson: We absolutely recognise that. There is discussion in terms of meeting certain standards of responsible journalism in relation to those protections. Our concern is very much that the people and actors who would most benefit from the journalistic protections specifically would be people who do not meet those standards and cannot prove that they meet those standards, because the standards are very broad. If you intend your content to be journalistic, you are in scope, and that could apply to extremists as much as to people meeting standards of responsible journalism.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q If you are talking about clause 16, it is not that you intend it to be journalistic content; it is that it is journalistic content. You might be talking about clause 50, which is the general exemption for recognised news publishers from the provisions of the Bill. That of course does not prevent social media platforms from choosing to apply their terms and conditions to people who are recognised news publishers; it is just that the Bill is not compelling them. It is important to make that clear—that goes back to the point you made right at the beginning, Kyle. A couple of times in your testimony so far, you have said that you think the way the definition of “recognised news publisher” is drafted in clause 50 is too wide, and potentially susceptible to, basically, abuse by people who are in essence pretending to be news publishers, but who are not really. They are using this as a way to get a free pass from the provisions of the Bill. I completely understand that concern. Do you have any specific suggestions for the Committee about how that concern might be addressed? How could we change the drafting of the Bill to deal with that issue?

Kyle Taylor: Remove the exemption.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q You mean completely? Just delete it?

Kyle Taylor: Well, I am struggling to understand how we can look at the Bill and say, “If this entity says it, it is somehow less harmful than if this entity says it.” That is a two-tiered system and that will not lead to online safety, especially when those entities that are being given privilege are the most likely and largest sources and amplifiers of harmful content online. We sit on the frontlines of this every day, looking at social media, and we can point to countless examples from around the world that will show that, with these exemptions, exceptions and exclusions, you will actually empower those actors, because you explicitly say that they are special. You explicitly say that if they cause harm, it is somehow not as bad as if a normal user with six followers on Twitter causes harm. That is the inconsistency and incoherency in the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

We are talking here about the press, not about politicians—

Kyle Taylor: Yes, but the press and media entities spread a lot of disinformation—

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I get that. You have mentioned Viktor Orbán and the press already in your comments. There is a long-standing western tradition of treating freedom of the press as something that is sacrosanct and so foundational to the functioning of democracy that you should not infringe or impair it in any way. That is the philosophy that underpins this exclusion.

Kyle Taylor: Except that that is inconsistent in the Bill, because you are saying that for broadcast, they must have a licence, but for print press, they do not have to subscribe to an independent standards authority or code. Even within the media, there is this inconsistency within the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

That is a point that applies regardless of the Bill. The fact is that UK broadcast is regulated whereas UK newspapers are not regulated, and that has been the case for half a century. You can debate whether that is right or wrong, but—

Kyle Taylor: We are accepting that newspapers are not regulated then.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q That matter stands outside the scope of the Bill. If one was minded to tighten this up—I know that you have expressed a contrary view to the thing just being deleted—and if you were to accept that the freedom of the press is something pretty sacrosanct, but equally you don’t want it to be abused by people using it as a fig leaf to cover malfeasant activity, do you have any particular suggestions as to how we can improve the drafting of that clause?

Kyle Taylor: I am not suggesting that the freedom of the press is not sacrosanct. Actually, I am expressing the opposite, which is that I believe that it is so sacrosanct that it should be essential to the freedom-of-expression portion of the Bill, and that the press should be set to a standard that meets international human rights and journalistic standards. I want to be really clear that I absolutely believe in freedom of the press, and it is really important that we don’t leave here suggesting that we don’t think that the press should be free—

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I got that, but as I say, article 10 case law does treat the press a little differently. We are about to run out of time. I wanted to ask about algorithms, which I will probably not have a chance to do, but are there any specific changes to the clause that you would urge us to make?

Ellen Judson: To the media exemption—

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

To clause 50, “Recognised news publisher”.

Ellen Judson: One of the changes that the Government have indicated that they are minded to make—please correct me if I misunderstood—is to introduce a right to appeal.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Correct.

Ellen Judson: I would very much urge that content not be required to stay online while the appeal is taking place, on the grounds that the content might then be found to be incredibly harmful, and by the time you have got through an appeals process, it will already have done the damage it was going to do. So, if there is a right to appeal—I would urge there not to be a particular right to appeal beyond what is already in the Bill, but if that is to be included—not having the restriction that the platforms must carry the content while the appeal process is ongoing would be important.

Kyle Taylor: You could require an independent standards code as a benchmark at least.

None Portrait The Chair
- Hansard -

Order. I am afraid that brings us to the end of the time allotted for the Committee to ask questions. It also brings us to the end of the day’s sitting. On behalf of the Committee, I thank the witnesses for your evidence. As you ran out of time and the opportunity to frame answers, if you want to put them in writing and offer them to the Minister, I am sure they will be most welcome. The Committee will meet again on Thursday at 11.30 am in this room to hear further evidence on the Bill.

Ordered, That further consideration be now adjourned. —(Steve Double.)

18:00
Adjourned till Thursday 26 May at half-past Eleven o’clock.
Written evidence to be reported to the House
OSB01 Professional Publishers Association (PPA)
OSB02 Neil Kendall and others
OSB03 Girlguiding
OSB04 Alliance to Counter Crime Online and the World Parrot Trust (joint submission)
OSB05 Which?
OSB06 Index on Censorship
OSB07 Alliance for Intellectual Property
OSB08 Internet Services Providers’ Association (ISPA UK)
OSB09 International Justice Mission (IJM UK)
OSB10 Local Government Association (LGA)
OSB11 Russ Elliott
OSB12 SWGfL - Safety & Security Online
OSB13 Action for Primates and Lady Freethinker
OSB14 Association of British Insurers (ABI)
OSB15 Microsoft
OSB16 Age Verification Providers Association
OSB17 techUK
OSB18 UK Interactive Entertainment (Ukie), the trade association for the UK’s video games industry
OSB19 Centre for Media Monitoring
OSB20 Save Online Speech Coalition
OSB21 Bumble
OSB22 Professor Clare McGlynn
OSB23 Antisemitism Policy Trust