Committee stage & Committee Debate - 4th sitting
Thursday 26th May 2022


Public Bill Committees
Online Safety Act 2023. Amendment Paper: Public Bill Committee Amendments as at 26 May 2022.
Alex Davies-Jones

Q One final question from me. I would like to discuss your thoughts on transparency and how we can make social media companies like Meta be more transparent and open with their data, beyond the measures we currently have in the Bill. For instance, we could create statute to allow academics or researchers in to examine their data. Do you have any thoughts on how this can be incentivised?

Stephen Almond: Transparency is a key foundation of data protection law in and of itself. As the regulator in this space, I would say that there is a significant emphasis within the data protection regime on ensuring that companies are transparent about the processing of personal data that they undertake. We think that that provides proportionate safeguards in this space. I would not recommend an amendment to the Bill on this point, because I would be keen to avoid duplication or an overlap between the regimes, but it is critical; we want companies to be very clear about how people’s personal data is being processed. It is an area that we are going to continue to scrutinise.

Mrs Maria Miller (Basingstoke) (Con)

May I ask a supplementary to that before I come on to my main question?

The Chair

Absolutely.

Mrs Miller

Q Thank you so much for coming along. You spoke in your initial comments to my colleague about encryption. The challenges of encryption around child abuse images have been raised with us previously. How can we balance the need to allow people to have encrypted options, if possible, with the need to ensure that this does not adversely affect organisations such as the Internet Watch Foundation, which does so much good in protecting children and rooting out child abuse imagery?

Stephen Almond: I share your concern about this. To go back to what I was saying before, I think the approach that is set out in the Bill is proportionate and targeted. The granting of, ultimately, backstop powers to Ofcom to issue technology notices and to require services to deal with this horrendous material will have a significant impact. I think this will ensure that the regime operates in a risk-based way, where risks can be identified. There will be the firm expectation on service providers to take action, and that will require them to think about all the potential technological solutions that are available to them, be they content scanning or alternative ways of meeting their safety duties.

Mrs Miller

Q My main question is about child safety, which is a prime objective for the Government in this legislation. Do you feel that the Bill’s definition of “likely to be accessed by children” should be more closely aligned with the one used in the ICO’s age-appropriate design code?

Stephen Almond: The objectives of both the Online Safety Bill and the children’s code are firmly aligned in respect of protecting children online. We have reviewed the definitions and, from our perspective, there are distinctions in the definition that is applied in the Bill and the children’s code, but we find no significant tension between them. My focus at the ICO, working in co-operation with Ofcom, will ultimately be on ensuring that there is clarity for business on how the definitions apply to their services, and that organisations know when they are in scope of the children’s code and what actions they should take.

Mrs Miller

Q Do you think any further aspects of the age-appropriate design code should be incorporated into the Bill?

Stephen Almond: We are not seeking to incorporate further aspects of the code into the Bill. We think it is important that the regimes fit together coherently, but that that is best achieved through regulatory co-operation between the ICO and Ofcom. The incorporation of the children’s code would risk creating some form of regulatory overlap and confusion.

I can give you a strong assurance that we have a good track record of working closely with Ofcom in this area. Last year, the children’s code came into force, and not too long after it, Ofcom’s video-sharing platform regime came into force. We have worked very closely to make sure that those regimes are introduced in a harmonised way and that people understand how they fit together.

Mrs Miller

Q Working closely with Ofcom is really good, but do you think there needs to be a duty to co-operate with Ofcom, or indeed with other regulators—to be specified in the Bill—in case relations become more tense in future?

Stephen Almond: The Bill has, in my view, been designed to work closely alongside data protection law. It supports effective co-operation between us and Ofcom by requiring and setting out a series of duties for Ofcom to consult with the ICO on the development of any codes of practice or formal guidance with an impact on privacy. With that framework in mind, I do not think there is a case to instil further co-operation duties in that way. I hope I can give you confidence that we and Ofcom will be working tirelessly together to promote the safety and privacy of citizens online. It is firmly in our interests and in the interest of society as a whole to do so.

Kim Leadbeater (Batley and Spen) (Lab)

Q Thank you for joining us, Mr Almond. You stated the aim of making the UK the

“safest place in the world to be online”.

In your view, what needs to be added or taken away from the Bill to achieve that?

Stephen Almond: I am not best placed to comment on the questions of online safety and online harms. You will speak to a variety of different experts who can comment on that point. From my perspective as a digital regulator, one of the most important things will be ensuring that the Bill is responsive to future challenges. The digital world is rapidly evolving, and we cannot necessarily envisage all the developments in technology that will come, or the emergence of new harms. The data protection regime is a principles-based piece of legislation. That gives us a great degree of flexibility and discretion to adapt to novel forms of technology and to provide appropriate guidance as challenges emerge. I really recommend retaining that risk-based, principles-based approach to regulation that is envisaged currently in the Online Safety Bill.

--- Later in debate ---
The Chair

Lynn Perry is on the line, but we have lost her for the moment. I am afraid we are going to have to press on.

Mrs Miller

Q I want to focus on one particular issue, which is anonymity. Kick It Out has done so much with the FA to raise awareness of that issue. I was interested in your views on how the Bill treats that. The Bill mentions anonymity and pseudonymity, but it does so only once. Should the Bill take a clearer stance on online anonymity? Do you have any views on whether people should be able to use the internet fully anonymously, or should they disclose their identity to the platform? Do you have any thoughts on that? You have done a huge amount of work on it.

Sanjay Bhandari: There is quite a lot in that question. In terms of whether people should be fully anonymous or not, it depends on what you mean by fully. I am a lawyer, so I have 30 years specialising in the grey, rather than in the black and white. It really does depend on what you mean by fully. In my experience, nothing is absolute. There is no absolute right to freedom of speech; I cannot come in here and shout “Fire!” and make you all panic. There is also no absolute right to anonymity; I cannot use my anonymity online as a cloak to commit fraud. Everything is qualified. It is a question of what is the balance of those qualifications and what those qualifications should be, in the particular context of the problem that we are seeking to address.

The question in this context is around the fact that anonymity online is actually very important in some contexts. If you are gay in a country where that is illegal, being anonymous is a fantastic way to be able to connect with people like you. In a country that has a more oppressive regime, anonymity is another link to the outside world. The point of the Bill is to try to get the balance so that anonymity is not abused. For example, when a football player misses a penalty in a cup final, the point of the Bill is that you cannot create a burner account and instantly send them a message racially abusing them and then delete the account—because that is what happens now. The point of the Bill, which we are certainly happy with in general terms, is to draw a balance in the way that identity verification must be offered as an option, and to give users more power over who they interact with, including whether they wish to engage only with verified accounts.

We will come back and look in more detail at whether we would like more amendments, and we will also work with other organisations. I know that my colleague Stephen Kinsella of Clean up the Internet has been looking at those anonymity provisions and at whether verification should be defined and someone’s status visible on the face of the platforms, for instance. I hope that answers those two or three questions.

Mrs Miller

That is very helpful; thank you.

The Chair

I saw you nodding, Ms Perry. Do you wish to add anything?

Lynn Perry: I agree. The important thing, particularly from the perspective of Barnardo’s as a children’s charity, is the right of children to remain safe and protected online and in no way compromised by privacy or anonymity considerations online. I was nodding along at certain points to endorse the need to ensure that the right balance is struck for protections for those who might be most vulnerable.

--- Later in debate ---
Alex Davies-Jones

Q We have heard a lot from other witnesses about the ability of Ofcom to regulate the smaller high-risk platforms. What is your view on that?

Poppy Wood: Absolutely, and I agree with what was said earlier, particularly by groups such as HOPE not hate and Antisemitism Policy Trust. There are a few ways to do this, I suppose. As we are saying, at the moment the small but high-risk platforms just are not really caught in the current categorisation of platforms. Of course, the categories are not even defined in the Bill; we know there are going to be categories, but we do not know what they will be.

I suppose there are different ways to do this. One is to go back to where this Bill started, which was not to have categories of companies at all but to have a proportionality regime, where depending on your size and your functionality you had to account for your risk profile, and it was not set by Ofcom or the Government. The problem of having very prescriptive categories—category 1, category 2A, category 2B—is, of course, that it becomes a race to the bottom in getting out of these regulations without having to comply with the most onerous ones, which of course are category 1.

There is also a real question about search. I do not know how they have wriggled out of this, but it was one of the biggest surprises in the latest version of the Bill that search had been given its own category without many obligations around adult harm. I think that really should be revisited. All the examples that were given earlier today are absolutely the sort of thing we should be worrying about. If someone can google a tractor in their workplace and end up looking at a dark part of the web, there is a problem with search, and I think we should be thinking about those sorts of things. Apologies for the example, but it is a really, really live one and it is a really good thing to think about how search promotes these kinds of content.

Mrs Miller

Q I want to touch on something we have not talked about a lot today, which is enforcement and the enforcement powers in the Bill. There are significant enforcement powers in the Bill, but do our two witnesses think those enforcement powers are enough? Eva?

Eva Hartshorn-Sanders: Are you specifically asking about the takedown notices and the takedown powers?

Mrs Miller

No, I am talking about director liability and the enforcement on companies.

Eva Hartshorn-Sanders: Right. I think the responsibility on both companies and senior executives is a really critical part of this legislative package. You see how adding liability alongside financial penalties works in health and safety legislation and corporate manslaughter provisions to motivate changes not only within company culture but in the work that they are doing and what they factor into the decisions they make. It is a critical part of this Bill.

Mrs Miller

Q Is there more that could or should be added to the Bill?

Eva Hartshorn-Sanders: I think it is a good start. I would want to have another look at it to say more. There is a review after two years, as set out in clause 149, so there could be a factor that gets added into that, as well.

Mrs Miller

Poppy, do you have anything to add?

Poppy Wood: Yes. I think we could go much further on enforcement. One of the things that I really worry about is that if the platforms make an inadequate risk assessment, there is not much that Ofcom can do about it. I would really like to see powers for Ofcom to say, “Okay, your risk assessment hasn’t met the expectations that we put on you, so we want you to redo it. And while you’re redoing it, we may want to put you into a different category, because we may want to have higher expectations of you.” That way, you cannot start a process where you intentionally make an inadequate risk assessment in order to extend the process of you being properly regulated. I think that is one thing.

Then, going back to the point about categorisation, I think that Ofcom should be given the power to recategorise companies quickly. If you think that a category 2B company should be a category 1 company, what powers are there for Ofcom to do that? I do not believe that there are any for Ofcom to do that, certainly not to do it quickly, and when we are talking about small but high-risk companies, that is absolutely the sort of thing that Ofcom should be able to do—to say, “Okay, you are now acting like a category 1 company.” TikTok, Snapchat—they all started really small and they accelerated their growth in ways that we just could not have predicted. When we are talking about the emergence of new platforms, we need to have a regulator that can account for the scale and the pace at which these platforms grow. I think that is a place where I would really like to see Ofcom focusing.

Kirsty Blackman

Q I have a question for the Centre for Countering Digital Hate. I raised some of your stats on reporting with Meta—Facebook—when they were here, such as the number of reports that are responded to. They basically said, “This is not true any more; we’re now great”—I am paraphrasing, obviously. Could you please let us know whether the reporting mechanism on major platforms—particularly Facebook—is now completely fixed, or whether there are still lots of issues with it?

Eva Hartshorn-Sanders: There are still lots of issues with it. We recently put a report out on anti-Muslim hatred and found that 90% of the content that was reported was not acted on. That was collectively, across the platforms, so it was not just Facebook. Facebook was in the mid-90s, I think, in terms of its failure to act on that type of harmful content. There are absolutely still issues with it, and this regulation—this law—is absolutely necessary to drive change and the investment that needs to go into it.

--- Later in debate ---
The Chair

Thank you. Maria Miller.

Mrs Miller

Q A great deal of the discussion we are having about this Bill concerns its scope—what is covered and what is not covered. Many of us will look regularly at newspapers online, particularly the comments sections, which can be quite colourful. Should comments on newspaper publisher platforms be included in the scope of the Bill?

Owen Meredith: Yes, I think they should be included within the news publisher exemption as it is spelt out. As far as I understand, that has always been the intention, since the original White Paper many years ago that led to where we are today. There is a very good reason for that, not least the fact that the comments on news publisher websites are still subject to the responsibility of the editor and the publisher; they are subject to the regulation of the Independent Press Standards Organisation, in the case of those publishers who are regulated under the self-regulation system by IPSO, as the majority of my members are. There is a very different environment in news publisher websites’ comments sections, where you are actively seeking to engage with those and read those as a user, whereas on social media platforms that content can come to you without you wishing to engage with it.

Mrs Miller

Q Can I just probe on that slightly? You say the comments are the responsibility of the editor. Does that mean that if something is published on there that is defamatory, it would then be attributed to the editor?

Owen Meredith: Everything published by the news site is ultimately the responsibility of the editor.

Matt Rogerson: I think there are various cases. I think Delfi is the relevant case in relation to comments, where if a publisher is notified of a defamatory comment within their comments section, they are legally liable for it if they do not take it down. To speak from a Guardian perspective, we would like comments sections to be included within the exemption. The self-regulation we have in place for our comments section has been quite a journey. We undertook quite a big bit of research on all the comments that had been left over an 11-year period. We tightened up significantly the processes that we had in place. We currently use a couple of steps to make sure those comments sections are well moderated. We use machine learning against very tightly defined terms, and then every single comment that is taken down is subject to human review. I think that works in the context of a relatively small website such as The Guardian, but it would be a much bigger challenge for a platform of the size of Facebook.

The Chair

Kim Leadbeater?

--- Later in debate ---
Alex Davies-Jones

Q One final question from me, because I know others will want to come in. How do you think platforms such as Meta—I know we have used Meta as an example, but there are others—can be incentivised, beyond the statutory duty that we are currently imposing, to publish their data to allow academics and researchers into their platforms to examine exactly what is going on? Or is this the only way?

Frances Haugen: All industries that live in democratic societies must live within democratic processes, so I do believe that it is absolutely essential that we the public, through our democratic representatives like yourself, have mandatory transparency. The only two other paths I currently see towards getting any transparency out of Meta, because Meta has demonstrated that it does not want to give even the slightest slivers of data—for example, how many moderators there are—are via ESG, so we can threaten them with divestment by saying, “Prosocial companies are transparent with their data,” and via litigation. In the United States, sometimes we can get data out of these companies through the discovery process. If we want consistent and guaranteed access to data, we must put it in the Bill, because those two routes are probabilistic—we cannot ensure that we will get a steady, consistent flow of data, which is what we need to have these systems live within a democratic process.

Mrs Miller

Q Turning to the issue of child safety and online abuse involving images of children, what should be added to or removed from the Bill to improve how it protects children online? Have you got any thoughts on that? Some groups have described the Bill’s content as overly broad. Would you make any comments on how effective it will be in terms of online safety for children?

Frances Haugen: I am not well versed on the exact provisions in the Bill regarding child safety. What I can say is that one of the most important things that we need to have in there is transparency around how the platforms in general keep children under the age of 13 off their systems—transparency on those processes—because we know that Facebook is doing an inadequate job. That is the single biggest lever in terms of child safety.

I have talked to researchers at places like Oxford and they talk about how, with social media, one of the critical windows is when children transition through puberty, because they are more sensitive on issues, they do not have great judgment yet and their lives are changing in really profound ways. Having mandatory transparency on what platforms are doing to keep kids off their platforms, and the ability to push for stronger interventions, is vital, because keeping kids off them until they are at least 13, if not 16, is probably the biggest single thing we can do to move the ball down the field for child safety.

Mrs Miller

Q You say that transparency is so important. Can you give us any specifics about particular areas that should be subject to transparency?

Frances Haugen: Specifically for children or across the whole platform?

Mrs Miller

Specifically for children.

Frances Haugen: I will give you an example. Facebook has estimated ages for every single person on the platform, because the reality is that lots of adults also lie about their ages when they join, and advertisers want to target very specific demographics—for example, if you are selling a kit for a 40th birthday, you do not want to mis-target that by 10 years. Facebook has estimated ages for everyone on the platform. It could be required to publish every year, so that we could say, “Hey, there are four kids on the platform who you currently believe, using your estimated ages, are 14 years old—based not on how old they say they are, but on your estimate that this person is 14 years old. When did they join the platform? What fraction of your 14-year-olds have been on the platform since they were 10?” That is a vital statistic.

If the platforms were required to publish that every single quarter, we could say, “Wow! You were doing really badly four years ago, and you need to get a lot better.” Those kinds of lagging metrics are a way of allowing the public to grade Facebook’s homework, instead of just trusting Facebook to do a good job.

Facebook already does analyses like this today. They already know that on Facebook Blue, for example, for some age cohorts, 20% of 11-year-olds were on the platform—and back then, not that many kids were online. Today, I would guess a much larger fraction of 11-year-olds are on Instagram. We need to have transparency into how badly they are doing their jobs.

Kim Leadbeater

Q Frances, do you think that the Bill needs to set statutory minimum standards for things such as risk assessments and codes of practice? What will a company such as Facebook do without a minimum standard to go by?

Frances Haugen: It is vital to get minimum standards for things such as risk assessments and codes of conduct into the statute. Facebook has demonstrated time and again—the reality is that other social media platforms have too—that it does the bare minimum to avoid really egregious reputational damage. It does not ensure the level of quality needed for public safety. If you do not put that into the Bill, I worry that it will be watered down by the mountains of lobbyists that Facebook will throw at this problem.