Debates between Kim Leadbeater and Maria Miller during the 2019-2024 Parliament

Thu 9th Jun 2022
Online Safety Bill (Seventh sitting)
Public Bill Committees

Committee stage: 7th sitting & Committee Debate - 7th sitting
Thu 26th May 2022
Online Safety Bill (Fourth sitting)
Public Bill Committees

Committee stage: 4th sitting & Committee Debate - 4th sitting
Thu 26th May 2022
Online Safety Bill (Third sitting)
Public Bill Committees

Committee stage: 3rd sitting & Committee Debate - 3rd sitting
Tue 24th May 2022
Online Safety Bill (Second sitting)
Public Bill Committees

Committee stage: 2nd sitting & Committee Debate - 2nd sitting

Online Safety Bill (Seventh sitting)

Debate between Kim Leadbeater and Maria Miller
Dame Maria Miller

Will the Minister give way?

Kim Leadbeater (Batley and Spen) (Lab)

Will the Minister give way?

Online Safety Bill (Fourth sitting)

Debate between Kim Leadbeater and Maria Miller
Committee stage & Committee Debate - 4th sitting
Thursday 26th May 2022

Public Bill Committees
Mrs Miller

Q Working closely with Ofcom is really good, but do you think there needs to be a duty to co-operate with Ofcom, or indeed with other regulators—to be specified in the Bill—in case relations become more tense in future?

Stephen Almond: The Bill has, in my view, been designed to work closely alongside data protection law. It supports effective co-operation between us and Ofcom by setting out a series of duties for Ofcom to consult the ICO on the development of any codes of practice or formal guidance with an impact on privacy. With that framework in mind, I do not think there is a case for adding further co-operation duties of that kind. I hope I can give you confidence that we and Ofcom will be working tirelessly together to promote the safety and privacy of citizens online. It is firmly in our interests, and in the interest of society as a whole, to do so.

Kim Leadbeater (Batley and Spen) (Lab)

Q Thank you for joining us, Mr Almond. You stated the aim of making the UK the

“safest place in the world to be online”.

In your view, what needs to be added to or taken away from the Bill to achieve that?

Stephen Almond: I am not best placed to comment on questions of online safety and online harms; you will speak to a variety of experts who can comment on that point. From my perspective as a digital regulator, one of the most important things will be ensuring that the Bill is responsive to future challenges. The digital world is rapidly evolving, and we cannot envisage every development in technology that will come, or every new harm that will emerge. The data protection regime is principles-based legislation, which gives us a great degree of flexibility and discretion to adapt to novel forms of technology and to provide appropriate guidance as challenges emerge. I really recommend retaining the risk-based, principles-based approach to regulation that is currently envisaged in the Online Safety Bill.

--- Later in debate ---
Mrs Miller

Specifically for children.

Frances Haugen: I will give you an example. Facebook has estimated ages for every single person on the platform, because the reality is that lots of adults also lie about their age when they join, and advertisers want to target very specific demographics—for example, if you are selling a kit for a 40th birthday, you do not want to mis-target that by 10 years. Facebook could be required to publish those estimates every year, so that we could say, “Hey, there are four kids on the platform who you currently believe, using your estimated ages, are 14 years old—based not on how old they say they are, but on your estimate that they are 14 years old. When did they join the platform? What fraction of your 14-year-olds have been on the platform since they were 10?” That is a vital statistic.

If the platforms were required to publish that every single quarter, we could say, “Wow! You were doing really badly four years ago, and you need to get a lot better.” Those kinds of lagging metrics are a way of allowing the public to grade Facebook’s homework, instead of just trusting Facebook to do a good job.

Facebook already does analyses like this today. It already knows that on Facebook Blue, for example, in some age cohorts 20% of 11-year-olds were on the platform—and back then, not that many kids were online. Today, I would guess that a much larger fraction of 11-year-olds are on Instagram. We need transparency into how badly they are doing their job.
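
A minimal sketch, in Python, of the kind of lagging metric described here, assuming hypothetical per-user records of the platform’s estimated age and join date (the field names, dates and thresholds are illustrative, not Facebook’s actual schema):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class User:
    estimated_age: int  # the platform's current age estimate, not the self-reported age
    joined: date        # date the account was created

def fraction_on_platform_since(users, cohort_age=14, since_age=10,
                               today=date(2022, 5, 26)):
    """Of users estimated to be cohort_age today, return the fraction whose
    join date implies they were since_age or younger when they signed up."""
    cohort = [u for u in users if u.estimated_age == cohort_age]
    if not cohort:
        return 0.0

    def age_at_join(u):
        # Approximate age at join: estimated age today minus years on the platform.
        years_on_platform = (today - u.joined).days / 365.25
        return u.estimated_age - years_on_platform

    early_joiners = sum(1 for u in cohort if age_at_join(u) <= since_age)
    return early_joiners / len(cohort)

# Example: two estimated 14-year-olds, one of whom joined four years ago, aged ten.
users = [User(14, date(2018, 5, 26)), User(14, date(2021, 9, 1))]
print(fraction_on_platform_since(users))  # 0.5
```

Published every quarter, per age cohort, a figure like this would let outsiders track whether the share of under-age sign-ups is actually falling—the point about grading Facebook’s homework rather than trusting it.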

Kim Leadbeater

Q Frances, do you think that the Bill needs to set statutory minimum standards for things such as risk assessments and codes of practice? What will a company such as Facebook do without a minimum standard to go by?

Frances Haugen: It is vital to get minimum standards for things such as risk assessments and codes of conduct into the statute. Facebook has demonstrated time and again—and the reality is that other social media platforms have too—that it does the bare minimum to avoid really egregious reputational damage, not the level of quality needed for public safety. If you do not put minimum standards into the Bill, I worry that it will be watered down by the mountains of lobbyists that Facebook will throw at this problem.

Online Safety Bill (Third sitting)

Debate between Kim Leadbeater and Maria Miller
Committee stage & Committee Debate - 3rd sitting
Thursday 26th May 2022

Public Bill Committees
Mrs Miller

Q Can I bring Lorna in here? We are talking about moving from content to the drivers of harm. Where would you suggest that should be achieved within the Bill?

Professor Lorna Woods: Through an overarching risk assessment, rather than one broken down into the different types of content. A risk assessment by content type assumes a certain knowledge of the content before you can do the assessment, so you end up in a circle. Rather than prejudging types of content, I think it would be more helpful to look at what is there and what the system is doing. Then we could look at what a proportionate response would be—looking, as people have said, at the design and the features. Rather than waiting for content to be created and then trying to deal with it, we could look at introducing more friction at an earlier stage.

If I may add a technical point, I think there is a gap relating to search engines. The draft Bill excluded paid-for advertising. For user-to-user content, advertising now seems to be in the Bill, bringing it more into line with the current standards for children under the video-sharing platform provisions, but that does not apply to search. Search engines have duties only in relation to search content, and search content excludes advertising. That means, as I read it, that search engines would have absolutely no duties to children under their children’s safety duty in relation to advertising content. You could, for example, target a child with pornography and it would fall outside the regime. I think that is a bit of a gap.

Kim Leadbeater (Batley and Spen) (Lab)

Q Thank you, witnesses, for your time this morning. I am going to focus initially on journalistic content. Is it fair that the platforms themselves are having to try to define what journalistic content is and, by default, what a journalist is? Do you see a way around this?

William Moy: No, no, yes. First, no, it is not fair to put that all on the platforms, particularly because—and I think this is a crucial point for the Committee across the Bill as a whole—for anything to be done at internet scale, it has to be doable by dumb robots. Whatever the internet companies tell you about the abilities of their technology, it is not magic, and it is highly error-prone. For this duty to be meaningful, it will essentially have to be exercised through machine learning, and that is really important to bear in mind. It is therefore important to be clear about what the duty is meant to tackle, in a way that can be operationalised.

To your second point, it is really important in this day and age to question whether journalistic content and journalists equate to one another; I think this came up in a previous session. Nowadays, journalism, or what we used to think of as journalism, is done by all kinds of people, and it performs the same function of scrutiny, of informing others, and so on. It is that function that we care about: the passing of information between people in a democracy. We need to protect that public interest function, and I am sure there are better ways of doing so in this Bill, through targeted protections or by protecting freedom of expression in specific ways, rather than through these very broad, vague and general duties.

Online Safety Bill (Second sitting)

Debate between Kim Leadbeater and Maria Miller
Mrs Miller

Q Ian, how do you drive a culture change in the sector?

Ian Stevenson: I think you have to look at the change you are trying to effect. Many people in the sector are simply unaware of what happens when safety is not considered first in building features. And even once you realise how many bad things can happen online, if you do not know what to do about it, you tend not to be able to do anything about it.

If we want to change culture—and it is the same for individual organisations as for the sector as a whole—we have to educate people about the problem and give them the tools to feel empowered to do something about it. If you educate and empower people, you remove the barriers to change. In some places an extremely ethical, people-centric, safety-focused culture emerges very naturally; in others, less so. That is precisely where making safety a first-class citizen in risk assessment for boards and management becomes so important. When people see management caring about these things, that care gets pushed out through the organisation.

Kim Leadbeater

Q In your view, what needs to be added to or taken away from the Bill to help it achieve the Government’s aim of making the UK

“the safest place in the world to be online”?

Lulu Freemont: First, I want to outline that there are some strong parts in the Bill that the sector really supports. I think the majority of stakeholders would agree that the objectives are the right ones. The Bill tries to strike a balance between safety, free speech and encouraging innovation and investment in the UK’s digital economy. The approach—risk-based, systems-led and proportionate—is the right one for the 25,000 companies that are in scope. As it does not focus on individual pieces of content, it has the potential to be future-proof and to achieve longer-term outcomes.

The second area in the Bill that we think is strong is the prioritisation of illegal content. We very much welcome the clear definitions of illegal content on the face of the Bill, which are incredibly useful for businesses as they start to think about preparing for their risk assessment on illegal content. We really support Ofcom as the appropriate regulator.

There are some parts of the Bill that need specific focus and, potentially, amendment, to enable it to deliver on those objectives without unintended consequences. I have already mentioned a few of those areas. The first is defining harmful content in primary legislation. We can leave it to the codes to set out the detailed interpretation, but we need definitions of harmful content so that businesses can start to understand what they need to do.

Secondly, we need clarity that businesses will not be required to monitor every piece of content as a result of the Bill. General monitoring is prohibited in other jurisdictions, and we have concerns that the Online Safety Bill is drifting away from those norms. The problems with general monitoring are well known: it encroaches on individual rights and can result in the over-removal of content. Again, we do not think the intention is to require companies of all sizes to look at every piece of content on their sites, but it might be one of the unintended consequences, so we would like an explicit prohibition of general monitoring on the face of the Bill.

We would also like the far-reaching amendment powers of the Secretary of State to be removed. We understand the need for technical powers, which are best exercised within regulation, but going further, so that the Secretary of State can amend the regime so extensively to align it with public policy, is of real concern, particularly to smaller businesses trying to put systems and processes in place with confidence. We would like senior management liability to be kept as it is: extending it further will only increase the chilling effect it is having on investment in the UK. The final area, which I have just spoken about, is clarifying the scope. The business-to-business companies in our membership need clarity that they are not in scope, and for that intention to be made clear on the face of the Bill.

We really support the Bill. We think it has the potential to deliver. There are just a few key areas that need to be changed or amended slightly to provide businesses with clarity and reassurances that the policy intentions are being delivered on.

Adam Hildreth: To add to that—Lulu has covered absolutely everything, and I agree—the critical bit is not monitoring individual pieces of content. Once you have done your risk assessment and put in place your systems, processes, people and technology, that is what you are signing up for. You are not signing up to an end assessment where, because one piece of harmful content is found to exist, or maybe many, you have failed to abide by what you really signed up to.

That is the worry from my perspective: that people do a full risk assessment, implement all the systems, put in place all the people, technology and processes they need, do the best job they can, having understood the investment they are putting in, and then someone comes along, makes a report to the regulator—Ofcom, in this case—and says, “I found this piece of content there.” That may expose weaknesses, but the very best risk assessments are ongoing ones anyway: you do not just put the assessment away in a filing cabinet and say, “That’s done.” The definitions of online harms and harmful content change on a daily basis, even for the biggest social media platforms. There was talk earlier about child sexual abuse material that appears as cartoons, which certain legislation would not necessarily define as illegal. Hopefully the legislation will catch up, but that is where the risk assessment needs to be made again, and policies may need to be changed. I just hope we do not get to the point where the monitoring of individual pieces of content, or individual misses, becomes the test in the Bill, rather than this overall approach to online safety.