Online Safety Bill (First sitting)
Public Bill Committees

Q
Kevin Bakhurst: One area that is very important, that is in the Bill and that is one of our responsibilities is making sure there is a sufficiently robust and reactive complaints process from the platforms—one that people feel they can complain to and be heard by—and an appeals process. We feel that that is in the Bill. We already receive complaints at Ofcom from people who have issues with platforms and who have gone to the platforms but do not feel their complaints have been properly dealt with or recognised. That is within the video-sharing platform regime. Although we are not going to look at individual pieces of material per se, those individual complaints are very useful in alerting us to issues around particular types of offence or harm that the platforms are not seen to be dealing with properly. It will be a really important part of the regime to make sure that platforms provide a complaints process that is easy to navigate and that people can use quickly and accessibly.
Richard Wronka: An additional point I would make, building on that, is that this is a really complex ecosystem. We understand that and have spent a lot of the last two or three years trying to get to grips with that complex ecosystem and building relationships with other participants in the ecosystem. It brings in law enforcement, other regulators, and organisations that support victims of crime or online abuse. We will need to find effective ways to work with those organisations. Ultimately, we are a regulator, so there is a limit to what we can do. It is important that those other organisations are able to operate effectively, but that is perhaps slightly outside our role.
Q
Richard Wronka: I think our starting point here is that we think transparency is a really important principle within the regime—a fundamental principle. There are specific provisions in the Bill that speak to that, but more generally we are looking for this regime to usher in a new era of transparency across the tech sector, so that users and other participants in this process can be clearer about what platforms are doing at the moment, how effective that is and what more might be done in the future. That is something that will be a guiding principle for us as we pick up regulation.
Specifically, the Bill provides for transparency reports. Not all services in scope will need to provide transparency reports, but category 1 and 2 services will be required to produce annual transparency reports. We think that is really important. At the moment, risk assessments are not intended to be published—that is not provided for in the Bill—but the transparency reports will show the effectiveness of the systems and processes that those platforms have put in place.
Q
Richard Wronka: I think what is important for us as a regulator is that we are able to access those risk assessments; and for the biggest services, the category 1 services, we would be expecting to do that routinely through a supervisory approach. We might even do that proactively, or where services have come to us for dialogue around those—
Q
Richard Wronka: Some services may wish to publish their risk assessments. There is nothing in the Bill or in our regulatory approach that would prevent that. At the moment, I do not see a requirement in the Bill to do so. Some services may have concerns about the level of confidential information in there. The important point for us is that we have access to those risk assessments.
Kevin Bakhurst: Picking up on the risk assessments, it is a tricky question, because we would expect those assessments to be very comprehensive and to deal with issues such as how algorithms function, and so on. There is a balance between transparency—which, as Richard says, we will drive across the regime—and not publishing information that could cause harm or help people who are trying to behave badly online or to game the system; there is also the question of what the regulator needs in practical terms. I am sure the platforms will be able to talk to you more about that.
Q
There is also a question of timing. The reports suggested that the new hub and jobs will come into play in 2025. I am sure that everyone here wants to see the Bill taking effect sooner. Ofcom will need to do a lot of reviews and reporting in the first year after the Bill receives Royal Assent. How will that be possible if people are not in post until 2025?
Kevin Bakhurst: They are both big questions. I will take the first part, and maybe Richard can take the second one, about the timing. On the resourcing, it is important to say publicly that, very unusually, we have had funding from Government to prepare for this regime. I know how unusual that is; I was at a meeting with the European regulators last week, and we are almost unique both in having had funding and in the level of funding that we have had.
The funding has meant that we are already well advanced in our preparations. We have a team of around 150 people working on online safety across the organisation. A number are in Manchester, but some are in London or in our other offices around the UK. It is important to say that that funding has helped us to get off to a really strong start in recruiting people across the piece—not just policy people. Importantly, we have set up a new digital function within Ofcom and recruited a new chief technology officer, who came from Amazon Alexa, to head up that function.
The funding has allowed us to really push hard into this space, which is not easy, and to recruit some of the skills we feel we need to deliver this regime as effectively and rapidly as possible. I know that resourcing is not a matter within the Bill; it is a separate Treasury matter. Going forward, though, we feel that, under the plans, we have sufficient resourcing to deliver what we are being asked to deliver. The team will probably double in size by the time we actually go live with the regime. That is a significant number of people.
Some significant new duties have been added in, such as fraudulent advertising, which we need to think carefully about. That is an important priority for us. It requires a different skillset. It was not in the original funding plan. If there are significant changes to the Bill, it is important that we remain alive to having the right people and the right number of people in place while trying to deliver with maximum efficiency. Do you want to talk about timing, Richard?
Richard Wronka: All I would add to that, Kevin, is that we are looking to front-load our recruitment so that we are ready to deliver on the Bill’s requirements as quickly as possible once it receives Royal Assent and our powers commence. That is the driving motivation for us. In many cases, that means recruiting people right now, in addition to the people we have already recruited to help with this.
Clearly there is a bit of a gating process for the Bill, so we will need a settled legislative framework and settled priority areas before we can get on with the consultation process. We will look to run that consultation process as swiftly as possible once we have those powers in place. We know that some stakeholders are very keen to see the Bill in place and others are less enthusiastic, so we need to run a robust process that will stand the test of time.
The Bill itself points us towards a phased process. We think that illegal content—thanks to the introduction of priority illegal content and those priority areas in the Bill—is the area on which we can make the quickest progress as soon as the Bill achieves Royal Assent.
The Chair: Thank you. I intend to bring in the Minister at about 10 o’clock. Kirsty Blackman, Kim Leadbeater and Dean Russell have indicated that they wish to ask questions, so let us try to keep to time.
The Chair: And on the screen—[Interruption.] Uh-oh, it has frozen. We will have to come back to that. We will take evidence from the witnesses in the room until we have sorted out the problem with the screen.
Q
Andy Burrows: Thank you for the question. We think that more could be built into the Bill to ensure that children’s needs and voices can be fed into the regime.
One of the things that the NSPCC would particularly like to see is provision for statutory user advocacy arrangements, drawing on the examples that we see in multiple other regulated sectors, where we have a model by which the levy on the firms that will cover the costs of the direct regulation also provides for funded user advocacy arrangements that can serve as a source of expertise, setting out children’s needs and experiences.
A comparison here would be the role that Citizens Advice plays in the energy and postal markets as the user voice and champion. We think that would be really important in bolstering the regulatory settlement. That can also help to provide an early warning function—particularly in a sector that is characterised by very rapid technological and market change—to identify new and emerging harms, and bolster and support the regulator in that activity. That, for us, feels like a crucial part of this jigsaw.
Given the very welcome systemic approach of the regime, that early warning function is particularly important, because there is the potential that if harms cannot be identified quickly, we will see a lag where whole regulatory cycles are missed. User advocacy can help to plug that gap, meaning that harms are identified at an earlier stage, and then the positive design of the process, with the risk profiles and company risk assessments, means that those harms can be built into that particular cycle.
Dame Rachel de Souza: I was very pleased when the Government asked me, when I came into the role, to look at what more could be done to keep children safe online and to make sure that their voices went right through the passage of the Bill. I am committed to doing that. Obviously, as Children’s Commissioner, my role is to elevate children’s voices. I was really pleased to convene a large number of charities, internet safety organisations and violence against women and girls experts in a joint briefing to MPs to try to get children’s voices over.
I worry that the Bill does not do enough to respond to individual cases of abuse and that it needs to do more to understand issues and concerns directly from children. Children should not have to exhaust the platforms’ ineffective complaints routes, which can take days, weeks or even months. I have just conducted a survey of 2,000 children and asked them about their experiences in the past month. Of those 2,000 children, 50% had seen harmful content and 40% had tried to get content about themselves removed and had not succeeded. For me, there is something really important about listening to children and taking their complaints into account. I know you have a busy day, but that is the key point that I want to get across.
The Chair: Lynn Perry is back on the screen—welcome. Would you like to introduce yourself for the record and then answer the question? [Interruption.] Oh, she has gone again. Apparently the problem is at Lynn’s end, so we will just have to live with it; there is nothing we can do on this side.
Q
Andy Burrows: The systemic regime is important. That will help to ensure that the regime can be future-proofed; clearly, it is important that we are not introducing a set of proposals and then casting them in aspic. But there are ways that the Bill could be more strongly future-proofed, and that links to ensuring that the regime can effectively map on to the dynamics of the child sexual abuse problem in particular.
Let me give a couple of examples of where we think the Bill could be bolstered. One is placing a duty on companies to consider the cross-platform nature of harm when performing their risk assessment functions, together with a broad, overarching duty on companies to work together to tackle the child sexual abuse threat. That is very important given the current dynamics of the problem. We see, for example, very well-established grooming pathways, where abusers will look to exploit the design features of open social networks, such as Instagram or Snapchat, before moving children and abuse on to, perhaps, live-streaming sites or encrypted messaging sites.
The cross-platform nature of the threat is only going to intensify in the years ahead as we start to look towards the metaverse, for example. It is clear that the metaverse will be built on the basis of being cross-platform and interdependent in nature. We can also see the potential for unintended consequences from other regulatory regimes. For example, the Digital Markets Act recently passed by the EU has provisions for interoperability. That effectively means that if I wanted to send you a message on platform A, you could receive it on platform B. There is a potential unintended consequence there that needs to be mitigated; we need to ensure that there is a responsibility to address the harm potential that could come from more interoperable services.
This is a significant area where the Bill really can be bolstered to address the current dynamics of the problem and ensure that legislation is as effective as it possibly can be. Looking to the medium to long term, it is crucial to ensure that we have arrangements that are commensurate to the changing nature of technology and the threats that will emerge from that.
Dame Rachel de Souza: A simple answer from me: of course we cannot future-proof it completely, because of the changing nature of online harms and technology. I talked to a large number of 16 to 21-year-olds about what they wished their parents had known about technology and what they had needed to keep them safe, and they listed a range of things. No. 1 was age assurance—they absolutely wanted good age assurance.
However, the list of harms and things they were coming across—cyber-flashing and all this—is very much set in time. It is really important that we deal with those things, but they are going to evolve and change. That is why we have to build in really good cross-platform work, which we have been talking about. We need these tech companies to work together to be able to stay live to the issues. We also need to make sure that we build in proper advocacy and listen to children and deal with the issues that come up, and that the Bill is flexible enough to be able to grow in that way. Any list is going to get timed out. We need to recognise that these harms are there and that they will change.
The Chair: I will bring in Kim Leadbeater and then Maria Miller and Kirsty Blackman, but I will definitely bring in the Minister at 10.45 am.
Q
Dame Rachel de Souza: Baroness Kidron has done some fantastic work on this, and I really support her work. I want to tell you why. I am a former headteacher—I worked for 30 years in schools as a teacher and headteacher. Only in the last five or six years did I start seeing suicides of children and teenagers; I did not see them before. In the year just before I came to be Children’s Commissioner, there was a case of a year 11 girl from a vulnerable family who had a relationship with a boy, and it went all over the social media sites. She looked up self-harm material, went out to the woods and killed herself. She left a note that basically said, “So there. Look what you’ve done.”
It was just horrendous, having to pick up the family and the community of children around her, and seeing the long-term effects of it on her siblings. We did not see things like that before. I am fully supportive of Baroness Kidron and 5Rights campaigning on this issue. It is shocking to read about the enormous waiting and wrangling that parents must go through just to get their children’s information. It is absolutely shocking. I think that is enough from me.
Andy Burrows: I absolutely agree. One of the things we see at the NSPCC is the impact on parents and families in these situations. I think of Ian Russell, whose daughter Molly took her own life, and the extraordinarily protracted process it has taken to get companies to hand over her information. I think of the anguish and heartbreak that comes with this process. The Bill is a fantastic mechanism to be able to redress the balance in terms of children and families, and we would strongly support the amendments around giving parents access to that data, to ensure that this is not the protracted process that it currently all too often is.
Q
Just quickly, do coroners have sufficient powers? Should they have more powers to access digital data after the death of a child?
Andy Burrows: We can see what a protracted process it has been. There have been improvements to the process. It is currently a very lengthy process because of the mutual legal assistance treaty arrangements—MLAT, as they are known—by which injunctions have to be sought to get data from US companies. It has taken determination from some coroners to pursue cases, very often going up against challenges. It is an area where we think the arrangements could certainly be streamlined and simplified. The balance here should shift toward giving parents and families access to the data, so that the process can be gone through quickly and everything can be done to ease the heartbreak for families having to go through those incredibly traumatic situations.
Q
Dame Rachel de Souza: There is no silver bullet. This is now a huge societal issue and I think that some of the things that I would want to say would be about ensuring that we have in our educational arsenal, if you like, a curriculum that has a really strong digital media literacy element. To that end, the Secretary of State for Education has just asked me to review how online harms and digital literacy are taught in schools—reviewing not the curriculum, but how good the teaching is and what children think about how the subject has been taught, and obviously what parents think, too.
I would absolutely like to see the tech companies putting some significant funding into supporting education of this kind; it is exactly the kind of thing that they should be working together to provide. So we need to look at this issue from many aspects, not least education.
Obviously, in a dream world, I would like to see really good and strong digital media literacy in the Bill, but actually it is all our responsibility. I know from my conversations with Nadhim Zahawi that he is very keen that this subject is taught, and taught strongly, through the national curriculum.
Q
Katy Minshall: At present, we have a range of risk assessment processes. We have a risk committee of the board. We do risk assessments when we make a change about—
Q
Katy Minshall: At present, we do not have a specific individual designated to do the children’s risk assessment. The key question is how much Ofcom’s guidance on risk assessments—once we see it—will intersect with our current processes, versus the changes we would need to make to those processes.
Q
Katy Minshall: I would have to go away and review the Bill. I do not know whether a specific level is set out in the Bill, but we would want to engage with the regulation and requirements set for companies such as Twitter. However, it would be expected that that is what we would—
Q
Katy Minshall: Already, all the biggest decisions that we make as a company are signed off at the most senior level. We report to our chief executive, Parag Agrawal, and then to the board. As I say, there is a risk committee of the board, so I expect that we would continue to make those decisions at the highest level.
Ben Bradley: It is broadly the same from a TikTok perspective. Safety is a priority for every member of the team, regardless of whether they are in a specific trust and safety function. In terms of risk assessments, we will see from the detail of the Bill at what level they need to be signed off, but our CEO has been clear in interviews that trust and safety is a priority for him and everyone at TikTok, so it would be something to which we are all committed.