Online Safety Bill (Fifteenth sitting) Debate
Baroness Keeley (Labour - Life peer)
Public Bill Committees

It is a pleasure to serve under your chairmanship, Sir Roger. Clause 168 is a very short and straightforward clause. Ofcom will be required to publish a variety of documents under the Online Safety Bill. The clause simply requires that this be done in a way that is appropriate and likely to bring them to the attention of any audience likely to be affected by them. Ofcom is already familiar with this type of statutory obligation through existing legislation, such as the Digital Economy Act 2017, which places similar obligations on Ofcom. Ofcom is well versed in publishing documents in a way that is publicly accessible. Clause 168 puts that obligation on a clear statutory footing.
As the Minister said, clause 168 rightly sets out that the raw material the Bill requires of Ofcom is published in a way that will bring it to the attention of any audience likely to be affected by it. It will be important that all the guidance is published in a way that is easily available and accessible, including for people who are not neurotypical or who experience digital exclusion. I think we would all agree, after the work we have done on the Bill, that the subjects are complex and the landscape is difficult to understand. I hope Ofcom will make its documents as accessible as possible.
Question put and agreed to.
Clause 168 accordingly ordered to stand part of the Bill.
Clause 169
Service of notices
Question proposed, That the clause stand part of the Bill.
Clause 169 sets out the process for the service of any notice under the Bill, including notices to deal with child sexual exploitation and abuse or terrorism content, information notices, enforcement notices, penalty notices and public statement notices to providers of regulated services both within and outside the United Kingdom. The clause sets out that Ofcom may give a notice to a person by handing it to them, leaving it at the person’s last known address, sending it by post to that address or sending it by email to the person’s email address. It provides clarity regarding who Ofcom must give notice to in respect of different structures. For example, notice may be given to an officer of a body corporate.
As the Minister said, clause 169 sets out the process of issuing notices or decisions by Ofcom. It mostly includes provisions about how Ofcom is to contact the company, which seem reasonable. The Opposition do not oppose clause 169.
Question put and agreed to.
Clause 169 accordingly ordered to stand part of the Bill.
Clause 170
Repeal of Part 4B of the Communications Act
Question proposed, That the clause stand part of the Bill.
Clause 170 repeals the video-sharing platform regime. While the VSP and online safety regimes have similar objectives, the new framework in the Bill will be broader and will apply to a wider range of online platforms. It is for this reason that we will repeal the VSP regime and transition those entities regulated as VSPs across to the online safety regime, which is broader and more effective in its provisions. The clause simply sets out the intention to repeal the VSP regime.
Clause 171 repeals part 3 of the Digital Economy Act 2017. As we have discussed previously, the Online Safety Bill now captures all online sites that display pornography, including commercial pornography sites, social media sites, video-sharing platforms, forums and search engines. It will provide much greater protection to children than the Digital Economy Act. The Digital Economy Act was criticised for not covering social media platforms, which this Bill does cover. By repealing that part of the Digital Economy Act, we are laying the path to regulate properly and more comprehensively.
Finally, in this group, clause 172 amends section 1B of the Protection of Children Act 1978 and creates a defence to the offence of making an indecent photograph of a child for Ofcom, its staff and those assisting Ofcom in exercising its online safety duties. Clearly, we do not want to criminalise Ofcom staff while they are discharging their duties under the Bill that we are imposing on them, so it is reasonable to set out that such a defence exists. I hope that provides clarity to the Committee on the three clauses.
The provisions in clauses 170 to 172, as the Minister has said, repeal or amend existing laws for the purposes of the Bill. As Labour supports the need to legislate on the issue of online safety, we will not oppose the clauses. However, I want to note that the entire process, up until the final abandonment of part 3 of the Digital Economy Act under clause 171, appears shambolic. It has been five years now since that part of the Act could have been implemented, which means five years during which children could have been better protected from the harms of pornographic content.
When the Government eventually admitted that part 3 was being ditched, the Minister at the time, the hon. Member for Boston and Skegness (Matt Warman), said that the Government would seek to take action on pornography more quickly than on other parts of the online harms regime. Stakeholders and charities have expressed concerns that we could now see a delay to the implementation of the duties on pornographic content providers, which is similar to the postponement and eventual abandonment of part 3 of the Digital Economy Act. I appreciate that the Minister gave some reassurance of his
“desire to get this done as quickly as possible”—[Official Report, Online Safety Bill Committee, 9 June 2022; c. 308.]
in our debate on clauses 31 to 33, but would it not be better to set out timeframes in the Bill?
Under clause 193, it appears that the only clauses in part 5 to be enacted once the Bill receives Royal Assent will be the definitions—clause 66 and clause 67(4)—and not the duties. That is because Ofcom is expected to issue a call for evidence, after which draft proposals for consultation are published, which then need to be agreed by the Secretary of State and laid before Parliament. There are opportunities there for delays and objections at any stage and, typically, enforcement will be implemented only in a staged fashion, from monitoring to supervision. The consultations and safeguarding processes are necessary to make the guidance robust; we understand that. However, children cannot wait another three years for protections, having been promised protection under part 3 of the Digital Economy Act five years ago, which, as I have said, was never implemented.
The provisions on pornography in part 5 of the Bill require no secondary legislation so they should be implemented as quickly as possible to minimise the amount of time children continue to be exposed to harmful content. It would be irresponsible to wait any longer than absolutely necessary, given the harms already caused by this drawn-out process.
Thank you, Sir Roger, for chairing this meeting this morning. I want to agree with the Opposition’s points about the timing issue. If an Act is to repeal another, it needs to ensure that there is no gap in between: if the repeal takes effect on one day, the Bill’s provisions that relate to it should be in force and working on the same day, rather than leaving a potential set-up time gap.
On clause 170 and repealing the part of the Communications Act 2003 on video-sharing platform services, some concerns have been raised that the requirements in the Online Safety Bill do not exactly mirror the provisions in the video-sharing platform rules. I am not saying necessarily or categorically that the Online Safety Bill is less strong than the video-sharing platform rules currently in place. However, if the legislation on video-sharing platform services is repealed and the Online Safety Act, as it will be, becomes the main way of regulating video-sharing platforms, there is a risk of a degradation in the protections provided on those platforms and an increase in some of the issues and concerns we have seen raised. Will the Minister keep that under review and consider how that could be improved? We do not want to see this getting worse simply because one regime has been switched for another that, as the Minister said, is broader and has stronger protections. Will he keep under review whether that turns out to be the case when the Act has bedded in, when Ofcom has the ability to take action and properly regulate—particularly, in this case, video-sharing platforms?
So far as I am aware, all the costs incurred by Ofcom in relation to the duties in the Bill can be recouped by way of fees. If that is not correct, I will write to the hon. Lady saying so, but my understanding is that any relevant Ofcom cost will be within the scope of the fees.
Question put and agreed to.
New clause 42 accordingly read a Second time, and added to the Bill.
New Clause 43
Payment of sums into the Consolidated Fund
“(1) Section 400 of the Communications Act (destination of penalties etc) is amended as follows.
(2) In subsection (1), after paragraph (i) insert—
‘(j) an amount paid to OFCOM in respect of a penalty imposed by them under Chapter 6 of Part 7 of the Online Safety Act 2022;
(k) an amount paid to OFCOM in respect of an additional fee charged under Schedule (Recovery of OFCOM’s initial costs) to the Online Safety Act 2022.’
(3) In subsection (2), after ‘applies’ insert ‘(except an amount mentioned in subsection (1)(j) or (k))’.
(4) After subsection (3) insert—
‘(3A) Where OFCOM receive an amount mentioned in subsection (1)(j) or (k), it must be paid into the Consolidated Fund of the United Kingdom.’
(5) In the heading, omit ‘licence’.”—(Chris Philp.)
This new clause provides that additional fees charged to providers under NS2 must be paid into the Consolidated Fund. The Bill already provided that monetary penalties must be paid into the Consolidated Fund, and those provisions are now placed in this clause.
Brought up, read the First and Second time, and added to the Bill.
New Clause 3
Establishment of Advocacy Body
“(1) There is to be a body corporate (‘the Advocacy Body’) to represent interests of child users of regulated services.
(2) A ‘child user’—
(a) means any person aged 17 years or under who uses or is likely to use regulated internet services; and
(b) includes both any existing child user and any future child user.
(3) The work of the Advocacy Body may include—
(a) representing the interests of child users;
(b) the protection and promotion of these interests;
(c) any other matter connected with those interests.
(4) The ‘interests of child users’ means the interest of children in relation to the discharge by any regulated company of its duties under this Act, including—
(a) safety duties about illegal content, in particular CSEA content;
(b) safety duties protecting children;
(c) ‘enforceable requirements’ relating to children.
(5) The Advocacy Body must have particular regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010.
(6) The Advocacy Body will be defined as a statutory consultee for OFCOM’s regulatory decisions which impact upon the interests of children.
(7) The Secretary of State may appoint an organisation known to represent children to be designated the functions under this Act, or may create an organisation to carry out the designated functions.”—(Barbara Keeley.)
This new clause creates a new advocacy body for child users of regulated internet services.
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.
New clause 3 would make provision for a statutory user advocacy body representing the interests of children. It would also allow the Secretary of State to appoint a new or existing body as the statutory user advocate. A strong, authoritative and well-resourced voice that can speak for children in regulatory debates would ensure that complex safeguarding issues are well understood, and would also actively inform the regulator’s decisions.
Charities have highlighted that the complaints and reporting mechanisms in the Bill may not always be appropriate for children. Ofcom’s own evidence shows that only 14% of 12 to 15-year-old children have ever reported content. Children who are most at risk of online harms may find it incredibly challenging to complete a multi-stage reporting and complaints process. Dame Rachel de Souza told the Committee:
“I worry that the Bill does not do enough to respond to individual cases of abuse and that it needs to do more to understand issues and concerns directly from children. Children should not have to exhaust the platforms’ ineffective complaints routes, which can take days, weeks or even months. I have just conducted a survey of 2,000 children and asked them about their experiences in the past month. Of those 2,000 children, 50% had seen harmful content and 40% had tried to get content about themselves removed and had not succeeded. For me, there is something really important about listening to children and taking their complaints into account.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]
A children’s advocacy body would be able to support children in navigating redress mechanisms that are fundamentally targeted at adults. Given how many children now use the internet, that is an essential element that is missing from the Bill. That is why the super-complaints mechanism needs to be strengthened with specific arrangements for children, as advocated by the National Society for the Prevention of Cruelty to Children and other children’s organisations. A statutory user advocacy body could support the regulator, as well as supporting child users. It would actively promote the interests of children in regulatory decision making and offer support by ensuring that an understanding of children’s behaviour and safeguarding is front and centre in its approach.
My hon. Friend is making a really valid point. As I look around the room—I mean this with no disrespect to anybody—I see that we are all of an age at which we do not understand the internet in the same way that children and young people do. Surely, one of the key purposes of the Bill is to make sure that children and young people are protected from harms online, and as the Children’s Commissioner said in her evidence, their voices have to be heard. I am sure that, like me, many Members present attend schools as part of their weekly constituency visits, and the conversations we have with young people are some of the most empowering and important parts of this job. We have to make sure that the voices of the young people who we all represent are heard in this important piece of legislation, and it is really important that we have an advocacy body to ensure that.
I very much agree with my hon. Friend. She is quite right: we have to remember that we do not see these things as children and young people do.
The user advocacy body that my hon. Friend has just spoken in support of could also shine a light on the practices that are most harmful to children by using data, evidence and specialist expertise to point to new and emerging areas of harm. That would enable the regulator to ensure its risk profiles and regulatory approach remain valid and up to date. In his evidence, Andy Burrows of the NSPCC highlighted the importance of an advocacy body acting as an early warning system:
“Given the very welcome systemic approach of the regime, that early warning function is particularly important, because there is the potential that if harms cannot be identified quickly, we will see a lag where whole regulatory cycles are missed. User advocacy can help to plug that gap, meaning that harms are identified at an earlier stage, and then the positive design of the process, with the risk profiles and company risk assessments, means that those harms can be built into that particular cycle.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]
The provision in the new clause is comparable to those that already exist in many other sectors. For example, Citizens Advice is the statutory user advocate for consumers of energy and the postal services, and there are similar arrangements representing users of public transport. Establishing a children’s user advocacy body would ensure that the most vulnerable online users of all—children at risk of online sexual abuse—receive equivalent protections to customers of post offices or passengers on a bus.
The hon. Lady will recall the issue that I raised earlier in the Committee’s deliberations, regarding the importance of victim support that gives people somewhere to go other than the platforms. I think that is what she is now alluding to. Does she not believe that the organisations that are already in place, with the right funding—perhaps from the fines coming from the platforms themselves—would be in a position to do this almost immediately, and that we should not have to set up yet another body, or have I misunderstood what she has said?
I do not think that the right hon. Lady has misunderstood what I said. I said that the new clause would allow the Secretary of State to appoint a new or existing body as the statutory user advocate, so it could very much be either.
New clause 3 would also rebalance the interests of children against the vocal and well-resourced regulated companies. I think that is a key argument for having an advocacy body. Without such a counterbalance, large tech companies could attempt to capture independent expert voices, fund highly selective research with the intent to skew the evidence base, and then challenge regulatory decisions with the evidence base they have created.
Those tactics are not new; similar tactics are used in other regulated sectors, such as the tobacco industry. In line with other sectors, the user advocacy body should be funded by a levy on regulated companies. That would be in line with the “polluter pays” principle in part 6 and would be neutral to the Exchequer—another reason to accept it. Compared with the significant benefits and improved outcomes it would create, the levy would represent only a minimal additional burden on companies.
There is strong support for the creation of a user advocate. Research by the NSPCC shows that 88% of UK adults who responded to a YouGov survey think that it is necessary for the Bill to introduce a requirement for an independent body that can protect the interests of children at risk of online harms, including grooming and child sexual abuse.
It is also a popular option among children. YoungMinds has said that young people do not feel they are being included enough in the drafting of the Bill. It evidenced that with research it undertook, which found that almost 80% of young people aged 11 to 25 surveyed had never even heard of the Bill.
A young woman told the NSPCC why she felt a children’s advocacy body is needed. She is a survivor of online grooming, and it is worth sharing what she said in full, because it is powerful and we have not shared the voices of young people enough. She said:
“When I was 13, a man in his 30s contacted me on Facebook. I added him because you just used to add anyone on Facebook. He started messaging me and I liked the attention. We’d speak every day, usually late at night for hours at a time…He started asking for photos, so I sent some. Then he asked for some explicit photos, so I did that too, and he reciprocated…In my eyes, telling anyone in my life about this man was not an option. We need to stop putting the responsibility on a vulnerable child to prevent crime and start living in a world which puts keeping children safe first. That means putting child safety at the heart of policy. I want a statutory child user advocacy body funded by the industry levy. This would play a vital role in advocating for children’s rights in regulatory debates. Being groomed made me feel incredibly vulnerable, isolated, and weak. I felt I had no one who was on my side. Having a body stand up for the rights of children in such a vulnerable position is invaluable…it is so rare that voices like mine have a chance to be heard by policy makers. Watching pre legislative debates I’ve been struck by how detached from my lived experience they can be”—
that is very much the point that my hon. Friend the Member for Batley and Spen made—
“and indeed the lived experiences of thousands of others. If we want to protect children, we need to understand and represent what they need.”
I hope that the Committee will recognise the bravery of that young woman in speaking about her experiences as a survivor of online grooming. I hope that the Minister will respect the insights she offers and consider the merits of having a user advocacy body to support children and young people experiencing harms online.
I read new clause 3 in conjunction with the starred new clause 44, because it makes sense to consider the funding of the advocacy body, and the benefits of that funding, when discussing the merits of such a body. Part of that is because the funding of the advocacy body, and the fact that it needs to be funded, is key to its operation, and a key reason why we need it.
The new clause asks for an additional body. It is not saying the Children’s Commissioners should be done away with. The Children’s Commissioners do an amazing job, as we have recognised, but the No. 1 priority, certainly for the Children’s Commissioner in Scotland, is to protect the human rights of children; it is not to protect children online, which is what the user advocacy body would do. The body would specifically give the benefit of its experience and use its resources, time and energy to advocate on behalf of children, between Ofcom, children and children’s organisations and groups.
The Minister is right that the Bill takes massive steps forward in protecting children online, and he is right that the Children’s Commissioners do a very good job. The work done by the Children’s Commissioners in giving us evidence on behalf of children and children’s organisations has been incredibly powerful and incredibly helpful, but there is still a layer missing. If this Bill is to be future-proof, if it is to work and if it is not to put an undue burden on charitable organisations, we need a user advocacy body. The Minister needs to consider that.
I appreciate that the Government provide money to victim support organisations, which is great, but I am also making a case about potential victims. If the money only goes to those who support people who have already been harmed, it will not allow them to advocate to ensure that more people are not harmed. It will allow them to advocate on behalf of those who have been harmed—absolutely—but it will not effectively tackle potential and emerging harms. It is a key place where the Bill misses out. I am quite disappointed that the Minister has not recognised that something may be lacking and is so keen to defend his position, because it seems to me that the position of the Opposition is so obviously the right one.
I wholeheartedly agree with what the hon. Member for Aberdeen North just said, but I wish to emphasise some elements because it seems to me that the Minister was not listening, although he has listened to much that has been said. I made some specific points, used quotes and brought forward some evidence. He feels that children have been consulted in the drafting of the Bill; I cited a YoungMinds survey that showed that that was very much not what young people feel. YoungMinds surveyed a large group of young people and a very large proportion of them had not even heard of the Bill.
The evidence of the young survivor of online grooming was very powerful. She very much wanted a user-advocacy body and spoke strongly about that. The Minister is getting it wrong if he thinks that somebody in that situation, who has been groomed, would go to a parent. The quote that I cited earlier was:
“Being groomed made me feel incredibly vulnerable, isolated, and weak. I felt I had no one who was on my side.”
There were clearly adults in her life she could have gone to, but she did not because she was in that vulnerable position—a position of weakness. That is why some kind of independent advocacy body for children is so important.
I do not think children and young people do feel consulted about the Bill because the organisations and charities are telling us that. I join all Opposition Members in supporting and paying tribute to the remarkable job that the Children’s Commissioner does. I quoted her setting out her worries about the Bill. I quoted her saying that
“the Bill does not do enough to respond to individual cases of abuse and that it needs to do more to understand issues and concerns directly from children.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]
That is what she said. She did not say, “I’m the person charged with doing this. I’m the person who has the resource and my office has the resource.”
I hope that I did not in any way confuse the debate earlier, because these two things are very separate. The idea of a user-advocacy service and individual victim support are two separate issues. The Minister has already taken up the issue of victim support, which is what the Children’s Commissioner was talking about, but that is separate from advocacy, which is much broader and not necessarily related to an individual problem.
Indeed, but the Children’s Commissioner was very clear about certain elements being missing in the Bill, as is the NSPCC and other organisations. It is just not right for the Minister to land it back with the Children’s Commissioner as part of her role, because she has to do so many other things. The provisions in the Bill in respect of a parent or adult assisting a young person in a grooming situation are a very big concern. The Children’s Commissioner cited her own survey of 2,000 children, a large proportion of whom had not succeeded in getting content about themselves removed. From that, we see that she understands that the problem exists. We will push the new clause to a Division.
Question put, That the clause be read a Second time.