All 11 Debates between Kirsty Blackman and Baroness Keeley


Online Safety Bill (Seventeenth sitting)

Debate between Kirsty Blackman and Baroness Keeley
Committee stage
Tuesday 28th June 2022


Public Bill Committees
Online Safety Act 2023. Amendment Paper: Public Bill Committee Amendments as at 28 June 2022.
Kirsty Blackman (Aberdeen North) (SNP)

I beg to move, That the clause be read a Second time.

This is another attempt to place a higher bar and more requirements on regulated services that are likely to cause the most serious risks of harm. The Minister has consistently said that he is keen to consider regulating the companies and platforms that have the highest potential risk of harm more strictly than the normal regime would allow. Some of the platforms would not be category 1 on the basis that they have a small number of members, but the potential for harm—radicalisation, extremism, severe damage to people or extreme pornography—is very high.

I am not yet happy that the Minister has provided an adequate answer to the question about the regulation of the highest-risk platforms that do not meet the category 1 thresholds. If he is unwilling to accept this amendment or any of the other amendments tabled by the Opposition on this specific issue, I hope that he will give consideration to a Government amendment on Report or when the Bill goes through the House of Lords in order that this loose end can be tied up.

As I have said before—I do not want to go too much over comments that I have made previously—it is reasonable for us to have a higher bar and a stricter regulation regime for specific platforms that Ofcom will easily be able to identify and that create the highest harm. Again, as I have said, this is another way of going about it. The new clause suggests that if Ofcom assesses that a service poses a very high risk of harm, it might, notwithstanding the categorisation of that service, require it to perform the children’s risk assessment duties and the safety duties protecting children. This is specifically about the children’s risk assessment.

I have previously raised concerns about not being able to accurately assess the number of child users that a service has. I am still not entirely comfortable that platforms will be able to accurately assess the number of child users they have, and therefore they might not be subject to the child user requirements, because they have underplayed or understated the number of children using their service, or because there are only a few hundred children using the service, which is surely massively concerning for the wellbeing of those few hundred children.

I hope the Minister can give us some comfort that he is not just considering what action to take, but that he will take some sort of action on Report or when the Bill proceeds through the House of Lords.

Barbara Keeley (Worsley and Eccles South) (Lab)

It is a pleasure to serve with you in the Chair again, Ms Rees. I rise to speak in support of new clause 27.

We have argued that the Government’s approach to categorising services fails to take account of the harms that could result from smaller services. I understand that a risk-based approach rather than a size-based approach is being considered, and that is welcome. The new clause would go some way to improving the categorisation of services as it stands. It is critical that there are ways for Ofcom to assess companies’ risk of harm to users and to place additional duties on them even when they lie outside the category to which they were initially assigned. Ofcom should be able to consult any organisation that it sees fit to consult, including user advocacy groups and civil society, in assessing whether a service poses

“a very high risk of harm”.

Following that, Ofcom should have powers to deliver the strictest duties on companies that expose adults to the most dangerous harms. That should always be proportionate to the risk of harm.

Labour supports the new clause and the arguments made by the hon. Member for Aberdeen North.

--- Later in debate ---
Kirsty Blackman

I beg to move, That the clause be read a Second time.

The new clause attempts to address an asymmetry in the Bill in relation to the lack of user empowerment features for child users. As far as I am aware, there is no requirement for user empowerment functions for child users in the Bill. The new clause would require that if a service has to have user empowerment features in place for adults, then

“OFCOM may require a service to provide equivalent features designed specifically for child users.”

Ofcom would be able then to provide guidance on how those user empowerment features for child users would work.

This provision is especially important for the fairly small number of platforms and providers that are very much aimed at children, and where the vast majority of users are children. We are not talking about Facebook, for example, although if Facebook did have child user empowerment, it would be a good thing. I am thinking about organisations and games such as Roblox, which is about 70% children; Fortnite, although it has quite a lot of adult users too; and Minecraft, which has significant numbers of child users. On those platforms that are aimed at children, not having a child-centred, child-focused user empowerment requirement is an oversight. It is missing from the Bill.

It is important that adults have the ability to make privacy choices about how they use sites and to make choices about some of the content that they can see on a site by navigating the user empowerment functions that exist. But it is also important for children to have that choice. I do not see why adults should be afforded that level of choice and flexibility over the way that they use platforms and the providers that they engage with, but children should not. We are not just talking here about kids who are eight: we are talking about children far older, and for whom adult-centred, adult-written user empowerment functions may not be the best option or as easy to access as ones that are specifically focused on and designed for children.

I have had a discussion with the National Society for the Prevention of Cruelty to Children about the user empowerment functions for child users. We have previously discussed the fact that complaints features have to be understandable by the users of services, so if the Minister is unwilling to accept the new clause, will he give some consideration to what happens when the provider of the platform is marketing that platform to children?

The Roblox website is entirely marketed as a platform for children. It is focused in that way, so will the Minister consider whether Ofcom should be able to require differential user empowerment functions, particularly in cases where the overwhelming majority of users are children? Also, it would not be beyond the wit of man for platforms such as Facebook to have two differential user empowerment functions based on whether somebody is under the age of 18—whether they are a child or an adult—because users tell Facebook their date of birth when signing up. We have talked a lot about age verification and the ways in which that could work.

I would appreciate it if the Minister would consider this important matter. It is something that is lacking at the moment, and we are doing our children a disservice by not providing them with the same functionality that we are providing, or requiring, for adult users.

Barbara Keeley

Labour argued in favour of greater empowerment provisions for children during the debate on new clause 3, which would have brought in a user advocacy body for children. YoungMinds has pointed out that many young people are unaware of the Bill, and there has been little engagement with children regarding its design. I am sure members of the Committee would agree that the complexity of the Bill is evidence enough of that.

New clause 28 would make the online world more accessible for children and increase their control over the content they see. We know that many children use category 1 services, so they should be entitled to the same control over harmful content as adults. As such, Labour supports the new clause.

--- Later in debate ---
Kirsty Blackman

I beg to move, That the clause be read a Second time.

I mentioned this in earlier consideration. The issue was raised with me by Mencap, specifically in relation to the people it represents who have learning disabilities and who have a right to access the internet just as we all do. They should be empowered to use the internet with a level of safety and be able to access complaints, to make content reports and to use the user empowerment functions. Everybody who is likely to use the platforms should be able to access and understand those functions.

Will the Minister make it clear that he expects Ofcom, when drafting guidance about the user empowerment functions and their accessibility, the content reporting and the complaints procedures, to consult people about how those things work? Will he make it clear that he hopes Ofcom will take into account the level of accessibility? This is not just about writing things in plain English—or whatever that campaign is about writing things in a way that people can understand—it is about actually speaking to groups that represent people with learning disabilities to ensure that content reporting, the empowerment functions and the complaints procedures are accessible, easy to find and easy to understand, so that people can make the complaints that they need to make and can access the internet on an equal and equitable basis.

Barbara Keeley

I rise to speak in support of the new clause. Too often people with learning disabilities are left out of discussions about provisions relevant to them. People with learning disabilities are disproportionately affected by online harms and can receive awful abuse online.

At the same time, Mencap has argued that social media platforms enable people with learning disabilities to develop positive friendships and relationships. It is therefore even more important that people with learning disabilities do not lose out on the features described in clause 14, which allow them to control the content to which they are exposed. It is welcome that clauses 17, 18, 27 and 28 specify that reporting and complaints procedures must be easy to access and use.

The Bill, however, should go further to ensure that the duties on complaints and reporting explicitly cater to adults with learning disabilities. In the case of clause 14 on user empowerment functions, it must be made much clearer that those functions are easy to access and use. The new clause would be an important step towards ensuring that the Bill benefits everyone who experiences harms online, including people with learning disabilities. Labour supports the new clause.

--- Later in debate ---
Barbara Keeley

That is fine, but I have a further point to make. The new clause would be very important to all those people who support people with learning disabilities. So many of the services that people use do not take account of people’s learning disabilities. I have done a huge amount of work to try to support people with learning disabilities over the years. This is a very important issue to me.

There are all kinds of good examples, such as easy-read versions of documents, but the Minister said when batting back this important new clause that the expression “all adult users” includes people with learning disabilities. That is not the case. He may not have worked with a lot of people with learning disabilities, but they are excluded from an awful lot. That is why I support making that clear in the Bill.

We on the Opposition Benches say repeatedly that some things are not included by an all-encompassing grouping. That is certainly the case here. Some things need to be said for themselves, such as violence against women and girls. That is why this is an excellent new clause that we support.

Kirsty Blackman

I thank the Minister, particularly for providing the clarification that I asked for about who is likely to be consulted or taken into account when Ofcom is writing the codes of practice. Notwithstanding that, and particularly given the rather excellent speech from the shadow Minister, the hon. Member for Worsley and Eccles South, I am keen to press the new clause to a vote.

Question put, That the clause be read a Second time.

--- Later in debate ---
Barbara Keeley

I will take this opportunity, as my hon. Friend has done, to add a few words of thanks. She has already thanked all the people in this place who we should be thanking, including the Clerks, who have done a remarkable job over the course of our deliberations with advice, drafting, and support to the Chair. I also thank the stakeholder organisations. This Bill is uniquely one in which the stakeholders—the children’s charities and all those other organisations—have played an incredible part. I know from meetings that they have already advertised that those organisations will continue playing that part over the coming weeks, up until Report. It has been fantastic.

Finally, I will mention two people who have done a remarkable amount of work: my researcher Iona and my hon. Friend’s researcher Freddie, who have done a huge amount to help us prepare speaking notes. It is a big task, because this is a complex Bill. I add my thanks to you, Ms Rees, for the way you have chaired this Committee. Please thank Sir Roger on our behalf as well.

Kirsty Blackman

Seeing as we are not doing spurious points of order, I will also take the opportunity to express our thanks. The first one is to the Chairs: thank you very much, Ms Rees and Sir Roger, for the excellent work you have done in the Chair. This has been a very long Bill, and the fact that you have put up with us for so long has been very much appreciated.

I thank all the MPs on the Committee, particularly the Labour Front-Bench team and those who have been speaking for the Labour party. They have been very passionate and have tabled really helpful amendments—it has been very good to work with the Labour team on the amendments that we have put together, particularly the ones we have managed to agree on, which is the vast majority. We thank Matt Miller, who works for my hon. Friend the Member for Ochil and South Perthshire. He has been absolutely wonderful. He has done an outstanding amount of work on the Bill, and the amazing support that he has given us has been greatly appreciated. I also thank the Public Bill Office, especially for putting up with the many, many amendments we submitted, and for giving us a huge amount of advice on them.

Lastly, I thank the hundreds of organisations that got in touch with us, and the many people who took the time to scrutinise the Bill, raise their concerns, and bring those concerns to us. Of those hundreds of people and organisations, I particularly highlight the work of the National Society for the Prevention of Cruelty to Children. Its staff have been really helpful to work with, and I have very much appreciated their advice and support in drafting our amendments.

Online Safety Bill (Sixteenth sitting)

Debate between Kirsty Blackman and Baroness Keeley
Committee stage
Tuesday 28th June 2022


Public Bill Committees
Barbara Keeley (Worsley and Eccles South) (Lab)

I beg to move, That the clause be read a Second time.

Good morning, Sir Roger. The new clause would require regulated companies to disclose proactively to the regulator material changes in their operations that may impact on safety, and any significant breaches of their safety duties. Category 1 services should be under regulatory duties to disclose proactively to the regulator matters about which it could reasonably expect to be informed. For example, companies should notify Ofcom about significant changes to their products and services, or to their moderation arrangements, that may impact on the child abuse threat and the company’s response to it. A similar proactive duty already applies in the financial services sector. The Financial Conduct Authority handbook states:

“A firm must deal with its regulators in an open and cooperative way, and must disclose to the FCA appropriately anything relating to the firm of which that regulator would reasonably expect notice.”

The scope of the duty we are suggesting could be drawn with sufficient clarity so that social media firms properly understand their requirements and companies do not face unmanageable reporting burdens. Such companies should also be subject to red flag disclosure requirements, whereby they would be required to notify the regulator of any significant lapses in, or changes to, systems and processes that compromise children’s safety or could put them at risk. For example, if regulation had been in place over the last 12 months, Facebook might reasonably have been expected to report on the technology and staffing issues to which it attributes its reduced detection of child abuse content.

Experience from the financial services sector demonstrates the importance of disclosure duties as a means of regulatory intelligence gathering. Perhaps more importantly, they provide a useful means of hard-wiring regulatory compliance into company decisions on the design and operation of their sites.

Kirsty Blackman (Aberdeen North) (SNP)

Thank you for chairing this meeting, Sir Roger. I have a quick question for the Minister that relates to the new clause, which is a reasonable request for a duty on providers to disclose information to Ofcom. We would hope that the regulator had access to that information, and if companies are making significant changes, it is completely reasonable that they should have to tell Ofcom.

I do not have any queries or problems with the new clause; it is good. My question for the Minister is—I am not trying to catch anyone out; I genuinely do not know the answer—if a company makes significant changes to something that might impact on its safety duties, does it have to do a new risk assessment at that point, or does it not have to do so until the next round of risk assessments? I do not know the answer, but it would be good if the direction of travel was that any company making drastic changes that massively affected security—for example, Snapchat turning on the geolocation feature when it did an update—would have to do a new risk assessment at that point, given that significant changes would potentially negatively impact on users’ safety and increase the risk of harm on the platform.

--- Later in debate ---
Barbara Keeley

I beg to move, That the clause be read a Second time.

Throughout these debates it has been clear that we agree on both sides that the Online Safety Bill must be a regime that promotes the highest levels of transparency. This will ensure that platforms can be held accountable for their systems and processes. Like other regulated industries, they must be open and honest with the regulator and the public about how their products work and how they keep users safe.

As we know, platforms duck and dive to avoid sharing information that could make life more difficult for them or cast them in a dim light. The Bill must give them no opportunity to shirk their responsibilities. The Bill enables the largest platforms to carry out a risk assessment safe in the knowledge that it may never see the light of day. Ofcom can access such information if it wants, but only following a lengthy process and as part of an investigation. This creates no incentive for platforms to carry out thorough and proper risk assessments. Instead, platforms should have to submit these risk assessments to Ofcom not only on request but as a matter of course. Limiting this requirement to only the largest platforms will not overload Ofcom, but will give it the tools and information it needs to oversee an effective regime.

In addition, the public have a right to know the risk profile of the services they use. This happens in all other regulated industries, with consumers having easy access to the information they need to make informed decisions about the products they use. At present, the Bill does not give users the information they deserve about what to expect online. Parents in particular will be empowered by information about the risk level of platforms their children use. Therefore, it is imperative that risk assessments are made publicly available, as well as submitted to the regulator as a matter of course.

Kirsty Blackman

I have a couple of comments on the point about parental empowerment. I have been asked by my children for numerous apps. I have a look at them and think, “I don’t know anything about this app. I have never seen or heard of it before, and I have no idea the level of user-to-user functionality in this app.” Nowhere is there a requirement for this information to be set out. There is nowhere that parents can easily find this information.

With iPhones, if a kid wants an app, they have to request it from their parent, and their parent needs to approve whether or not they get it. I find myself baffled by some of them because they are not ones that I have ever heard of or come across. To find out whether they have that level of functionality, I have to download and use the app myself in the way that, hopefully, my children would use it, in order to find out whether it is safe for them.

A requirement for category 1 providers to be up front and explain the risks and how they manage them, and even how people interact with their services, would increase the ability of parents to be media literate. We can be as media literate as we like, but if the information is not there and we cannot find it anywhere, we end up having to make incredibly restrictive decisions in relation to our children’s ability to use the internet, which we do not necessarily want to make. We want them to be able to have fun, and the information being there would be very helpful, so I completely agree on that point.

My other point is about proportionality. The Opposition moved new clause 4, relating to risk assessments, and I did not feel able to support it on the basis of the arguments that the Minister made about proportionality. He made the case that Ofcom would receive 25,000 risk assessments and would be swamped by the number that it might receive. This new clause balances that, and has the transparency that is needed.

It is completely reasonable for us to put the higher burden of transparency on category 1 providers and not on other providers because they attract the largest market share. A huge percentage of the risk that might happen online happens with category 1 providers, so I am completely happy to support this new clause, which strikes the right balance. It answers the Minister’s concerns about Ofcom being swamped, because only category 1 providers are affected. Asking those providers to put the risk assessment on their site is the right thing to do. It will mean that there is far more transparency and that people are better able to make informed decisions.

--- Later in debate ---
Kirsty Blackman

I beg to move, That the clause be read a Second time.

I tabled new clause 17 in relation to protected characteristics because of some of the points made by Danny Stone. I missed the relevant evidence session because unfortunately, at the time, I was in the Chamber, responding to the Chancellor of the Exchequer. I am referring to some of the points made by Danny Stone in the course of the evidence session in relation to the algorithmic prompts that there are in search functions.

We have an issue with search functions, and specifically with the algorithmic prompts they offer. There is an issue if someone types in something potentially derogatory relating to a person with a protected characteristic. For example, if someone were to type “Jews are”, the results that they get with those algorithmic prompts can be overwhelmingly racist, overwhelmingly antisemitic, overwhelmingly discriminatory. The algorithm should not be pushing those things.

To give organisations like Google some credit, if something like that is highlighted to them, they will address it. Some of them take a long time to sort it, but they will have a look at it, consider sorting it and, potentially, sort it. But that is not good enough. By that point, the damage is done. By that point, the harm has been put into people’s minds. By that point, someone who is from a particular group and has protected characteristics has already seen that Google—or any other search provider—is pushing derogatory terms at people with protected characteristics.

I know that the prompts work like that because of artificial intelligence; firms are not intentionally writing these terms in order to push them towards people, but the AI allows that to happen. If such companies are going to be using artificial intelligence—some kind of software algorithm—they have a responsibility to make sure that none of the content they are generating on the basis of user searches is harmful. I asked Google about this issue during one of our evidence sessions, and the response they gave was, “Oh, algorithmic prompts are really good, so we should keep them”—obviously I am paraphrasing. I do not think that is a good enough argument. I do not think the value that is added by algorithmic prompts is enough to counter the harm that is caused by some of those prompts.

As such, the new clause specifically excludes protected characteristics from any algorithm that is used in a search engine. The idea is that if a person starts to type in something about any protected characteristic, no algorithmic prompt will appear, and they will just be typing in whatever they were going to type in anyway. They will not be served with any negative, harmful, discriminatory content, because no algorithmic prompt will come up. The new clause would achieve that across the board for every protected characteristic term. Search engines would have to come up with a list of such terms and exclude all of them from the work of the algorithm in order to provide that layer of protection for people.

I do not believe that that negative content could be in any way balanced by the potential good that could arise from somebody being able to type “Jews are” and getting a prompt that says “funny”. That would be a lovely, positive thing for people to see, but the good that could be caused by those prompts is outweighed by the negativity, harm and pain that is caused by the prompts we see today, which platforms are not quick enough to act on.

As I say, the harm is done by the time the report is made; by the time the concern is raised, the harm has already happened. New clause 17 would prevent that harm from ever happening. It would prevent anybody from ever being injured in any way by an algorithmic prompt from a search engine. That is why I have tabled that new clause, in order to provide a level of protection for any protected characteristic as defined under the Equality Act 2010 when it comes to search engine prompts.

Barbara Keeley

The problem underlying the need for this new clause is that under the Bill, search services will not have to address or risk assess legal harm to adults on their sites, while the biggest user-to-user services will. As Danny Stone of the Antisemitism Policy Trust told us in evidence, that includes sites such as Google and Microsoft Bing, and voice search assistants including Amazon’s Alexa and Apple’s Siri. Search services rightly highlight that the content returned by a search is not created or published by them, but as the hon. Member for Aberdeen North has said, algorithmic indexing, promotion and search prompts provided in the search bar are their responsibility. As she has pointed out, and as we have heard in evidence sessions, those algorithms can cause significant harm.

Danny Stone told us on 26 May:

“Search returns are not necessarily covered because, as I say, they are not the responsibility of the internet companies, but the systems that they design as to how those things are indexed and the systems to prevent them going to harmful sites by default are their responsibility, and at present the Bill does not address that.”––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 130, Q207.]

The hon. Member for Aberdeen North mentioned the examples from Microsoft Bing that Danny gave in his evidence—“Jews are” and “gays are”. He gave other examples of answers that were returned by search services, such as using Amazon Alexa to search, “Is George Soros evil?” The response was, “Yes, he is.” “Are the White Helmets fake?” “Yes, they are set up by an ex-intelligence officer.” The issue is that the search prompts that the hon. Member has talked about are problematic, because just one person giving an answer to Amazon could prompt that response. The second one, about the White Helmets, was a comment on a website that was picked up. Clearly, that is an issue.

Danny Stone’s view is that it would be wise to have something that forces search companies to have appropriate risk assessments in place for the priority harms that Parliament sets, and to enforce those terms and conditions consistently. It is not reasonable to exempt major international and ubiquitous search services from risk assessing and having a policy to address the harms caused by their algorithms. We know that leaving it up to platforms to sort this out themselves does not work, which is why Labour is supporting the new clause proposed by our SNP colleague.

--- Later in debate ---
Kirsty Blackman

I beg to move, That the clause be read a Second time.

I think you are probably getting fed up with me, Sir Roger, so I will try my best not to speak for too long. The new clause is one of the most sensible ones we have put forward. It simply allows Ofcom to ask regulated services to submit to Ofcom

“a specific piece of research held by the service”

or

“all research the service holds”

on a specific topic. It also allows Ofcom to produce a report into

“how regulated services commission, collate, publish and make use of research.”

The issues that we heard raised by Frances Haugen about the secretive nature of these very large companies gave us a huge amount of concern. Providers will have to undertake risk assessments on the basis of the number of users they have, the risk of harm to those users and what percentage of their users are children. However, Ofcom is just going to have to believe the companies when they say, “We have 1 million users,” unless it has the ability to ask for information that proves the risk assessments undertaken are adequate and that nothing is being hidden by those organisations. In order to find out information about a huge number of the platforms, particularly ones such as Facebook, we have had to have undercover researchers posing as other people, submitting reports and seeing how they come out.

We cannot rely on these companies, which are money-making entities. They exist to make a profit, not to make our lives better. In some cases they very much do make our lives better—in some cases they very much do not—but that is not their aim. Their aim is to try to make a profit. It is absolutely in their interests to underplay the number of users they have and the risk faced by people on their platforms. It is very much in their interest to underplay how the algorithms are firing content at people, taking them into a negative or extreme spiral. It is also in their interests to try to hide that from Ofcom, so that they do not have to put in the duties and mitigations that keep people safe.

We are not asking those companies to make the information public, but if we require them to provide to Ofcom their internal research, whether on the gender or age of their users, or on how many of their users are viewing content relating to self-harm, it will raise their standards. It will raise the bar and mean that those companies have to act in the best interests—or as close as they can get to them—of their users. They will have to comply with what is set out in the Bill and the directions of Ofcom.

I see no issue with that. Ofcom is not going to share the information with other companies in a way that could subvert competition law. Ofcom is a regulator; it literally does not do that. Our proposal would mean that Ofcom has the best, and the most, information in order to take sensible decisions to properly regulate the platforms. It is not a difficult provision for the Minister to accept.

Baroness Keeley Portrait Barbara Keeley

The transparency requirements set out in the Bill are welcome but limited. Numerous amendments have been tabled by the Opposition and by our colleagues in the SNP to increase transparency, so that we can all be better informed about the harms around us, and so that the regulator can determine what protections are needed for existing and emerging harms. This new clause is another important provision in that chain and I speak in support of it.

We know that there is research being undertaken all the time by companies that is never published—neither publicly nor to the regulator. As the hon. Member for Aberdeen North said, publishing research undertaken by companies is an issue championed by Frances Haugen, whose testimony last month the Committee will remember. A few years ago, Frances Haugen brought to the public’s attention the extent to which research is held by companies such as Facebook—as it was called then—and never reaches the public realm.

Billions of members of the public are unaware that they are being tracked and monitored by social media companies as subjects in their research studies. The results of those studies are only published when revealed by brave whistleblowers. However, their findings could help charities, regulators and legislators to recognise harms and help to make the internet a safer place. For example, Frances Haugen leaked one Facebook study that found that a third of teenage girls said Instagram made them feel worse about their bodies. Facebook’s head of safety, Antigone Davis, fielded questions on this issue from United States Senators last September. She claimed that the research on the impact of Instagram and Facebook on children’s health was “not a bombshell”. Senator Richard Blumenthal responded:

“I beg to differ with you, Ms Davis, this research is a bombshell. It is powerful, gripping, riveting evidence that Facebook knows of the harmful effects of its site on children and that it has concealed those facts and findings.”

It is this kind of cover-up that new clause 19 seeks to prevent.

I remind the Committee of one more example that Frances Haugen illustrated to us in her evidence last month. Meta conducts frequent analyses of the estimated age of its users, which is often different from the ages they submit when registering, both among adults and children. Frances told us that Meta does this so that adverts can be targeted more effectively. However, if Ofcom could request this data, as the new clause would require, it would give an important insight into how many under-13s were in fact creating accounts on Facebook. Ofcom should be able to access such information, so I hope hon. Members and the Minister will support the new clause as a measure to increase transparency and support greater protections for children.

Online Safety Bill (Fifteenth sitting)

Debate between Kirsty Blackman and Baroness Keeley
Baroness Keeley Portrait Barbara Keeley

The provisions in clauses 170 to 172, as the Minister has said, repeal or amend existing laws for the purposes of the Bill. As Labour supports the need to legislate on the issue of online safety, we will not oppose the clauses. However, I want to note that the entire process, up until the final abandonment of part 3 of the Digital Economy Act under clause 171, appears shambolic. It has been five years now since that part of the Act could have been implemented, which means five years during which children could have been better protected from the harms of pornographic content.

When the Government eventually admitted that part 3 was being ditched, the Minister at the time, the hon. Member for Boston and Skegness (Matt Warman), said that the Government would seek to take action on pornography more quickly than on other parts of the online harms regime. Stakeholders and charities have expressed concerns that we could now see a delay to the implementation of the duties on pornographic content providers, which is similar to the postponement and eventual abandonment of part 3 of the Digital Economy Act. I appreciate that the Minister gave some reassurance of his

“desire to get this done as quickly as possible”—[Official Report, Online Safety Bill Committee, 9 June 2022; c. 308.]

in our debate on clauses 31 to 33, but would it not be better to set out timeframes in the Bill?

Under clause 193, it appears that the only clauses in part 5 to be enacted once the Bill receives Royal Assent will be the definitions—clause 66 and clause 67(4)—and not the duties. That is because Ofcom is expected to issue a call for evidence, after which draft proposals for consultation are published, which then need to be agreed by the Secretary of State and laid before Parliament. There are opportunities there for delays and objections at any stage and, typically, enforcement will be implemented only in a staged fashion, from monitoring to supervision. The consultations and safeguarding processes are necessary to make the guidance robust; we understand that. However, children cannot wait another three years for protections, having been promised protection under part 3 of the Digital Economy Act five years ago, which, as I have said, was never implemented.

The provisions on pornography in part 5 of the Bill require no secondary legislation so they should be implemented as quickly as possible to minimise the amount of time children continue to be exposed to harmful content. It would be irresponsible to wait any longer than absolutely necessary, given the harms already caused by this drawn-out process.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)

Thank you, Sir Roger, for chairing this meeting this morning. I want to agree with the Opposition’s points about the timing issue. If an Act will repeal another one, it needs to make sure that there is no gap in the middle and, if the repeal takes place on one day, that the Bill’s provisions that relate to that are in force and working on the same day, rather than leaving a potential set-up time gap.

On clause 170 and repealing the part of the Communications Act 2003 on video-sharing platform services, some concerns have been raised that the requirements in the Online Safety Bill do not exactly mirror the provisions in the video-sharing platform rules. I am not saying necessarily or categorically that the Online Safety Bill is less strong than the video-sharing platform rules currently in place. However, if the legislation on video-sharing platform services is repealed, the Online Safety Act, as it will be, will become the main way of regulating video-sharing platforms, and there is a concern that there could be a degradation in the protections provided on those platforms and an increase in some of the issues and concerns we have seen raised. Will the Minister keep that under review and consider how that could be improved? We do not want to see this getting worse simply because one regime has been switched for another that, as the Minister said, is broader and has stronger protections. Will he keep under review whether that turns out to be the case when the Act has bedded in, when Ofcom has the ability to take action and properly regulate—particularly, in this case, video-sharing platforms?

--- Later in debate ---
Baroness Keeley Portrait Barbara Keeley

I do not think that the right hon. Lady has misunderstood what I said. I said that the new clause would allow the Secretary of State to appoint a new or existing body as the statutory user advocate, so it could very much be either.

New clause 3 would also rebalance the interests of children against the vocal and well-resourced regulated companies. I think that is a key argument for having an advocacy body. Without such a counterbalance, large tech companies could attempt to capture independent expert voices, fund highly selective research with the intent to skew the evidence base, and then challenge regulatory decisions with the evidence base they have created.

Those tactics are not new; similar tactics are used in other regulated sectors, such as the tobacco industry. In line with other sectors, the user advocacy body should be funded by a levy on regulated companies. That would be in line with the “polluter pays” principle in part 6 and would be neutral to the Exchequer—another reason to accept it. Compared with the significant benefits and improved outcomes it would create, the levy would represent only a minimal additional burden on companies.

There is strong support for the creation of a user advocate. Research by the NSPCC shows that 88% of UK adults who responded to a YouGov survey think that it is necessary for the Bill to introduce a requirement for an independent body that can protect the interests of children at risk of online harms, including grooming and child sexual abuse.

It is also a popular option among children. YoungMinds has said that young people do not feel they are being included enough in the drafting of the Bill. It evidenced that with research it undertook, which found that almost 80% of young people aged 11 to 25 surveyed had never even heard of the Bill.

A young woman told the NSPCC why she felt a children’s advocacy body is needed. She is a survivor of online grooming, and it is worth sharing what she said in full, because it is powerful and we have not shared the voices of young people enough. She said:

“When I was 13, a man in his 30s contacted me on Facebook. I added him because you just used to add anyone on Facebook. He started messaging me and I liked the attention. We’d speak every day, usually late at night for hours at a time…He started asking for photos, so I sent some. Then he asked for some explicit photos, so I did that too, and he reciprocated…In my eyes, telling anyone in my life about this man was not an option. We need to stop putting the responsibility on a vulnerable child to prevent crime and start living in a world which puts keeping children safe first. That means putting child safety at the heart of policy. I want a statutory child user advocacy body funded by the industry levy. This would play a vital role in advocating for children’s rights in regulatory debates. Being groomed made me feel incredibly vulnerable, isolated, and weak. I felt I had no one who was on my side. Having a body stand up for the rights of children in such a vulnerable position is invaluable…it is so rare that voices like mine have a chance to be heard by policy makers. Watching pre legislative debates I’ve been struck by how detached from my lived experience they can be”—

that is very much the point that my hon. Friend the Member for Batley and Spen made—

“and indeed the lived experiences of thousands of others. If we want to protect children, we need to understand and represent what they need.”

I hope that the Committee will recognise the bravery of that young woman in speaking about her experiences as a survivor of online grooming. I hope that the Minister will respect the insights she offers and consider the merits of having a user advocacy body to support children and young people experiencing harms online.

Kirsty Blackman Portrait Kirsty Blackman

I read new clause 3 in conjunction with the starred new clause 44, because it makes sense to consider the funding of the advocacy body, and the benefits of that funding, when discussing the merits of such a body. Part of that is because the funding of the advocacy body, and the fact that it needs to be funded, is key to its operation, and a key reason why we need it.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman

The new clause asks for an additional body. It is not saying the Children’s Commissioners should be done away with. The Children’s Commissioners do an amazing job, as we have recognised, but the No. 1 priority, certainly for the Children’s Commissioner in Scotland, is to protect the human rights of children; it is not to protect children online, which is what the user advocacy body would do. The body would give the benefit of its experience and use its resources, time and energy specifically to advocate between Ofcom, children and children’s organisations and groups.

The Minister is right that the Bill takes massive steps forward in protecting children online, and he is right that the Children’s Commissioners do a very good job. The work done by the Children’s Commissioners in giving us evidence on behalf of children and children’s organisations has been incredibly powerful and incredibly helpful, but there is still a layer missing. If this Bill is to be future-proof, if it is to work and if it is not to put an undue burden on charitable organisations, we need a user advocacy body. The Minister needs to consider that.

I appreciate that the Government provide money to victim support organisations, which is great, but I am also making a case about potential victims. If the money only goes to those who support people who have already been harmed, it will not allow them to advocate to ensure that more people are not harmed. It will allow them to advocate on behalf of those who have been harmed—absolutely—but it will not effectively tackle potential and emerging harms. It is a key place where the Bill misses out. I am quite disappointed that the Minister has not recognised that something may be lacking and is so keen to defend his position, because it seems to me that the position of the Opposition is so obviously the right one.

Baroness Keeley Portrait Barbara Keeley

I wholeheartedly agree with what the hon. Member for Aberdeen North just said, but I wish to emphasise some elements because it seems to me that the Minister was not listening, although he has listened to much that has been said. I made some specific points, used quotes and brought forward some evidence. He feels that children have been consulted in the drafting of the Bill; I cited a YoungMinds survey that showed that that was very much not what young people feel. YoungMinds surveyed a large group of young people and a very large proportion of them had not even heard of the Bill.

The evidence of the young survivor of online grooming was very powerful. She very much wanted a user-advocacy body and spoke strongly about that. The Minister is getting it wrong if he thinks that somebody in that situation, who has been groomed, would go to a parent. The quote that I cited earlier was:

“Being groomed made me feel incredibly vulnerable, isolated, and weak. I felt I had no one who was on my side.”

There were clearly adults in her life she could have gone to, but she did not because she was in that vulnerable position—a position of weakness. That is why some kind of independent advocacy body for children is so important.

I do not think children and young people do feel consulted about the Bill because the organisations and charities are telling us that. I join all Opposition Members in supporting and paying tribute to the remarkable job that the Children’s Commissioner does. I quoted her setting out her worries about the Bill. I quoted her saying that

“the Bill does not do enough to respond to individual cases of abuse and that it needs to do more to understand issues and concerns directly from children.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]

That is what she said. She did not say, “I’m the person charged with doing this. I’m the person who has the resource and my office has the resource.”

Online Safety Bill (Fourteenth sitting)

Debate between Kirsty Blackman and Baroness Keeley
Committee stage
Tuesday 21st June 2022

(2 years, 6 months ago)

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 21 June 2022 - (21 Jun 2022)
Baroness Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)

Good afternoon, Ms Rees. The importance of an effective complaints procedure has been argued strongly by many people who have given oral and written evidence to this Committee and indeed by Committee members. It is welcome that clause 140 introduces a super-complaints mechanism to report multiple, widespread concerns about the harm caused by services, but the lack of redress for individuals has been raised repeatedly.

This is a David and Goliath situation, with platforms holding all the power, while individuals are left to navigate the often complex and underfunded internal complaints systems provided by the platforms. This is what the London School of Economics and Political Science has called the

“current imbalance between democratic, ‘people’ power and the power of platforms.”

As we argued on new clause 1, there is a clear need to consider a route for redress at an individual level. The current situation is unsatisfactory for people who feel they have been failed by a service’s complaints system and who find themselves with no source of redress.

The current situation is also unsatisfactory for the regulator. Kevin Bakhurst from Ofcom told the right hon. Member for Basingstoke during our evidence sessions:

“Those individual complaints, although we are not going to be very specific in looking at individual pieces of material per se, are very useful to alert us where there are issues around particular types of offence or harm that the platforms are not seen to be dealing with properly.”––[Official Report, Online Safety Public Bill Committee, 24 May; c.9-10, Q9.]

An external redress process was recommended by the Joint Committee on the draft Bill and has been suggested by multiple stakeholders. Our new clause would make sure that we find the best possible solution to the problem. I hope the Minister reconsiders these points and supports new clause 1 when the time comes to vote on it.

As I have argued previously, organisations will not be able to make full and effective use of the super-complaints system unless the platforms’ risk assessments are published in full. The Opposition’s amendments 11 and 13 sought to address that issue, and I am disappointed that the Government failed to grasp their importance. There is now a real risk that civil society and other groups will not be able to assess and identify the areas where a company may not be meeting its safety duties. How does the Minister expect organisations making super-complaints to identify and argue that a service is causing harm to its users if they have no access to the company’s own analysis and mitigation strategy? Not including a duty to publish risk assessments leaves a gaping hole in the Bill and risks undermining the super-complaints mechanism. I hope that the Minister will reconsider his opposition to this important transparency mechanism in future stages of the Bill.

For powers about super-complaints to be meaningful, there must be a strict deadline for Ofcom to respond to them, and we will support the SNP amendment if it is pushed to a vote. The Enterprise Act 2002 gives a 90-day deadline for the Competition and Markets Authority to respond. Stakeholders have suggested a similar deadline for responding to super-complaints as an effective mechanism to ensure action from the regulator. I urge the Minister to consider this addition, either in the Bill with this amendment, or in the secondary legislation that the clause requires.

Clauses 141 and 142 relate to the structures around super-complaints. Clause 141 appears to be more about handing over powers to the Secretary of State than ensuring a fair system of redress. The Opposition have said repeatedly how we feel about the powers being handed over to the Secretary of State. Clause 142 includes necessary provisions on the creation and publication of guidance by Ofcom, which we do not oppose. Under clause 141, Ofcom will have to provide evidence of the validity of the super-complaint and the super-complainant within a stipulated timeframe. However, there is little in the Bill about what will happen when a super-complaint is made, and much of the detail on how that process will work has been left to secondary legislation.

Does the Minister not think that it is strange to leave it up to the Secretary of State to determine how Ofcom is to deal with super-complaints? How does he envisage the system working, and what powers does he think Ofcom will need to be able to assert itself in relation to super-complaints? It seems odd to leave the answers to those important questions out of the Bill.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)

I appreciate the support from the Opposition in relation to amendment 153. I want to talk about amendment 153, but also about some of the issues there are with clauses 140 and 141—not so much 142. Clause 140(3) allows the Secretary of State to make regulations in relation to working out who an eligible entity is for making super-complaints. The Minister has helpfully been very clear that the definition is likely to be pretty wide—the definition of groups that are working on behalf of consumers is likely to be wide. The regulations that are made in this section are going to be made under the draft affirmative procedure. Although secondary legislation is not brilliant, the affirmative procedure will allow more scrutiny than the negative procedure. I appreciate that the Minister has chosen—or the people drafting the Bill have chosen—that way forward for deciding on the eligible entity.

I am concerned that when it comes to clause 141(1), the regulations setting out how the complaints process will be made, and the regulation level, will be done under the negative procedure rather than under the draft affirmative procedure. I have got the Delegated Powers and Regulatory Reform Committee memorandum, which tells us about each of the delegated powers of the Bill, and the justification for them. I understand that the Department is referring to the Police Super-complaints (Designation and Procedure) Regulations 2018, which were made under the negative procedure. However, I am not convinced that in the Policing and Crime Act 2017 we were left with quite so little information about what would be included in those complaints. I think the justification for the negative procedure is not great, especially given the concerns raised about the over-reach of the Secretary of State’s power and the amount of influence they have on Ofcom.

I think clause 142 is fine; it makes sense that Ofcom is able to make guidance. I would have liked to see the regulation part involve more input from parliamentarians. If there is not going to be more input from parliamentarians, there should at least be more in the Bill about how the complaints procedure would work. The reason we have tabled amendment 153 is to ensure that Ofcom provides a response. That response does not have to be a final response saying, “We have investigated everything and these are the findings.” I understand that that may take some time. However, Ofcom must provide a response to super-complainants in 90 days. Even if it were to provide that information in the terms laid out in clause 141(2)(d)—whether a complaint is within clause 140, or is admissible under clause 140, or whether an entity is an eligible entity—and we were to commit Ofcom to provide that information within 90 days, that would be better than the current drafting, which sets no time limits at all. It is not specified. It does not say that Ofcom has to deal with the complaint within a certain length of time.

A quick response from Ofcom is important for a number of reasons. I expect that those people who are bringing super-complaints are likely to be third sector organisations. Such organisations do not have significant or excessive budgets. They will be making difficult choices about where to spend their money. If they are bringing forward a super-complaint, they will be doing it on the basis that they think it is incredibly important and it is worth spending their finite funding on legal advice in order to bring forward that super-complaint. If there is an unnecessary delay before Ofcom even recognises whether the complaint is eligible, charities may spend money unnecessarily on building up a further case for the next stages of the super-complaint. They should be told very quickly, “No, we are not accepting this” or “Yes, we are accepting this”.

Ofcom has the ability to levy fees so that it can provide the service that we expect it to provide as a result of the Bill. It will have a huge amount of extra work compared with its current work. It needs to be able to levy fees in order to fulfil its functions. If there is no timeline and it says, “We want to levy fees because we want to be able to respond on a 90-day basis”, it would not be beyond companies to come back and say, “That is unrealistic—you should not be charging us extra fees in order for you to have enough people to respond within a 90-day period to super-complaints.”

If Ofcom is to be able to levy fees effectively to provide the level of service that we would all—including, I am sure, the Minister—like to see to super-complainants who are making very important cases on behalf of members of the public and people who are being harmed by content online, and to give Ofcom that backing when it is setting the structures and levying the fees, it would be sensible for the Minister to make some commitments about the timelines for super-complaints.

In earlier clauses of the Bill, primacy is given to complaints to social media platforms, for example—to regulated providers—about freedom of speech. The Bill says that they are to give such complaints precedence. They are to deal with them as important and, where some content has been taken down, quickly. That precedence is written into the Bill. Such urgency is not included in these three clauses on super-complaints in the way I would like to see. The Bill should say that Ofcom has to deal with super-complaints quickly. I do not mean it should do that by doing a bad job. I mean that it should begin to investigate quickly, work out whether it is appropriate to investigate it under the super-complaints procedure, and then begin the investigation.

In some cases, stuff will be really urgent and will need to be dealt with very quickly, especially if, for example, it includes child sexual abuse images. That would need to be dealt with in a matter of hours or days, rather than any longer period.

I would like to see some sort of indication given to Ofcom about the timelines that we are expecting it to work to. Given the amount of work that third sector organisations have put in to support this Bill and try to make it better, this is a fairly easy amendment for the Minister to accede to—an initial response by Ofcom within a 90-day period; we are not saying overnight—so that everyone can be assured that the internet is, as the Minister wishes, a much safer place.

Online Safety Bill (Thirteenth sitting)

Debate between Kirsty Blackman and Baroness Keeley
Committee stage & Committee Debate - 13th sitting
Tuesday 21st June 2022

(2 years, 6 months ago)

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 21 June 2022 - (21 Jun 2022)
Kirsty Blackman Portrait Kirsty Blackman

I beg to move amendment 77, in clause 140, page 121, line 9, leave out subsection (2).

This amendment removes the tests that complaints have to be of particular importance in order to be admissible.

When I first read clause 140, subsection (2) raised a significant number of red flags for me. The subsection might be reasonable if we did not have giant companies—social media platforms particularly—that significant numbers of people across the UK use regularly. Facebook might be counted as a “single regulated service”, but 85% of UK residents—57.1 million people—had a Facebook account earlier this year. Twitter is used by 28% of people living in the UK, which is 19 million users. TikTok is at 19%, which is significantly less, but still a very high number of people—13 million users. I can understand the decision that a super-complaint picking on one certain company might be a bit extreme, but it does not make sense when we are considering the Facebooks of this world.

If someone is making a complaint about a single regulated service and that service is Facebook, Twitter, TikTok or another large platform—or a new, yet-to-be-created platform—that significant numbers of people use, there is no justification for treating that complaint differently just because it is against a single entity. When a complaint is made against Facebook—I am picking on Facebook because 85% of the UK public are members of it; it is an absolute behemoth—I would like there to be no delay in its being taken to Ofcom. I would like Ofcom not to have to check and justify that the complaint is “of particular importance”.

Subsection (2)(a) states that one of the tests of the complaint should be that it “is of particular importance” or, as subsection (2)(b) notes, that it

“relates to the impacts on a particularly large number of users of the service or members of the public.”

I do not understand what

“large number of users of the service”

would mean. Does a large number of the users of Facebook mean 50% of its users? Does it mean 10%? What is a large number? Is that in percentage terms, or is it something that is likely to impact 1 million people? Is that a large number? The second part—

“large number…of members of the public”—

is again difficult to define. I do not think there is justification for this additional hoop just because the complaint relates to a single regulated service.

Where a complaint relates to a very small platform that is not causing significant illegal harm, I understand that Ofcom may want to consider whether it will accept, investigate and give primacy and precedence to that. If the reality is that the effect is non-illegal, fairly minor and impacts a fairly small number of people, in the order of hundreds instead of millions, I can understand why Ofcom might not want to give that super-complaint status and might not want to carry out the level of investigation and response necessary for a super-complaint. But I do not see any circumstances in which Ofcom could justify rejecting a complaint against Facebook simply because it is a complaint against a single entity. The reality is that if something affects one person on Facebook, it will affect significantly more than one person on Facebook because of Facebook’s absolutely massive user base. Therefore this additional hoop is unrealistic.

Paragraph (a), about the complaint being “of particular importance”, is too woolly. Does it relate only to complaints about things that are illegal? Does it relate only to things that are particularly urgent—something that is happening now and that is having an impact today? Or is there some other criterion that we do not yet know about?

I would very much appreciate it if the Minister could give some consideration to amendment 77, which would simply remove subsection (2). If he is unwilling to remove that subsection, I wonder whether we could meet halfway and whether, let us say, category 1 providers could all be excluded from the “single provider” exemption, because they have already been assessed by Ofcom to have particular risks on their platforms. That group is wider than the three names that I have mentioned, and I think that that would be a reasonable and realistic decision for the Government—and direction for Ofcom—to take. It would be sensible.

If the Government believe that there is more information—more direction—that they could add to the clause, it would be great if the Minister could lay some of that out here and let us know how he intends subsection (2) to operate in practice and how he expects Ofcom to use it. I get that people might want it there as an additional layer of protection, but I genuinely do not imagine that it can be justified in the case of the particularly large providers, where there is significant risk of harm happening.

I will illustrate that with one last point. The Government specifically referred earlier to when Facebook—Meta—stopped proactively scanning for child sexual abuse images because of an issue in Europe. The Minister mentioned the significant amount of harm and the issues that were caused in a very small period. And that was one provider—the largest provider that people use and access. That massive amount of harm can be caused in a very small period. I do not support allowing Meta or any other significantly large platform to have a “get out of jail” card. I do not want them to be able to go to Ofcom and say, “Hey, Ofcom, we’re challenging you on the basis that we don’t think this complaint is of particular importance” or “We don’t think the complaint relates to the impacts on a particularly large number of users of the service or members of the public.” I do not want them to have that ability to wriggle out of things because this subsection is in the Bill, so any consideration that the Minister could give to improving clause 140 and subsection (2) would be very much appreciated.

Barbara Keeley

We support the SNP’s amendment 77, moved by the hon. Member for Aberdeen North. The super-complaints mechanism introduced by clause 140 is a useful device for reporting numerous, widespread concerns about the harm caused by multiple or single services or providers. Subsection (1) includes the conditions on the subjects of super-complaints, which can relate to one or more services. However, as the hon. Member has pointed out, that is caveated by subsection (2), under which a super-complaint that refers to a single service or provider must prove, as she has just outlined, that it is “of particular importance” or

“relates to the impacts on a particularly large number of users of the service or members of the public.”

Given the various hoops through which a super-complaint already has to jump, it is not clear why the additional conditions are needed. Subsection (2) significantly muddies the waters and complicates the provisions for super-complaints. For instance, how does the Minister expect Ofcom to decide whether the complaint is of particular importance? What criteria does he expect the regulator to use? Why include it as a metric in the first place when the super-complaint has already met the standards set out in subsection (1)?

Online Safety Bill (Twelfth sitting)

Debate between Kirsty Blackman and Baroness Keeley
Kirsty Blackman

I thank the hon. Member for Pontypridd for laying out her case in some detail, though nowhere near the level of detail that these people have to experience while providing moderation. She has given a very good explanation of why she is asking for the amendment and new clause to be included in the Bill. Concerns are consistently being raised, particularly by the Labour party, about the impact on the staff members who have to deal with this content. I do not think the significance of this issue for those individuals can be overstated. If we intend the Bill to have the maximum potential impact and reduce harm to the highest number of people possible, it makes eminent sense to accept this amendment and new clause.

There is a comparison with other areas in which we place similar requirements on other companies. The Government require companies that provide annual reports to undertake an assessment in those reports of whether their supply chain uses child labour or unpaid labour, or whether their factories are safe for people to work in—if they are making clothes, for example. It would not be an overly onerous request if we were to widen those requirements to take account of the fact that so many of these social media companies are subjecting individuals to trauma that results in them experiencing PTSD and having to go through a lengthy recovery process, if they ever recover. We have comparable legislation, and that is not too much for us to ask. Unpaid labour, or people being paid very little in other countries, is not that different from what social media companies are requiring of their moderators, particularly those working outside the UK and the US in countries where there are less stringent rules on working conditions. I cannot see a reason for the Minister to reject the provision of this additional safety for employees who are doing an incredibly important job that we need them to be doing, in circumstances where their employer is not taking any account of their wellbeing.

Barbara Keeley

As my hon. Friend the Member for Pontypridd has pointed out, there is little or no transparency about one of the most critical ways in which platforms tackle harms. Human moderators are on the frontline of protecting children and adults from harmful content. They must be well resourced, trained and supported in order to fulfil that function, or the success of the Bill’s aims will be severely undermined.

I find it shocking that platforms offer so little data on human moderation, either because they refuse to publish it or because they do not know it. For example, in evidence to the Home Affairs Committee, William McCants from YouTube could not give precise statistics for its moderator team after being given six days’ notice to find the figure, because many moderators were employed or operated under third-party auspices. For YouTube’s global counter-terrorism lead to be unaware of the detail of how the platform is protecting its users from illegal content is shocking, but it is not uncommon.

In evidence to this Committee, Meta’s Richard Earley was asked how many of Meta’s 40,000 human moderators were outsourced to remove illegal content and disinformation from the platform. My hon. Friend the Member for Pontypridd said:

“You do not have the figures, so you cannot tell me.”

Richard Earley replied:

“I haven’t, no, but I will be happy to let you know afterwards in our written submission.”

Today, Meta submitted its written evidence to the Committee. It included no reference to human content moderators, despite its promise.

The account that my hon. Friend gave just now shows why new clause 11 is so necessary. Meta’s representative told this Committee in evidence:

“Everyone who is involved in reviewing content at Meta goes through an extremely lengthy training process that lasts multiple weeks, covering not just our community standards in total but also the specific area they are focusing on, such as violence and incitement.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 45, Q76.]

But now we know from whistleblowers such as Daniel, whose case my hon. Friend described, that that is untrue. What is happening to Daniel and the other human moderators is deeply concerning. There are powerful examples of the devastating emotional impact that can occur because human moderators are not monitored, trained and supported.

There are risks of platforms shirking responsibility when they outsource moderation to third parties. Stakeholders have raised concerns that a regulated company could argue that an element of its service is not in the scope of the regulator because it is part of a supply chain. We will return to that issue when we debate new clause 13, which seeks to ensure enforcement of liability for supply chain failures that amount to a breach of one of the specified duties.

Platforms, in particular those supporting user-to-user generated content, employ such services from third parties. Yesterday, I met Danny Stone, the chief executive of the Antisemitism Policy Trust, who described the problem of antisemitic GIFs. Twitter would say, “We don’t supply GIFs. The responsibility is with GIPHY.” GIPHY, as part of the supply chain, would say, “We are not a user-to-user platform.” If someone searched Google for antisemitic GIFs, the results would contain multiple entries saying, “Antisemitic GIFs—get the best GIFs on GIPHY. Explore and share the best antisemitic GIFs.”

One can well imagine a scenario in which a company captured by the regulatory regime established by the Bill argues that an element of its service is not within the ambit of the regulator because it is part of a supply chain presented by, but not necessarily the responsibility of, the regulated service. The contracted element, which I have just described by reference to Twitter and GIPHY, supported by an entirely separate company, would argue that it was providing a business-to-business service that is not user-generated content but content designed and delivered at arm’s length and provided to the user-to-user service to deploy for its users.

I suggest that dealing with this issue would involve a lengthy, costly and unhelpful legal process during which systems were not being effectively regulated—the same may apply in relation to moderators and what my hon. Friend the Member for Pontypridd described; there are a number of lawsuits involved in Daniel’s case—and complex contract law was invoked.

We recognise in UK legislation that there are concerns and issues surrounding supply chains. Under the Bribery Act 2010, for example, a company is liable if anyone performing services for or on the company’s behalf is found culpable for specific actions. These issues on supply chain liability must be resolved if the Bill is to fulfil its aim of protecting adults and children from harm.

Online Safety Bill (Tenth sitting)

Debate between Kirsty Blackman and Baroness Keeley
Committee stage
Tuesday 14th June 2022

Public Bill Committees
Barbara Keeley

The additional regulations created by the Secretary of State in connection with the reports will have a lot resting on them. It is vital that they receive the appropriate scrutiny when the time comes. For example, the regulations must ensure that referrals to the National Crime Agency made by companies are of a high quality, and that requirements are easy to comply with. Prioritising the highest risk cases will be important, particularly where there is an immediate threat to the safety and welfare of a child.

Clause 60 sets out that the Secretary of State’s regulations must include

“provision about cases of particular urgency”.

Does the Minister have an idea what that will look like? What plans are in place to ensure that law enforcement can prioritise the highest risk and harm cases?

Under the new arrangements, the National Crime Agency as the designated body, the Internet Watch Foundation as the appropriate authority for notice and takedown in the UK, and Ofcom as the regulator for online harms will all hold a vast amount of information on the scale of the threat posed by child sexual exploitation and illegal content. How will the introduction of mandatory reporting assist those three organisations in improving their understanding of how harm manifests online? How does the Minister envisage the organisations working together to share information to better protect children online?

Kirsty Blackman

I am glad that clause 60 will be in the Bill and that there will be a duty to report to the NCA. On subsection (3), though, I would like the Minister to clarify that if the Secretary of State believes that the Scottish Ministers would be appropriate people to consult, they would consult them, and the same for the Northern Ireland Executive.

I would appreciate the Minister explaining how clause 61 will work in a Scottish context, because that clause talks about the Crime and Courts Act 2013. Does a discussion need to be had with Scottish Ministers, and perhaps Northern Ireland Ministers as well, to ensure that information sharing takes place seamlessly with devolved areas with their own legal systems, to the same level as within England and Wales? If the Minister does not have an answer today, which I understand that he may not in detail, I am happy to hear from him later; I understand that it is quite a technical question.

Online Safety Bill (Seventh sitting)

Debate between Kirsty Blackman and Baroness Keeley
Kirsty Blackman

I absolutely agree. I certainly do not think I am suggesting that the bigger platforms such as Twitter and Facebook will reduce their reporting mechanisms as a result of how the Bill is written. However, it is possible that newer or smaller platforms, or anything that starts up after this legislation comes into force, could limit the ability to report on the basis of these clauses.

Barbara Keeley (Worsley and Eccles South) (Lab)

Good morning, Ms Rees.

It is important that users of online services are empowered to report harmful content, so that it can be removed. It is also important for users to have access to complaints procedures when wrong moderation decisions have been made. Reporting and complaint mechanisms are integral to ensuring that users are safe and that free speech is upheld, and we support these provisions in the Bill.

Clauses 17 and 18, and clauses 27 and 28, are two parts of the same process: content reporting by individual users, and the handling of content reported as a complaint. However, it is vital that these clauses create a system that works. That is the key point that Labour Members are trying to make, because the wild west system that we have at the moment does not work.

It is welcome that the Government have proposed a system that goes beyond the users of the platform and introduces a duty on companies. However, companies have previously failed to invest enough money in their complaints systems for the scale at which they are operating in the UK. The duties in the Bill are an important reminder to companies that they are part of a wider society that goes beyond their narrow shareholder interest.

One example of why this change is so necessary, and why Labour Members are broadly supportive of the additional duties, is the awful practice of image abuse. With no access to sites on which their intimate photographs are being circulated, victims of image abuse have very few if any routes to having the images removed. Again, the practice of image abuse has increased during the pandemic, including through revenge porn, which the Minister referred to. The revenge porn helpline reported that its case load more than doubled between 2019 and 2020.

These clauses should mean that people can easily report content that they consider to be either illegal, or harmful to children, if it is hosted on a site likely to be accessed by children, or, if it is hosted on a category 1 platform, harmful to adults. However, the Minister needs to clarify how these service complaints systems will be judged and what the performance metrics will be. For instance, how will Ofcom enforce against a complaint?

In many sectors of the economy, even with long-standing systems of regulation, companies can have tens of millions of customers reporting content, but that does not mean that any meaningful action can take place. The hon. Member for Aberdeen North has just told us how often she reports on various platforms, but what action has taken place? Many advocacy groups of people affected by crimes such as revenge porn will want to hear, in clear terms, what will happen to material that has been complained about. I hope the Minister can offer that clarity today.

Transparency in reporting will be vital to analysing trends and emerging types of harm. It is welcome that in schedule 8, which we will come to later, transparency reporting duties apply to the complaints process. It is important that as much information as possible is made public about what is going on in companies’ complaints and reporting systems. As well as the raw number of complaints, reporting should include what is being reported or complained about, as the Joint Committee on the draft Bill recommended last year. Again, what happens to the reported material will be an important metric on which to judge companies.

Finally, I will mention the lack of arrangements for children. We have tabled new clause 3, which has been grouped for discussion with other new clauses at the end of proceedings, but it is relevant to mention it now briefly. The Children’s Commissioner highlighted in her oral evidence to the Committee how children had lost faith in complaints systems. That needs to be changed. The National Society for the Prevention of Cruelty to Children has also warned that complaints mechanisms are not always appropriate for children and that a very low proportion of children have ever reported content. A child-specific user advocacy body could represent the interests of child users and support Ofcom’s regulatory decisions. That would represent an important strengthening of protections for users, and I hope the Government will support it when the time comes.

Jane Stevenson (Wolverhampton North East) (Con)

I rise briefly to talk about content reporting. I share the frustrations of the hon. Member for Aberdeen North. The way I read the Bill was that it would allow users and affected persons, rather than “or” affected persons, to report content. I hope the Minister can clarify that that means affected persons who might not be users of a platform. That is really important.

Will the Minister also clarify the use of human judgment in these decisions? Many algorithms are not taking down some content at the moment, so I would be grateful if he clarified that there is a need for platforms to provide a genuine human judgment on whether content is harmful.

--- Later in debate ---
Barbara Keeley

As the Minister says, clauses 19 and 29 are designed to provide a set of balancing provisions that will require companies to have regard to freedom of expression and privacy when they implement their safety duties. However, it is important that companies cannot use privacy and free expression as a basis to argue that they can comply with regulation in less substantive ways. That is a fear here.

Category 1 providers will need to undertake an impact assessment to determine the impact of their product and safety decisions on freedom of expression, but it is unclear whether that applies only in respect of content that is harmful to adults. Unlike with the risk assessments for the illegal content and child safety duties set out in part 3, chapter 2, these clauses do not set expectations about whether risk assessments are of a suitable and sufficient quality. It is also not clear what powers Ofcom has at its disposal to challenge any assessments that it considers insufficient or that reach an inappropriate or unreasonable assessment of how to balance fundamental rights. I would appreciate it if the Minister could touch on that when he responds.

The assumption underlying these clauses is that privacy and free expression may need to act as a constraint on safety measures, but I believe that that is seen quite broadly as simplistic and potentially problematic. To give one example, a company could argue that the importance of end-to-end encryption for free expression and privacy could justify any adverse impact on users’ safety. The subjects of child abuse images, which could more easily be shared because of such a decision, would see their safety and privacy rights weakened. Such an argument fails to take account of the broader nuance of the issues at stake. Impacts on privacy and freedom of expression should therefore be considered across a range of groups, rather than assuming an overarching right that applies equally to all users.

Similarly, it will be important that Ofcom understands and delivers its functions in relation to these clauses in a way that reflects the complexity and nuance of the interplay of fundamental rights. It is important to recognise that positive and negative implications for privacy and freedom of expression may be associated with any compliance decision. I think the Minister implied that freedom of speech was a constant positive, but it can also have negative connotations.

Kirsty Blackman

I am pleased that the clause is in the Bill, and I think it is a good one to include. Can the Minister reaffirm what he said on Tuesday about child sexual abuse, and the fact that the right to privacy does not trump the ability—particularly with artificial intelligence—to search for child sexual abuse images?

Online Safety Bill (Sixth sitting)

Debate between Kirsty Blackman and Baroness Keeley
Barbara Keeley

May I say—this might be a point of order—how my constituency name is pronounced? I get a million different versions, but it is Worsley, as in “worse”. It is an unfortunate name for a great place.

I will speak to all the amendments in the group together, because they relate to how levels of risk are assessed in relation to certain characteristics. The amendments are important because small changes to the descriptions of risk assessment will help to close a significant gap in protection.

Clauses 10 and 12 introduce a duty on regulated companies to assess harms to adults and children who might have an innate vulnerability arising from being a member of a particular group or having a certain characteristic. However, Ofcom is not required to assess harms to people other than children who have that increased innate vulnerability. Amendment 71 would require Ofcom to assess risks of harm particularly affecting people with certain characteristics or membership of a group or groups as part of its risk register. That would reduce the regulatory burden if companies had Ofcom’s risk assessment to base their work on.

Getting this right is important. The risk management regime introduced by the Bill should not assume that all people are at the same risk of harm—they are clearly not. Differences in innate vulnerability increase the incidence and impact of harm, such as by increasing the likelihood of encountering content or of that content being harmful, or heightening the impact of the harm.

It is right that the Bill emphasises the vulnerability of children, but there are other, larger groups with innate vulnerability to online harm. As we know, that often reflects structural inequalities in society.

For example, women will be harmed in circumstances where men might not be, and they could suffer some harms whose impact is more serious for them than it might be for men. A similar point can be made for people with other characteristics. Vulnerability is then compounded by intersectional issues—people might belong to more than one high-risk group—and I will come to that in a moment.

The initial Ofcom risk assessment introduced by clause 83 is not required to consider the heightened risks to different groups of people, but companies are required to assess that risk in their own risk assessments for children and adults. They need to be given direction by an assessment by Ofcom, which amendment 71 would require.

Amendments 72 to 75 address the lack of recognition in these clauses of intersectionality issues. They are small amendments in the spirit of the Bill’s risk management regime. As drafted, the Bill refers to a singular “group” or “characteristic” for companies to assess for risk. However, some people are subject to increased risks of harm arising from being members of more than one group. Companies’ risk assessments for children and adults should reflect intersectionality, and not just characteristics taken individually. Including the plural of “group” and “characteristic” in appropriate places would achieve that.

Kirsty Blackman

I will first speak to our amendment 85, which, like the Labour amendment, seeks to ensure that the Bill is crystal clear in addressing intersectionality. We need only consider the abuse faced by groups of MPs to understand why that is necessary. Female MPs are attacked online much more regularly than male MPs, and the situation is compounded if they have another minority characteristic. For instance, if they are gay or black, they are even more likely to be attacked. In fact, the MP who is most likely to be attacked is black and female. There are very few black female MPs, so it is not because of sheer numbers that they are at such increased risk of attack. Those with a minority characteristic are at higher risk of online harm, but the risk facing those with more than one minority characteristic is substantially higher, and that is what the amendment seeks to address.

I have spoken specifically about people being attacked on Twitter, Facebook and other social media platforms, but people in certain groups face an additional significant risk. If a young gay woman does not have a community around her, or if a young trans person does not know anybody else who is trans, they are much more likely to use the internet to reach out, to try to find people who are like them, to try to understand. If they are not accepted by their family, school or workplace, they are much more likely to go online to find a community and support—to find what is out there in terms of assistance—but using the internet as a vulnerable, at-risk person puts them at much more significant risk. This goes back to my earlier arguments about people requiring anonymity to protect themselves when using the internet to find their way through a difficult situation in which they have no role models.

It should not be difficult for the Government to accept this amendment. They should consider it carefully and understand that all of us on the Opposition Benches are making a really reasonable proposal. This is not about saying that someone with only one protected characteristic is not at risk; it is about recognising the intersectionality of risk and the fact that the risk faced by those who fit into more than one minority group is much higher than that faced by those who fit into just one. This is not about taking anything away from the Bill; it is about strengthening it and ensuring that organisations listen.

We have heard that a number of companies are not providing the protection that Members across the House would like them to provide against child sexual abuse. The governing structures, risk assessments, rules and moderation at those sites are better at ensuring that the providers make money than they are at providing protection. When regulated providers assess risk, it is not too much to ask them to consider not just people with one protected characteristic but those with multiple protected characteristics.

As MPs, we work on that basis every day. Across Scotland and the UK, we support our constituents as individuals and as groups. When protected characteristics intersect, we find ourselves standing in Parliament, shouting strongly on behalf of those affected and giving them our strongest backing, because we know that that intersection of harms is the point at which people are most vulnerable, in both the real and the online world. Will the Minister consider widening the provision so that it takes intersectionality into account and not only covers people with one protected characteristic but includes an over and above duty? I genuinely do not think it is too much for us to ask providers, particularly the biggest ones, to make this change.

Online Safety Bill (Second sitting)

Debate between Kirsty Blackman and Baroness Keeley
Kirsty Blackman

Q I have one last question. Rhiannon, a suggestion was made earlier by Dr Rachel O’Connell about age verification and only allowing children to interact with other children whose age is verified within a certain area. Do you think that would help to prevent online grooming?

Rhiannon-Faye McDonald: It is very difficult. While I feel strongly about protecting children from encountering perpetrators, I also recognise that children need to have freedoms and the ability to use the internet in the ways that they like. I think if that was implemented and it was 100% certain that no adult could pose as a 13-year-old and therefore interact with actual 13-year-olds, that would help, but I think it is tricky.

Susie Hargreaves: One of the things we need to be clear about, particularly where we see children groomed—we are seeing younger and younger children—is that we will not ever sort this just with technology; the education piece is huge. We are now seeing children as young as three in self-generated content, and we are seeing children in bedrooms and domestic settings being tricked, coerced and encouraged into engaging in very serious sexual activities, often using pornographic language. Actually, a whole education piece needs to happen. We can put filters and different technology in place, but remember that the IWF acts after the event—by the time we see this, the crime has been committed, the image has been shared and the child has already been abused. We need to bump up the education side, because parents, carers, teachers and children themselves have to be able to understand the dangers of being online and be supported to build their resilience online. They are definitely not to be blamed for things that happen online. From Rhiannon’s own story, how quickly it can happen, and how vulnerable children are at the moment—I don’t know.

Rhiannon-Faye McDonald: For those of you who don’t know, it happened very quickly to me, within the space of 24 hours, from the start of the conversation to the perpetrator coming to my bedroom and sexually assaulting me. I have heard other instances where it has happened much more quickly than that. It can escalate extremely quickly.

Just to add to Susie’s point about education, I strongly believe that education plays a huge part in this. However, we must be very careful in how we educate children, so that the focus is not on how to keep themselves safe, because that puts the responsibility on them, which in turn increases the feelings of responsibility when things do go wrong. That increased feeling of responsibility makes it less likely that they will disclose that something has happened to them, because they feel that they will be blamed. It will decrease the chance that children will tell us that something has happened.

Barbara Keeley

Q Just to follow up on a couple of things, mainly with Susie Hargreaves. You mentioned reporting mechanisms and said that reporting will be a step forward. However, the Joint Committee on the draft Bill recommended that the highest-risk services should have to report quarterly data to Ofcom on the results of their child sexual exploitation and abuse removal systems. What difference would access to that kind of data make to your work?

Susie Hargreaves: We already work with the internet industry. They currently take our services and we work closely with them on things such as engineering support. They also pay for our hotline, which is how we find child sexual abuse. However, the difference it would make is that we hope then to be able to undertake work where we are directly working with them to understand the level of their reports and data within their organisations.

At the moment, we do not receive that information from them. It is very much that we work on behalf of the public and they take our services. However, if we were suddenly able to work directly with them—have information about the scale of the issue within their own organisations and work more directly on that—then that would help to feed into our work. It is a very iterative process; we are constantly developing the technology to deal with the current threats.

It would also help us by giving us more intelligence and by allowing us to share that information, on an aggregated basis, more widely. It would certainly also help us to understand that they are definitely tackling the problem. We do believe that they are tackling the problem, because it is not in their business interests not to, but it just gives a level of accountability and transparency that does not exist at the moment.

Online Safety Bill (First sitting)

Debate between Kirsty Blackman and Baroness Keeley
Kirsty Blackman

Thank you.

Barbara Keeley

Q I want to ask about the many tragic cases of teenagers who have died by suicide after viewing self-harm material online. Do you think coroners have sufficient powers to access digital data after the death of a child, and should parents have the right to access their children’s digital data following their death?

Dame Rachel de Souza: Baroness Kidron has done some fantastic work on this, and I really support her work. I want to tell you why. I am a former headteacher—I worked for 30 years in schools as a teacher and headteacher. Only in the last five or six years did I start seeing suicides of children and teenagers; I did not see them before. In the year just before I came to be Children’s Commissioner, there was a case of a year 11 girl from a vulnerable family who had a relationship with a boy, and it went all over the social media sites. She looked up self-harm material, went out to the woods and killed herself. She left a note that basically said, “So there. Look what you’ve done.”

It was just horrendous, having to pick up the family and the community of children around her, and seeing the long-term effects of it on her siblings. We did not see things like that before. I am fully supportive of Baroness Kidron and 5Rights campaigning on this issue. It is shocking to read about the enormous waiting and wrangling that parents must go through just to get their children’s information. It is absolutely shocking. I think that is enough from me.

Andy Burrows: I absolutely agree. One of the things we see at the NSPCC is the impact on parents and families in these situations. I think of Ian Russell, whose daughter Molly took her own life, and the extraordinarily protracted process it has taken to get companies to hand over her information. I think of the anguish and heartbreak that comes with this process. The Bill is a fantastic mechanism to be able to redress the balance in terms of children and families, and we would strongly support the amendments around giving parents access to that data, to ensure that this is not the protracted process that it currently all too often is.