Online Safety Bill Debate
Charlotte Nichols (Labour - Warrington North)
(1 year, 11 months ago)
Commons Chamber
This Bill links with other legislation, and obviously the agencies. We do not seek to redefine extremism where those definitions already exist. As we expand on the changes that we are making, we will first ensure that anything that is already illegal goes off the table. Anything that is against the terms and conditions of those platforms that are hosting that content must not be seen. I will come to the safety net and user protection later.
Since Elon Musk’s takeover of Twitter, hate speech has ballooned on the platform and the number of staff members at Twitter identifying images of child sexual abuse and exploitation has halved. How can the Minister be sure that the social media companies are able to mark their own homework in the way that he suggests?
I thank my hon. Friend, and indeed I thank my right hon. Friend the Member for Basingstoke (Dame Maria Miller) for the amazing work that she has done in this area. We will table an amendment to the Bill to criminalise more behaviour relating to intimate image abuse, so more perpetrators will face prosecution and potentially time in jail. My hon. Friend has worked tirelessly in this area, and we have had a number of conversations. I thank her for that. I look forward to more conversations to ensure that we get the amendment absolutely right and that it does exactly what we all want.
The changes we are making will include criminalising the non-consensual sharing of manufactured intimate images, which, as we have heard, are more commonly known as deepfakes. In the longer term, the Government will also take forward several of the Law Commission’s recommendations to ensure that the legislation is coherent and takes account of advancements in technology.
We will also use the Bill to bring forward a further communication offence to make the encouragement of self-harm illegal. We have listened to parliamentarians and stakeholders concerned about such behaviour and will use the Bill to criminalise that activity, providing users with protections from that harmful content. I commend my right hon. Friend the Member for Haltemprice and Howden on his work in this area and his advocacy for such a change.
Intimate image abuse has been raised with me a number of times by younger constituents, who are particularly vulnerable to such abuse. Within the scope of what we are discussing, I am concerned that we have seen only one successful conviction for revenge porn, so if the Government base their intimate image work on the existing legislative framework for revenge porn, it will do nothing and protect no one, and will instead be a waste of everyone’s time and further let down victims who are already let down by the system.
We will actually base that work on the independent Law Commission’s recommendations, and have been working with it on that basis.
ONLINE SAFETY BILL (First sitting) Debate
(1 year, 10 months ago)
Public Bill Committees
That is absolutely the case. We are talking about cats because I chose them to illustrate the situation, but people may look at content about healthy eating, and that moves on to content that encourages them to be sick. The way the algorithms step it up is insidious; they get more and more extreme, so that the linger time is increased and people do not get bored. It is important that platforms look specifically at their habit-forming features.
A specific case on the platform TikTok relates to a misogynist who goes by the name of Andrew Tate, who has been banned from a number of social media platforms. However, because TikTok works by making clips shorter, which makes it more difficult for the company to identify some of this behaviour among users, young boys looking for videos of things that might interest them were very quickly shown misogynist content from Andrew Tate. Because they watched one video of him, they were then shown more and more. It is easy to see how the habit-forming behaviours built into platforms’ algorithms, which the hon. Lady identifies, can also be a means of quickly radicalising children into extreme ideologies.
Order. I think we have the message. I have to say to all hon. Members that interventions are interventions, not speeches. If Members wish to make speeches, there is plenty of time.
It is a pleasure to serve with you in the Chair, Sir Roger. I rise in support of amendments 99, 96 and 97, as my hon. Friend the Member for Pontypridd did. I have an issue with the vagueness and ambiguity in the Bill. Ministerial direction is incredibly helpful, not only for Ofcom, but for the companies and providers that will use the Bill to make technologies available to do what we are asking them to do.
As the hon. Member for Aberdeen North said, if the Bill provided for that middle ground, that would be helpful for a number of purposes. Amendment 97 refers to livestreaming; in a number of cases around the world, people have livestreamed acts of terror, such as the shooting at the Christchurch mosque. Those offences were watched in real time, as they were perpetrated, by potentially hundreds of thousands of people. We have people on watch lists—people we are aware of. If we allowed them to use a social media platform but not the livestreaming parts, that could go some way to mitigating the risk of their livestreaming something like that. Their being on the site is perhaps less of a concern, as their general use of it could be monitored in real time. Under a risk analysis, we might be happy for people to be on a platform, but consider that the risk was too great to allow them to livestream. Having such a provision would be helpful.
My hon. Friend the Member for Luton North mentioned the onus always being on the victim. When we discuss online abuse, I really hate it when people say, “Well, just turn off your messages”, “Block them” or “Change your notification settings”, as though that were a panacea. Turning off the capacity to use direct messages is a much more effective way of addressing abuse by direct message than banning the person who sent it altogether—they might just make a new account—or than relying on the recipient of the message to take action when the platform has the capacity to take away the option of direct messaging. The adage is that sunlight is the best disinfectant. When people post in public and the post can be seen by anyone, they can be held accountable by anyone. That is less of a concern to me than what they send privately, which can be seen only by the recipient.
This group of amendments is reasonable and proportionate. They would not only give clear ministerial direction to Ofcom and the technology providers, and allow Ofcom to take the measures that we are discussing, but would pivot us away from placing the onus on the recipients of abusive behaviour, or people who might be exposed to it. Instead, the onus would be on platforms to make those risk assessments and take the middle ground, where that is a reasonable and proportionate step.
I will come to that in a second. The hon. Member for Luton North talked about putting the onus on the victim. Any element of choice is there for adults; the children will be protected anyway, as I will outline in a second. We all agree that the primary purpose of the Bill is to be a children’s protection measure.
Ofcom will set out in codes of practice the specific steps that providers can take to protect children who are using their service, and the Government expect those to include steps relating to children’s access to high-risk features, such as livestreaming or private messaging. Clause 11(4)(d) sets out that providers may be required to take measures in the following areas:
“policies on user access to the service or to particular content present on the service, including blocking users from accessing the service or particular content”.
The other areas listed are intentionally broad categories that allow for providers to take specific measures. For example, a measure in the area of blocking user access to particular content could include specific measures that restrict children’s access to parts of a service, if that is a proportionate way to stop users accessing that type of content. It can also apply to any of the features of a service that enable children to access particular content, and could therefore include children’s access to livestreaming and private messaging features. In addition, the child safety duties make it clear that providers need to use proportionate systems and processes that prevent children from encountering primary priority content that is harmful to them, and protect children and age groups at risk of harm from other content that is harmful to them.
While Ofcom will set out in codes of practice the steps that providers can take to meet these duties, we expect those steps, as we have heard, to include the use of age verification to prevent children accessing content that poses the greatest risk of harm to them. To meet that duty, providers may use measures that restrict children from accessing parts of the service. The Bill therefore allows Ofcom to require providers to take that step where it is proportionate. I hope that that satisfies the hon. Member for Aberdeen North, and gives her the direction that she asked for—that is, a direction to be more specific that Ofcom does indeed have the powers that she seeks.
The Bill states that we can expect little impact on child protection before 2027-28 because of the enforcement road map and when Ofcom is planning to set that out. Does the Minister not think that in the meantime, that sort of ministerial direction would be helpful? It could make Ofcom’s job easier, and would mean that children could be protected online before 2027-28.
The ministerial direction that the various platforms are receiving from the Dispatch Box, from our conversations with them and from the Bill’s progress as it goes through the House of Lords will be helpful to them. We do not expect providers to wait until the very last minute to implement the measures. They are starting to do so now, but we want them to go further, quicker.
Government amendment 4 will require providers who already have a minimum age requirement for access to their service, or parts of it, to give details of the measures that they use to restrict access in their terms of service and apply them consistently. Providers will also need to provide age-appropriate protections for children using their service. That includes protecting children from harmful content and activity on their service, as well as reviewing children’s use of higher-risk features, as I have said.
To meet the child safety risk assessment duties in clause 10, providers must assess: the risk of harm to children from functionalities that facilitate the presence or dissemination of harmful content; the level of risk from different kinds of harmful content, giving separate consideration to children in different age groups; the different ways in which the service is used, and the impact of such use on the level of risk of harm; and how the design and operation of the service may increase the risks identified.
The child safety duties in clause 11 apply across all areas of the service, including the way it is operated and used by children, as well as the content present on the service. For the reasons I have set out, I am not able to accept the amendments, but I hope that the hon. Member for Aberdeen North will take on board my assurances.
I cannot help but see the Government’s planned removal of clauses 12 and 13 as essentially wrecking amendments to the Bill. Taking those provisions out of the Bill makes it a Bill not about online safety, but about child protection. We have not had five years or so of going backwards and forwards, and taken the Bill through Committee and then unprecedentedly recommitted it to Committee, in order to fundamentally change what the Bill set out to do. The fact that, at this late stage, the Government are trying to take out these aspects of the Bill melts my head, for want of a better way of putting it.
My hon. Friend the Member for Batley and Spen was absolutely right when she talked about what clauses 12 and 13 do. In effect, they are an acknowledgement that adults are also harmed online, and have different experiences online. I strongly agree with the hon. Member for Aberdeen North about this not being the protect MPs from being bullied on Twitter Bill, because obviously the provisions go much further than that, but it is worth noting, in the hope that it is illustrative to Committee members, the very different experience that the Minister and I have in using Twitter. I say that as a woman who is LGBT and Jewish—and although I would not suggest that it should be a protected characteristic, the fact that I am ginger probably plays a part as well. He and I could do the same things on Twitter on the same day and have two completely different experiences of that platform.
The risk-assessment duties set out in clause 12, particularly in subsection (5)(d) to (f), ask platforms to consider the different ways in which different adult users might experience them. Platforms have a duty to attempt to keep certain groups of people, and categories of user, safe. When we talk about free speech, the question is: freedom of speech for whom, and at what cost? Making it easier for people to perpetuate, for example, holocaust denial on the internet—a category of speech that is lawful but awful, as it is not against the law in this country to deny that the holocaust happened—makes it much less likely that I, or other Jewish people, will want to use the platform.
The hon. Member makes a powerful point about the different ways in which people experience things. That tips over into real-life abusive interactions, and goes as far as terrorist incidents in some cases. Does she agree that protecting people’s freedom of expression and safety online also protects people in their real, day-to-day life?
I could not agree more. I suppose that is why this aspect of the Bill is so important, not just to me but to all those categories of user. I mentioned paragraphs (d) to (f), which would require platforms to assess exactly that risk. This is not about being offended. Personally, I have the skin of a rhino. People can say most things to me and I am not particularly bothered by it. My concern is where things that are said online are transposed into real-life harms. I will use myself as an example. Online, we can see antisemitic and conspiratorial content, covid misinformation, and covid misinformation that meets with antisemitism and conspiracies. When people decide that I, as a Jewish Member of Parliament, am personally responsible for George Soros putting a 5G chip in their arm, or whatever other nonsense they have become persuaded by on the internet, that is exactly the kind of thing that has meant people coming to my office armed with a knife. The kind of content that they were radicalised by on the internet led to their perpetrating a real-life, in-person harm. Thank God—Baruch Hashem—neither I nor my staff were in the office that day, but that could have ended very differently, because of the sorts of content that the Bill is meant to protect online users from.
The hon. Lady is talking about an incredibly important issue, but the Bill covers such matters as credible threats to life, incitement to violence against an individual, and harassment and stalking—those patterns of behaviour. Those are public order offences, and they are in the Bill. I would absolutely expect companies to risk-assess for that sort of activity, and to be required by Ofcom to mitigate it. On her point about holocaust denial, first, the shield will mean that people can protect themselves from seeing stuff. The further question would be whether we create new offences in law, which can then be transposed across.
I accept the points that the hon. Member raised, but he is fundamentally missing the point. The categories of information and content that these people had seen and been radicalised by would not fall under the scope of public order offences or harassment. The person was not sending me harassing messages before they turned up at my office. Essentially, social media companies and other online platforms have to take measures to mitigate the risk of categories of offences that are illegal, whether or not they are in the Bill. I am talking about what clauses 12 and 13 covered, whether we call it the “legal but harmful” category or “lawful but awful”. Whatever we name those provisions, by taking out of the Bill clauses relating to the “legal but harmful” category, we are opening up an area of harm that already exists, that has a real-world impact, and that the Bill was meant to go some way towards addressing.
The provisions have taken out the risk assessments that need to be done. The Bill says,
“(e) the level of risk of functionalities of the service facilitating the presence or dissemination of priority content that is harmful to adults, identifying and assessing those functionalities that present higher levels of risk;
(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by adults;
(g) the nature, and severity, of the harm that might be suffered by adults”.
Again, the idea that we are talking about offence, and that the clauses need to be taken out to protect free speech, is fundamentally nonsense.
I have already mentioned holocaust denial, but it is also worth mentioning health-related disinformation. We have already seen real-world harms from some of the covid misinformation online. It led to people including Piers Corbyn turning up outside Parliament with a gallows, threatening to hang hon. Members for treason. Obviously, that was rightly dealt with by the police, but the kind of information and misinformation that he had been getting online and that led him to do that, which is legal but harmful, will now not be covered by the Bill.
I will also raise an issue I have heard about from a number of people dealing with cancer and conditions such as multiple sclerosis. People online try to discourage them from accessing the proper medical interventions for their illnesses, and instead encourage them to take more vitamin B or adopt a vegan diet. There are people who have died because they had cancer but were encouraged online to not access cancer treatment because they were subject to lawful but awful categories of harm.
I wonder if the hon. Member saw the story online about the couple in New Zealand who refused to let their child have a life-saving operation because they could not guarantee that the blood used would not be from vaccinated people? Is the hon. Member similarly concerned that this has caused real-life harm?
I am aware of the case that the hon. Member mentioned. I appreciate that I am probably testing the patience of everybody in the Committee Room, but I want to be clear just how abhorrent I find it that these provisions are coming out of the Bill. I am trying to be restrained, measured and reasonably concise, but that is difficult when there are so many parts of the change that I find egregious.
My final point is on self-harm and suicide content. For men under the age of 45, suicide is the biggest killer. In the Bill, we are doing as much as we can to protect young people from that sort of content. My real concern is this: many young people are being protected by the Bill’s provisions relating to children. They are perhaps waiting for support from child and adolescent mental health services, which are massively oversubscribed. The minute they tick over into 18, fall off the CAMHS waiting list and go to the bottom of the adult mental health waiting list—they may have to wait years for treatment of various conditions—there is no requirement or duty on the social media companies and platforms to do risk assessments.
ONLINE SAFETY BILL (Second sitting) Debate
(1 year, 10 months ago)
Public Bill Committees
I remind the Committee that with this we are discussing the following:
Clause 13 stand part.
Government amendments 18, 23 to 25, 32, 33 and 39.
Clause 55 stand part.
Government amendments 42 to 45, 61 to 66, 68 to 70, 74, 80, 85, 92, 51 and 52, 54, 94 and 60.
It is a pleasure to serve under your chairship, Dame Angela. I did not make a note of the specific word I was on when we adjourned, so I hope Hansard colleagues will forgive me if the flow between what I said previously and what I say now is somewhat stilted.
I will keep this brief, because I was—purposefully—testing the patience of the Minister with some of my contributions. However, I did so to hammer home the fact that the removal of clauses 12 and 13 from the Bill is a fatal error. If the recommittal of the Bill is not to fundamentally undermine what the Bill set out to do five years or so ago, their removal should urgently be reconsidered. We have spent five years debating the Bill to get it to this point.
As I said, there are forms of harm that are not illegal, but they are none the less harmful, and they should be legislated for. They should be in the Bill, as should specific protections for adults, not just children. I therefore urge the Minister to keep clauses 12 and 13 in the Bill so that we do not undermine what it set out to do and all the work that has been done up to this point. Inexplicably, the Government are trying to undo that work at this late stage before the Bill becomes law.
It is a pleasure to see you in the Chair, Dame Angela—I wish it was a toastier room. Let me add to the points that the shadow Minister, my hon. Friend the Member for Pontypridd, made so powerfully about vulnerable people. There is no cliff edge when such a person becomes 18. What thought have the Minister and the Department given to vulnerable young adults with learning disabilities or spectrum disorders? Frankly, the idea that, as soon as a person turns 18, they are magically no longer vulnerable is for the birds—particularly when it comes to eating disorders, suicide and self-harm.
Adults do not live in isolation, and they do not just live online. We have a duty of care to people. The perfect example is disinformation, particularly when it comes to its harmful impact on public health. We saw that with the pandemic and vaccine misinformation. We saw it with the harm done to children by the anti-vaccine movement’s myths about vaccines, children and babies. It causes greater harm than just having a conversation online.
People do not stay in one lane. Once people start being sucked into conspiracy myths, much as we discussed earlier around the algorithms that are used to keep people online, it has to keep ramping up. Social media and tech companies do that very well. They know how to do it. That is why I might start looking for something to do with ramen recipes and all of a sudden I am on to a cat that has decided to make noodles. It always ramps up. That is the fun end of it, but on the serious end somebody will start to have doubts about certain public health messages the Government are sending out. That then tips into other conspiracy theories that have really harmful, damaging consequences.
I saw that personally. My hon. Friend the Member for Warrington North eloquently put forward some really powerful examples of what she has been subjected to. With covid, some of the anti-vaccinators and anti-mask-wearers who targeted me quickly slipped into Sinophobia and racism. I was sent videos of people eating live animals, and being blamed for a global pandemic.
The people who have been targeted do not stay in one lane. The idea that adults are not vulnerable, and susceptible, to such targeting and do not need protection from it is frankly for the birds. We see that particularly with extremism, misogyny and the incel culture. I take the point from our earlier discussion about who determines what crosses the legal threshold, but why do we have to wait until somebody is physically hurt before the Government act?
That is really regrettable. So, too, is the fact that this is such a huge U-turn in policy, with 15% of the Bill coming back to Committee. As we have heard, that is unprecedented, and yet, on the most pivotal point, we were unable to hear expert advice, particularly from the National Society for the Prevention of Cruelty to Children, Barnardo’s and the Antisemitism Policy Trust. I was struggling to understand why we would not hear expert advice on such a drastic change to an important piece of legislation—until I heard the hon. Member for Don Valley talk about offence. This is not about offence; it is about harm.
The hon. Member’s comments highlighted perfectly the real reason we are all here in a freezing cold Bill Committee, rehashing work that has already been solved. The Bill was not perfect by any stretch of the imagination, but it was better than what we have today. The real reason we are here is the fight within the Conservative party.
It is a pleasure to serve under your chairmanship, Dame Angela.
A lot of the discussion has replayed the debate from day two on Report about the removal of “legal but harmful” measures. Some of the discussion this morning and this afternoon has covered really important issues such as self-harm on which, as we said on the Floor of the House, we will introduce measures at a later stage. I will not talk about those measures now, but I would just say that we have already said that if we agree that the promotion of things such as self-harm is illegal, it should be illegal. Let us be very straight about how we deal with the promotion of self-harm.
The Bill will bring huge improvements for adult safety online. In addition to their duty to tackle illegal content, companies will have to provide adult users with tools to keep themselves safer. On some of the other clauses, we will talk about the triple shield that was mentioned earlier. If the content is illegal, it will still be illegal. If content does not adhere to the companies’ terms of service—that includes many of the issues that we have been debating for the last hour—it will have to be removed. We will come to user enforcement issues in further clauses.
The Minister mentions tools for adults to keep themselves safe. Does he not think that that puts the onus on the users—the victims—to keep themselves safe? The measures as they stand in the Bill put the onus on the companies to be more proactive about how they keep people safe.
The onus on adults is very much a safety net—very much a catch-all, after we have put the onus on the social media companies and the platforms to adhere to their own terms and conditions.
We have heard a lot about Twitter and the changes to Twitter. We can see the commercial imperative for mainstream platforms, certainly the category 1 platforms, to have a wide enough catch-all in their terms of service—anything that an advertiser, for example, would see as reasonably sensible—to be able to remain a viable platform in the first place. When Elon Musk first started making changes at Twitter, a comment did the rounds: “How do you build a multimillion-dollar company? You sell it to Elon Musk for $44 billion.” He made that change. He has seen the bottom falling out of his market and has lost a lot of the cash he put into Twitter. That is the commercial impetus that underpins a lot of the changes we are making.
One of the examples I alluded to, which is particularly offensive for Jewish people, LGBT people and other people who were persecuted in the Nazi holocaust, is holocaust denial. Does the Minister seriously think that it is only Jewish people, LGBT people and other people who were persecuted in the holocaust who find holocaust denial offensive and objectionable and who do not want to see it as part of their online experience? Surely having these sorts of safety nets in place and saying that we do not think that certain kinds of content—although they may not be against the law—have a place online protects everyone’s experience, whether they are Jewish or not. Surely, no one wants to see holocaust denial online.
No, but there is freedom of expression to a point—when it starts to reach into illegality. We have to have the balance right: someone can say something in public—in any setting offline—but what the hon. Lady is suggesting is that, as soon as they hit a keyboard or a smartphone, there are two totally different regimes. That is not getting the balance right.
The Minister says that we should have freedom of speech up to a point. Does that point include holocaust denial? He has just suggested that if something is acceptable to say in person, which I do not think holocaust denial should be, it should be acceptable online. Surely holocaust denial is objectionable whenever it happens, in whatever context—online or offline.
I have been clear about where I set the line. [Interruption.] I have said that if something is illegal, it is illegal. The terms of service of the platforms largely cover the list that we are talking about. As my hon. Friend the Member for Folkestone and Hythe and I have both said, the terms of service of the vast majority of platforms—the big category 1 platforms—set a higher bar than was in our original Bill. The hon. Member for Luton North talked about whether we should have more evidence. I understand that the pre-legislative scrutiny committee heard evidence and came to a unanimous conclusion that the “legal but harmful” conditions should not be in the Bill.
I have talked a little already about these amendments, so let me sum up where I think we are. I talked about harmful health content and why it is not included. The Online Safety Bill will force social media companies to tackle health misinformation and disinformation online, where it constitutes a criminal offence. It includes the communications offence, which would capture posts encouraging dangerous hoax cures, where the sender knows the information to be false and intends to cause harm, such as encouraging drinking bleach to cure cancer, which we heard about a little earlier.
The legislation is only one part of the wider Government approach to this issue. It includes the work of the counter-disinformation unit, which brings together cross-Government monitoring and analysis capabilities and engages with platforms directly to ensure that appropriate action is taken, in addition to the Government’s work to build users’ resilience to misinformation through media literacy.
Including harmful health content as a category risks requiring companies to apply the adult user empowerment tools to an unfeasibly large volume of content—way beyond just the vaccine efficacy that was mentioned. That has implications both for regulatory burden and for freedom of expression, as it may capture important health advice. Similarly, on climate change, the Online Safety Bill itself will introduce new transparency, accountability and free speech duties on category 1 services. If a platform said that certain types of content are not allowed, it will be held to account for their removal.
We recognised that there was a heightened risk of disinformation surrounding the COP26 summit. The counter-disinformation unit led by the Department for Digital, Culture, Media and Sport brought together monitoring and analysis capabilities across Government to understand disinformation that posed a risk to public safety or to delegates or that represented attempts at interference from malign actors. We are clear that free debate is essential to a democracy and that the counter-disinformation unit should not infringe upon political debate. Government already work closely with the major social media platforms to encourage them to collaborate at speed to remove disinformation as per their terms of service.
Amendment (a) to amendment 15 and amendment (a) to amendment 16 would create that new category of content that incites hateful extremism. That is closely aligned with the approach that the Government are already taking with amendment 15, specifically subsections (8C) and (8D), which create a category of content that is abusive or incites hate on the basis of race, religion, sex, sexual orientation, disability, or gender reassignment. Those conditions would likely capture the majority of the kinds of content that the hon. Members are seeking to capture through their hateful extremism category. For example, it would capture antisemitic abuse and conspiracy theories, racist abuse and promotion of racist ideologies.
Furthermore, where companies’ terms of service say they prohibit or apply restrictions to the kind of content listed in the Opposition amendments, companies must ensure that those terms are consistently enforced. So much of this comes back to enforcement. They must also ensure that the terms of service are easily understandable.
If this is about companies enforcing what is in their terms of service for the use of their platforms, could it not create a perverse incentive for them to have very little in their terms of service? If they will be punished for not enforcing their terms of service, surely they will want them to be as lax as possible in order to limit their legal liability for enforcing them. Does the Minister follow?
I follow, but I do not agree. The categories of content in proposed new subsections (8C) and (8D), introduced by amendment 15, underpin a lot of this. I answered the question in an earlier debate when talking about the commercial impetus. I cannot imagine many mainstream advertisers wanting to advertise with a company that removed from its terms of service the exclusion of racial abuse, misogyny and general abuse. We have seen that commercial impetus really kicking in with certain platforms. For those reasons, I am unable to accept the amendments to the amendments, and I hope that the Opposition will not press them to a vote.
I note that many providers of 4G internet, including the one I have on my own phone, already block adult content. Essentially, if people want to look at pornography or other forms of content, they have to proactively opt in to be allowed to see it. Would it not make sense to make something as straightforward as that, which already exists, into the model that we want on the internet more widely, as opposed to leaving it to EE and others to do?
I absolutely agree. Another point that has been made is that this is not creating undue burden; the Government are already creating the burden for companies—I am not saying that it is a bad burden, but the Government are already creating it. We just want people to have the opportunity to opt into it, or out of it. That is the position that we are in.
I am sure that, like me, the shadow Minister will be baffled that the Government are against our proposals to have to opt out. Surely this is something that is of key concern to the Government, given that the former MP for Tiverton and Honiton might still be an MP if users had to opt in to watching pornography, rather than being accidentally shown it when innocently searching for tractors?
My hon. Friend makes a very good point. It goes to show the nature of this as a protection for all of us, even MPs, from accessing content that could be harmful to our health or, indeed, profession. Given the nature of the amendment, we feel that this is a safety net that should be available to all. It should be on by default.
I appreciate the hon. Lady’s remarks. We have tried to ensure that the Bill is proportionate, inasmuch as the Secretary of State can designate content if there is material risk of significant harm to an appreciable number of children in the United Kingdom. The Bill also requires the Secretary of State to consult Ofcom before making regulations on the priority categories of harm.
I appreciate that this point has been made about the same wording earlier today, but I really feel that the ambiguity of “appreciable number” is something that could do with being ironed out. The ambiguity and vagueness of that wording make it very difficult to enforce the provision. Does the Minister agree that “appreciable number” is too vague to be of real use in legislation such as this?
The different platforms, approaches and conditions will necessitate different numbers; it would be hard to pin a number down. The wording is vague and wide-ranging because it is trying to capture any number of scenarios, many as yet unknown. However, the regulations designating priority harms will be made under the draft affirmative resolution procedure.
As much as I am keen on the idea of Ofcom special agents conceptually, my concern on the transparency front is that, to appoint a special agent and send them in to look at the data, Ofcom would have to have cause to believe that there was an issue of concern with the data, whereas if that data is more transparently available to the research community, researchers can proactively identify things that they can flag to Ofcom as a concern. Without that, we are relying on an annual cycle in which Ofcom is able to intervene only when it has a concern, rather than the research community, which is much better placed to make that determination, being able to keep a watching brief on the company.
That concern would be triggered by Ofcom discovering things as a consequence of user complaint. Although Ofcom is not a complaint resolution company, users can complain to it. Independent academics and researchers may produce studies and reports highlighting problems at any time, so Ofcom does not have to wait through an annual cycle of transparency reporting. At any time, Ofcom can say, “We want to have a deeper look at this problem.” It could be something Ofcom or someone else has discovered, and Ofcom can either research that itself or appoint an outside expert.
As the hon. Member for Warrington North mentioned, very sensitive information might become apparent through the transparency reporting that one might not necessarily wish to make public because it requires further investigation and could highlight a particular flaw that could be exploited by bad actors. I would hope and expect, as I think we all would, that we would have the routine publication of transparency reporting to give people assurance that the platforms are meeting their obligations. Indeed, if Ofcom were to intervene against a platform, it would probably use information gathered and received to provide the rationale for why a fine has been issued or another intervention has been made. I am sure that Ofcom will draw all the time on information gathered through transparency reporting and, where relevant, share it.
This has been a helpful debate. Everyone was right that transparency must be and is at the heart of the Bill. From when we were talking earlier today about how risk assessments and terms of service must be accessible to all, through to this transparency reporting section, it is important that we hold companies to account and that the reports play a key role in allowing users, Ofcom and civil society, including those in academia, to understand the steps that companies are taking to protect users.
Under clause 65, category 1 services, category 2A search services and category 2B user-to-user services need to publish transparency reports annually in accordance with the transparency report notice from Ofcom. That relates to the points about commerciality that my hon. Friend the Member for Folkestone and Hythe talked about. Ofcom will set out what information is required from companies in their notice, which will also specify the format, manner and deadline for the information to be provided to Ofcom. Clearly, it would not be proportionate to require every service provider within the scope of the overall regulatory framework to produce a transparency report—it is also important that we deal with capacity and proportionality—but those category threshold conditions will ensure that the framework is flexible and future-proofed.
I note what the Minister said about the commercial implications of some of these things, and some of those commercial implications might act as levers to push companies to do better on some things. By that same token, should this information not be more transparent and publicly available to give the user the choice he referred to earlier? That would mean that if a user’s data was not being properly protected and these companies were not taking the measures around safety that the public would expect, users can vote with their feet and go to a different platform. Surely that underpins a lot of what we have been talking about.
Yes, and that is why Ofcom will be the one that decides which information should be published, and from whom, to ensure that it is proportionate. At the end of the day, I have talked about the fact that transparency is at the heart of the Bill and that the transparency reports are important. To go to the original point raised by the hon. Member for Pontypridd about when these reports will be published, they will indeed be published in accordance with subsection (3)(d) of the clause.
Question put and agreed to.
Clause 65 accordingly ordered to stand part of the Bill.
Schedule 8
Transparency reports by providers of Category 1 services, Category 2A services and Category 2B services
Amendments made: 61, in schedule 8, page 203, line 13, leave out
“priority content that is harmful to adults”
and insert “relevant content”.
This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 62, in schedule 8, page 203, line 15, leave out
“priority content that is harmful to adults”
and insert “relevant content”.
This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 63, in schedule 8, page 203, line 17, leave out
“priority content that is harmful to adults”
and insert “relevant content”.
This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 64, in schedule 8, page 203, line 21, leave out from “or” to end of line 23 and insert “relevant content”.
This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about user reporting of content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 65, in schedule 8, page 203, line 25, leave out
“priority content that is harmful to adults”
and insert “relevant content”.
This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 66, in schedule 8, page 203, line 29, leave out
“priority content that is harmful to adults”
and insert “relevant content”.
This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 67, in schedule 8, page 203, line 41, at end insert—
“11A Measures taken or in use by a provider to comply with any duty set out in section (Duty not to act against users except in accordance with terms of service) or (Further duties about terms of service) (terms of service).”
This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about measures taken to comply with the new duties imposed by NC3 and NC4.
Amendment 68, in schedule 8, page 204, line 2, leave out from “illegal content” to end of line 3 and insert
“or content that is harmful to children—”.
This amendment removes the reference to content that is harmful to adults, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 69, in schedule 8, page 204, line 10, leave out from “illegal content” to “, and” in line 12 and insert
“and content that is harmful to children”.
This amendment removes the reference to content that is harmful to adults, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 70, in schedule 8, page 204, line 14, leave out from “illegal content” to “present” in line 15 and insert
“and content that is harmful to children”.
This amendment removes the reference to content that is harmful to adults, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).
Amendment 71, in schedule 8, page 205, line 38, after “Part 3” insert
“or Chapters 1 to 2A of Part 4”.—(Paul Scully.)
This amendment requires OFCOM, in considering which information to require from a provider in a transparency report, to consider whether the provider is subject to the duties imposed by Chapter 2A, which is the new Chapter expected to be formed by NC3 to NC6 (and Chapter 1 of Part 4).
ONLINE SAFETY BILL (Third sitting) Debate
(1 year, 10 months ago)
Public Bill Committees

Labour welcomes clause 207, which outlines the commencement and transitional provisions for the Bill to effectively come into existence. The Minister knows that Labour is concerned about the delays that have repeatedly held up the Bill’s progress, and I need not convince him of the urgent need for it to pass. I think contributions in Committee, plus those from colleagues across the House as the Bill has progressed, speak for themselves. The Government have repeatedly claimed they are committed to keeping children safe online, but have repeatedly failed to bring forward this legislation. We must now see commitments from the Minister that the Bill, once enacted, will make a difference right away.
Labour has specific concerns shared with stakeholders, from the Age Verification Providers Association to the Internet Watch Foundation, the NSPCC and many more, about the road map going forward. Ofcom’s plan for enforcement already states that it will not begin enforcement on harm to children from user-to-user content under part 3 of the Bill before 2025. Delays to the Bill as well as Ofcom’s somewhat delayed enforcement plans mean that we are concerned that little will change in the immediate future or even in the short term. I know the Minister will stand up and say that if the platforms want to do the right thing, there is nothing stopping them from doing so immediately, but as we have seen, they need convincing to take action when it counts, so I am not convinced that platforms will do the right thing.
If the Government’s argument is that there is nothing to stop platforms taking such actions early, why are we discussing the Bill at all? Platforms have had many years to implement such changes, and the very reason we need this Bill is that they have not done so.
Exactly. My hon. Friend makes an incredibly important point that goes to the heart of why we are here in the first place. If the platforms were not motivated by commercial interest and we could trust them to do the right thing on keeping children safe and reducing harm on their platforms, we would not require this legislation in the first place. But sadly, we are where we are, which is why it is even more imperative that we get on with the job, that Ofcom is given the tools to act swiftly and that the delay before its powers come into effect is kept to a minimum, so that this legislation actually makes a lasting difference.
Ofcom has already been responsible for regulating video-sharing platforms for two years, yet still, despite being in year 3, it is only asking websites to provide a plan as to how they will be compliant. The reality is that we can expect little on child protection before 2027-28, which creates a massive gap between when the Bill is passed and when the public expect it to take effect. We raised these concerns last time, and I took little assurance from the Minister then in post, so I wonder whether the current Minister can improve on his predecessor by setting out a short timeline for exactly when the Bill can be implemented and Ofcom can act.
We all understand the need for the Bill, which my hon. Friend the Member for Warrington North just pointed out. That is why we have been supportive in Committee and throughout the passage of the Bill. But the measures that the Bill introduces must come into force as soon as is reasonably possible. Put simply, the industry is ready, and users want to be protected online and are ready too. Sadly, it is only the Government and the regulator that would potentially be holding up implementation of the legislation.
The Minister has failed to concede on any of the issues that we have raised in Committee, despite being sympathetic and supportive. His predecessor was also incredibly supportive and sympathetic on everything we raised in Committee, yet failed to take into account a single amendment or issue that we raised. I therefore make a plea to this Minister to at least recognise the need to press ahead, and the timescale that is needed here. We have not sought to formally amend this clause, so I seek the Minister’s assurance that this legislation will be dealt with swiftly. I urge him to work with Labour, SNP colleagues and colleagues across the House to ensure that the legislation and the provisions in it are enacted and that there are no further unnecessary delays.