Debates between Kim Leadbeater and Alex Davies-Jones during the 2019-2024 Parliament

Tue 17th Jan 2023
Online Safety Bill

Tue 13th Dec 2022
ONLINE SAFETY BILL (Second sitting)
Public Bill Committees
Committee stage (re-committed clauses and schedules): 2nd sitting

Tue 13th Dec 2022
ONLINE SAFETY BILL (First sitting)
Public Bill Committees
Committee stage (re-committed clauses and schedules): 1st sitting

Tue 21st Jun 2022
Online Safety Bill (Fourteenth sitting)
Public Bill Committees
Committee stage

Tue 14th Jun 2022
Online Safety Bill (Tenth sitting)
Public Bill Committees
Committee stage

Thu 26th May 2022
Online Safety Bill (Third sitting)
Public Bill Committees
Committee stage: 3rd sitting & Committee Debate - 3rd sitting

Online Safety Bill

Debate between Kim Leadbeater and Alex Davies-Jones
Tuesday 17th January 2023
Kim Leadbeater (Batley and Spen) (Lab)

Does my hon. Friend agree that, as we discussed in the Bill Committee, there is clear evidence that legal but harmful content is often the gateway to far more dangerous radicalisation and extremism, be it far-right, Islamist, incel or other? Will she therefore join me in supporting amendment 43 to ensure that by default such content is hidden from all adult users?

Alex Davies-Jones

I completely support my hon. Friend’s comments and I was pleased to see her champion that cause in the Bill Committee. Of course I support amendment 43, tabled in the names of SNP colleagues, to ensure that the toggle is on by default. Abhorrent material is being shared and amplified—that is the key point, amplified—online by algorithms and by the processes and systems in place. It is obvious that the Government just do not get that. That said, there is a majority in Parliament and in the country for strengthening the Online Safety Bill, and Labour has been on the front foot in arguing for a stronger Bill since First Reading last year.

It is also important to recognise the sheer number of amendments and changes we have seen to the Bill so far. Even today, there are many more amendments tabled by the Government. If that does not give an indication of the mess they have made of getting this legislation over the line in a fit and proper state, I do not know what does.

I have said it before, and I am certain I will say it again, but we need to move forward with this Bill, not backward. That is why, despite significant Government delay, we will support the Bill’s Third Reading, as each day of inaction allows more harm to spread online. With that in mind, I too will make some progress.

I will first address new clause 1, tabled in my name and that of my hon. Friend the Member for Manchester Central (Lucy Powell). This important addition to the Bill will go some way to address the gaps around support for individual complaints. We in the Opposition have repeatedly queried Ministers and the Secretary of State on the mechanisms available for individuals who have appeals or complaints. That is why new clause 1 is so important. It is vital that platforms’ complaints procedures are fit for purpose, and this new clause will finally see the Secretary of State publishing a report on the options available to individuals.

We already know that the Bill in its current form fails to consider an appropriate avenue for individual complaints. This is a classic case of David and Goliath, and it is about time those platforms went further in giving their users a transparent, effective complaints process. That substantial lack of transparency underpins so many of the issues Labour has with the way the Government have handled—or should I say mishandled—the Bill so far, and it makes the process by which the Government proceeded to remove the all-important clauses on legal but harmful content, in a quiet room on Committee Corridor just before Christmas, even more frustrating.

That move put the entire Bill at risk. Important sections that would have put protections in place to prevent content such as health and foreign-state disinformation, the promotion of self-harm, and online abuse and harassment from being actively pushed and promoted were rapidly removed by the Government. That is not good enough, and it is why Labour has tabled a series of amendments, including new clauses 4, 5, 6 and 7, that we think would go some way towards correcting the Government’s extremely damaging approach.

Under the terms of the Bill as currently drafted, platforms could set whatever terms and conditions they want and change them at will. We saw that in Elon Musk’s takeover of Twitter, when he lifted the ban on covid disinformation overnight because of his own personal views. Our intention in tabling new clause 4 is to ensure that platforms are not able to simply avoid safety duties by changing their terms and conditions whenever they see fit. This group of amendments would give Ofcom the power to set minimum standards for platforms’ terms and conditions, and to direct platforms to change them if they do not meet those standards.

ONLINE SAFETY BILL (Second sitting)

Debate between Kim Leadbeater and Alex Davies-Jones
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Alex Davies-Jones

As we know, this clause requires providers of relevant services to publish annual transparency reports and sets out Ofcom’s powers in relation to those reports. The information set out in transparency reports is intended to help users to understand the steps that providers are taking to help keep them safe and to provide Ofcom with the information required to hold them to account.

These duties on regulated services are very welcome indeed. Labour has long held the view that mandatory transparency reporting and reporting mechanisms are vital to hold platforms to account, and to understand the true nature of how online harm is driven and perpetuated on the internet.

I will reiterate the concerns raised in previous Committee sittings about the frequency of these transparency reports. I note that, sadly, the requirement remains unchanged and the reports will therefore only have to be submitted to Ofcom annually. It is important that the Minister truly considers the rapid rate at which the online world can change and develop, so I urge him to reconsider this point and to require the reports twice a year. Labour firmly believes that increasing the frequency of the transparency reports will ensure that platforms and services keep their finger on the pulse, and are forced to be aware of and act on emergent risks. In turn, that would compel Ofcom to do the same in its role as an industry regulator.

I must also put on the record some of our concerns about subsections (12) and (13), which state that the Secretary of State of the day could amend by regulation the frequency of the transparency reporting, having consulted Ofcom first. I hope that the Minister can reassure us that this approach will not result in our ending up in a position where, perhaps because of Ofcom’s incredible workload, transparency reporting becomes even less frequent than an annual occurrence. We need to see more transparency, not less, so I really hope that he can reassure me on this particular point.

Kim Leadbeater

Does my hon. Friend agree that transparency should be at the heart of this Bill and that the Government have missed an opportunity to accelerate the inclusion of a provision in the Bill, namely the requirement to give researchers and academics access to platform data? Data access must be prioritised in the Bill and without such prioritisation the UK will fall behind the rest of Europe in safety, research and innovation. The accessibility and transparency of that data from a research perspective are really important.

Alex Davies-Jones

I completely agree with my hon. Friend. We both made the point at length in previous sittings of the Committee about the need to ensure transparency, access to the data, and access to reporting for academics, civil society and researchers.

That also goes to a point that we have all discussed in this place at length, although it is not in this Committee’s or this Minister’s gift to determine it: the potential requirement for a standalone Committee specifically to consider online harm. Such a Committee would look at whether this legislation is doing what we need it to do, whether it needs to be reviewed and whether it is actually having an impact, and it could examine the annual reports from Ofcom to determine the length and breadth of harm on the internet. That all goes to the heart of transparency, openness and the review that we have been talking about.

I want to go further and raise concerns about how public the reports will be, as we have touched on. The Government claim that their so-called triple shield approach will give users of platforms and services more power and knowledge to understand the harms that they may discover online. That is in direct contradiction to the Bill’s current approach, which does not provide any clarity about exactly how the transparency reports will be made available to the public. In short, we feel that the Government are missing a significant opportunity. We have heard many warnings about what can happen when platforms are able to hide behind a veil of secrecy. I need only point to the revelations of whistleblowers, including Frances Haugen, to highlight the importance of that point.

As the Bill stands, once Ofcom has issued a notice, companies will have to produce a transparency report that

“must…be published in the manner and by the date specified in the notice”.

I want to press the Minister on that and ask him to clarify the wording. We are keen for the reports to be published publicly and in an accessible way, so that users, civil society, researchers and anyone else who wants to see them can make sense of them. The information contained in the transparency reports is critical to analysing trends and harms, so I hope that the Minister will clarify those points in his response.

Kim Leadbeater

Does my hon. Friend agree that if the Government are to achieve their objective—which we all share—for the Bill to be world-leading legislation, we cannot rely on whistleblowers to tell us what is really going on in the online space? That is why transparency is vital. This is the perfect opportunity to provide that transparency, so that we can do some proper research into what is going on out there. We cannot rely on whistleblowers to give us such information.

Alex Davies-Jones

My hon. Friend is absolutely right. We want the Bill to work. We have always wanted the Bill to work. We want it to achieve its aim of keeping children, adults and everyone who uses the internet safe from the harms that are perpetuated there. Without transparency, how will we know whether the platforms are covertly breaking the rules, hiding content and getting round them? That is what they do; we know it, because we have heard it from whistleblowers, but we cannot rely on whistleblowers alone to reveal exactly what happens behind the closed doors of the platforms.

We need the transparency and the reports to be made public, so that we can see whether the legislation is working. If that does not happen, although we have waited five years, we will need another piece of legislation to fix it. We know that the Bill is not perfect, and the Minister knows that—he has said so himself—but, ultimately, we need to know that it works. If it does not, we have a responsibility as legislators to put something in place that does. Transparency is the only way in which we will figure that out.

ONLINE SAFETY BILL (First sitting)

Debate between Kim Leadbeater and Alex Davies-Jones
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Alex Davies-Jones

Clause 12 is extremely important because it outlines the platforms’ duties in relation to keeping adults safe online. The Government’s attempts to remove the clause through an amendment that thankfully has not been selected are absolutely shocking. In addressing Government amendments 18, 23, 24, 25, 32, 33 and 39, I must ask the Minister: exactly how will this Bill do anything to keep adults safe online?

In the original clause 12, companies had to assess the risk of harm to adults and the original clause 13 outlined the means by which providers had to report these assessments back to Ofcom. This block of Government amendments will make it impossible for any of us—whether that is users of a platform or service, researchers or civil society experts—to understand the problems that arise on these platforms. Labour has repeatedly warned the Government that this Bill does not go far enough to consider the business models and product design of platforms and service providers that contribute to harm online. By tabling this group of amendments, the Government are once again making it incredibly difficult to fully understand the role of product design in perpetuating harm online.

We are not alone in our concerns. Colleagues from Carnegie UK Trust, who are a source of expertise to hon. Members across the House when it comes to internet regulation, have raised their concerns over this grouping of amendments too. They have raised specific concerns about the removal of the transparency obligation, which Labour has heavily pushed for in previous Bill Committees.

Previously, service providers had been required to inform customers of the harms that their risk assessment had detected, but the removal of this risk assessment means that users and consumers will not have the information to assess the nature of the risks on the platform. The Minister may point to the Government’s approach in relation to the new content duties in platforms’ and providers’ terms of service, but we know that there are risks arising from the fact that there is no minimum content specified for the terms of service for adults, although of course all providers will have to comply with the illegal content duties.

This approach, like the entire Bill, is already overly complex—that is widely recognised by colleagues across the House and is the view of many stakeholders too. In tabling this group of amendments, the Minister is showing his ignorance. Does he really think that all vulnerabilities to harm online simply disappear at the age of 18? By pushing these amendments, which seek to remove the protections for adults from legal but harmful content, the Minister is, in effect, suggesting that adults are not susceptible to harm and that risk assessments are therefore simply not required. That is an extremely narrow-minded view to take, so I must push the Minister further. Does he recognise that many young, and older, adults are still highly likely to be impacted by suicide and self-harm messaging, eating disorder content, disinformation and abuse, which will all be untouched by these amendments?

Labour has been clear throughout the passage of the Bill that we need to see more, not less, transparency and protection from online harm for all of us—whether adults or children. These risk assessments are absolutely critical to the success of the Online Safety Bill and I cannot think of a good reason why the Minister would not support users in being able to make an assessment about their own safety online.

We have supported the passage of the Bill, as we know that keeping people safe online is a priority for us all and we know that the perfect cannot be the enemy of the good. The Government have made some progress towards keeping children safe, but they clearly do not consider it their responsibility to do the same for adults. Ultimately, platforms should be required to protect everyone: it does not matter whether they are a 17-year-old who falls short of being legally deemed an adult in this country, an 18-year-old or even an 80-year-old. Ultimately, we should all have the same protections and these risk assessments are critical to the online safety regime as a whole. That is why we cannot support these amendments. The Government have got this very wrong and we have genuine concerns that this wholesale approach will undermine how far the Bill will go to truly tackling harm online.

I will also comment on clause 55 and the other associated amendments. I will keep my comments brief, as the Minister is already aware of my significant concerns over his Department’s intention to remove adult safety duties more widely. In the previous Bill Committee, Labour made it clear that it supports, and thinks it most important, that the Bill should clarify specific content that is deemed to be harmful to adults. We have repeatedly raised concerns about missing harms, including health misinformation and disinformation, but this group of amendments once again touches on widespread concerns that the Government’s new approach will leave adults worse off online. The Government’s removal of the “legal but harmful” sections of the Online Safety Bill is a major weakening—not a strengthening—of the Bill. Does the Minister recognise that the only people celebrating these decisions will be the executives of big tech firms, and online abusers? Does he agree that this delay shows that the Government have bowed to vested interests over keeping users and consumers safe?

Labour is not alone in having these concerns. We are all pleased to see that child safety duties are still present in the Bill, but the NSPCC, among others, is concerned about the knock-on implications that may introduce new risks to children. Without adult safety duties in place, children will be at greater risk of harm if platforms do not identify and protect them as children. In effect, these plans will now place a significantly greater burden on platforms to protect children than adults. As the Bill currently stands, there is a significant risk of splintering user protections that can expose children to adult-only spaces and harmful content, while forming grooming pathways for offenders, too.

The reality is that these proposals to deal with harms online for adults rely on the regulator ensuring that social media companies enforce their own terms and conditions. We already know and have heard that that can have an extremely damaging impact for online safety more widely, and we have only to consider the very obvious and well-reported case study involving Elon Musk’s takeover of Twitter to really get a sense of how damaging that approach is likely to be.

In late November, Twitter stopped taking action against tweets in violation of coronavirus rules. The company had suspended at least 11,000 accounts under that policy, which was designed to remove accounts posting demonstrably false or misleading content relating to covid-19 that could lead to harm. The company operated a five-strike policy, and the impact on public health around the world of removing that policy will likely be tangible. The situation also raises questions about the platform’s other misinformation policies. As of December 2022, they remain active, but for how long remains unclear.

Does the Minister recognise that as soon as they are inconvenient, platforms will simply change their terms and conditions, and terms of service? We know that simply holding platforms to account for their terms and conditions will not constitute robust enough regulation to deal with the threat that these platforms present, and I must press the Minister further on this point.

Kim Leadbeater

My hon. Friend is making an excellent speech. I share her deep concerns about the removal of these clauses. The Government have taken this tricky issue of the concept of “legal but harmful”—it is a tricky issue; we all acknowledge that—and have removed it from the Bill altogether. I do not think that is the answer. My hon. Friend makes an excellent point about children becoming 18; the day after they become 18, they are suddenly open to lots more harmful and dangerous content. Does she also share my concern about the risks of people being drawn towards extremism, as well as disinformation and misinformation?

Alex Davies-Jones

My hon. Friend makes a valid point. This is not just about misinformation and disinformation; it is about leading people to really extreme, vile content on the internet. As we all know, that is a rabbit warren. That does not change the moment a 17-year-old turns 18—they are not suddenly immune to this horrendous content on their 18th birthday. The rules need to be there to protect all of us.

As we have heard, terms and conditions can change overnight. Stakeholders have raised the concern that, if faced with a clearer focus on their terms of service, platforms and providers may choose to make their terms of service shorter, in an attempt to cut out harmful material that, if left undealt with, they may be held liable for.

In addition, the fact that there is no minimum requirement in the regime means that companies have complete freedom to set terms of service for adults, which may not reflect the risks to adults on that service. At present, service providers do not even have to include terms of service in relation to the list of harmful content proposed by the Government for the user empowerment duties—an area we will come on to in more detail shortly as we address clause 14. The Government’s approach and overreliance on terms of service, which as we know can be so susceptible to rapid change, is the wrong approach. For that reason, we cannot support these amendments.

I would just say, finally, that none of us was happy with the term “legal but harmful”. It was a phrase we all disliked, and it did not encapsulate exactly what the content is or includes. Throwing the baby out with the bathwater is not the way to tackle that situation. My hon. Friend the Member for Batley and Spen is right that this is a tricky area, and it is difficult to get it right. We need to protect free speech, which is sacrosanct, but we also need to recognise that there are so many users on the internet who do not have access to free speech as a result of being piled on or shouted down. Their free speech needs to be protected too. We believe that the clauses as they stand in the Bill go some way to making the Bill a meaningful piece of legislation. I urge the Minister not to strip them out, to do the right thing and to keep them in the Bill.

Online Safety Bill (Fourteenth sitting)

Debate between Kim Leadbeater and Alex Davies-Jones
Committee stage
Tuesday 21st June 2022

Public Bill Committees
Alex Davies-Jones

It did not, but there were examples of disinformation, misinformation and the spreading of falsehoods, and none of these powers existed at the time. It seems weird—if I can use that term—that these exist now. Surely, the more appropriate method would be for the Secretary of State to write a letter to Ofcom to which it had to have regard. As it stands, this dangerous clause ensures the Secretary of State has the power to interfere with day-to-day enforcement. Ultimately, it significantly undermines Ofcom’s overall independence, which we truly believe should be at the heart of the Bill.

With that in mind, I will now speak to our crucial new clause 10, which instead would give Ofcom the power to take particular steps, where it considers that there is a threat to the health and safety of the public or national security, without the need for direction from the Secretary of State. Currently, there is no parliamentary scrutiny of the powers outlined in clause 146; it says only that the Secretary of State must publish their reasoning unless national security is involved. There is no urgency threshold or requirement in the clause. The Secretary of State is not required to take advice from an expert body, such as Public Health England or the National Crime Agency, in assessing reasonable grounds for action. The power is also not bounded by the Bill’s definition of harm.

These directions do two things. First, they direct Ofcom to use its quite weak media literacy duties to respond to the circumstances. Secondly, a direction turns on a power for Ofcom to ask a platform to produce a public statement about what the platform is doing to counter the circumstances or threats in the direction order—that is similar in some ways to the treatment of harm to adults. This is trying to shame a company into doing something without actually making it do it. The power allows the Secretary of State directly to target a given company. There is potential for the misuse of such an ability.

The explanatory notes say:

“the Secretary of State could issue a direction during a pandemic to require OFCOM to; give priority to ensuring that health misinformation and disinformation is effectively tackled when exercising its media literacy function; and to require service providers to report on the action they are taking to address this issue.”

Recent experience of the covid pandemic and the Russian invasion of Ukraine suggests that the Government can easily legislate when required in an emergency and can recall Parliament. The power in the Bill is a strong power, cutting through regulatory independence and targeting individual companies to achieve quite a weak effect. It is not being justified as an emergency power where the need to move swiftly is paramount. Surely, if a heavier-duty action is required in a crisis, the Government can legislate for that and explain to Parliament why the power is required in the context of a crisis.

Kim Leadbeater (Batley and Spen) (Lab)

It is really important to make sure that the Bill does not end up being a cover for the Secretary of State of the day to significantly interfere with the online space, both now and in the future. At the moment, I am not satisfied that the Secretary of State’s powers littered through the Bill are necessary. I share other hon. Members’ concerns about what this could mean for both the user experience and online safety more broadly. I hope my hon. Friend agrees that the Minister needs to provide us—not just us here today, but civil society and others who might be listening—with more reassurance that the Secretary of State’s powers really are necessary.

Alex Davies-Jones

I completely agree with my hon. Friend. We talk time and again about this Bill being world leading, but with that comes a responsibility to show global leadership. Other countries around the world will be looking to us, and to this Parliament, when they adopt their own, similar legislation, and we need to be mindful of that when deciding what powers we give to a Secretary of State—particularly powers that would override the independence of Ofcom or, for that matter, the sovereignty of Parliament.

New clause 10 provides a viable alternative. The Minister knows that this is an area where even his Back Benchers are divided. He must closely consider new clause 10 and recognise that placing power in Ofcom’s hands is an important step forward. None of us wants to see a situation where the Secretary of State is able to influence the regulator. We feel that, without this important clause and concession, the Government could be supporting a rather dangerous precedent in terms of independence in regulatory systems more widely.

Online Safety Bill (Tenth sitting)

Debate between Kim Leadbeater and Alex Davies-Jones
Committee stage
Tuesday 14th June 2022

Public Bill Committees
Alex Davies-Jones

It is absolutely right that the Government have included a commitment to children in the form of defining primary priority content that is harmful. We all know of the dangerous harms that exist online for children, and while the Opposition support the overarching aims of the Bill, we feel the current definitions do not go far enough—that is a running theme with this Bill.

The Bill does not adequately address the risks caused by the design—the functionalities and features of services themselves—or those created by malign contact with other users, which we know to be an immense problem. Research has found that online grooming of young girls has soared by 60% in the last three years—and four in five victims are girls. We also know that games increasingly have addictive gambling-style features. Those without user-to-user functionalities, such as Subway Surfers, which aggressively promotes in-app purchases, are currently out of scope of the Bill.

Lastly, research by Parent Zone found that 91% of children say that loot boxes are available in the games they play and 40% have paid to open one. That is not good enough. I urge the Minister to consider his approach to tackling harmful content and the impact that it can have in all its forms. When considering how children will be kept safe under the new regime, we should consider concerns flagged by some of the civil society organisations that work with them. Organisations such as the Royal College of Psychiatrists, The Mix, YoungMinds and the Mental Health Foundation have all been instrumental in their calls for the Government to do more. While welcoming the intention to protect children, they note that it is not clear at present how some categories of harm, including material that damages people’s body image, will be regulated—or whether it will be regulated at all.

While the Bill does take steps to tackle some of the most egregious, universally damaging material that children currently see, it does not recognise the harm that can be done through the algorithmic serving of material that, through accretion, will cause harm to children with particular mental health vulnerabilities. For example, beauty or fitness-related content could be psychologically dangerous to a child recovering from an eating disorder. Research from the Mental Health Foundation shows how damaging regular exposure to material that shows conventionally perfect images of bodies, often digitally edited and unattainable, is to children and young people.

This is something that matters to children, with 84% of those questioned in a recent survey by charity The Mix saying the algorithmic serving of content was a key issue that the Bill should address. Yet in its current form it does not give children full control over the content they see. Charities also tell us about the need to ensure that children are exposed to useful content. We suggest that the Government consider a requirement for providers to push material on social media literacy to users and to provide the option to receive content that can help with recovery where it is available, curated by social media companies with the assistance of trusted non-governmental organisations and public health bodies. We also hope that the Government can clarify that material damaging to people’s body image will be considered a form of harm.

Additionally, beyond the issue of the content itself that is served to children, organisations including YoungMinds and the Royal College of Psychiatrists have raised the potential dangers to mental health inherent in the way services can be designed to be addictive.

Kim Leadbeater

My hon. Friend raises an important point about media literacy, which we have touched on a few times during this debate. We have another opportunity here to talk about it and to say how important it is to address media literacy within the scope of the Bill. It has been removed, and I think we need to put it back into the Bill at every opportunity—I am talking about media literacy obligations for platforms to help responsibly educate children and adults about the risks online. We must not lose sight of that.

Alex Davies-Jones

I completely agree with my hon. Friend. She is right to talk about the lack of a social and digital media strategy within the Bill, and the need to educate children and adults about the harmful content that we see online. How to stay safe online in all its capacities is absolutely fundamental to the Bill. We cannot have an Online Safety Bill without teaching people how to be safe online. That is important for how children and young people interact online. We know that they chase likes and the self-esteem buzz they get from notifications popping up on their phone or device. That can be addictive, as has been highlighted by mental health and young persons’ charities.

I urge the Minister to address those issues and to consider how the Government can go further, whether through this legislation or further initiatives, to help to combat some of those issues.

--- Later in debate ---
Alex Davies-Jones

We all know that managing harmful content, unlike illegal content, is more about implementing systems that prevent people from encountering it rather than removing it entirely. At the moment, there are no duties on the Secretary of State to consult anyone other than Ofcom ahead of making regulations under clauses 53 and 54. We have discussed at length the importance of transparency, and surely the Minister can agree that the process should be widened, as we have heard from those on the Government Back Benches.

Labour has said time and again that it should not be for the Secretary of State of the day to determine what constitutes harmful content for children or adults. Without the important consultation process outlined in amendment 62, there are genuine concerns that that could lead to a damaging precedent whereby a Secretary of State, not Parliament, has the ability to determine what information is harmful. We all know that the world is watching as we seek to work together on this important Bill, and Labour has genuine concerns that without a responsible consultation process, as outlined in amendment 62, we could inadvertently be suggesting to the world that this fairly dogmatic approach is the best way forward.

Amendment 62 would require the Secretary of State to consult other stakeholders before making regulations under clauses 53 and 54. As has been mentioned, we risk a potentially dangerous course of events if there is no statutory duty on the Secretary of State to consult others when determining the definition of harmful content. Let me draw the Minister’s attention to the overarching concerns of stakeholders across the board. Many are concerned that harmful content for adults receives the least oversight, and that there are potential gaps that mean that certain content—such as animal abuse content—could completely slip through the net. The amendment is designed to ensure that sufficient consultation takes place before the Secretary of State makes important decisions in directing Ofcom.

Kim Leadbeater

On that point, I agree wholeheartedly with my hon. Friend. It is important that the Secretary of State consults campaign organisations that have expertise in the relevant areas. Much as we might want the Secretary of State to be informed on every single policy issue, that is unrealistic. It is also important to acknowledge the process that we have been through with the Bill: the expertise of organisations has been vital in some of the decisions that we have had to make. My hon. Friend gave a very good example, and I am grateful to animal welfare groups for their expertise in highlighting the issue of online abuse of animals.

Alex Davies-Jones

I completely agree with my hon. Friend. As parliamentarians we are expected to be experts in an array of fields. I do not purport to be an expert in all things—the role is more jack of all trades—and it would be impossible for one Secretary of State to be an expert in everything from animal abuse to online scam ads, from fraud to CSAM and terrorism. That is why it is fundamental that the Secretary of State consults with experts and stakeholders in those fields, for whom these things are their bread and butter—their day-to-day work. I hope the Minister can see that regulation of the online space is a huge task for us all to take on. It is Labour’s view that any Secretary of State would benefit from the input of experts in specific fields. I urge him to support the amendment, especially given the wider concerns we have about transparency and power sharing in the Bill.

It is welcome that clause 56 will force Ofcom, as the regulator, to carry out important reviews that will assess the extent to which content is harmful to children and adults when broadly appearing on user-to-user services. As we have repeatedly said, transparency must be at the heart of our approach. While Labour does not formally oppose the clause, we have concerns about subsection (5), which states:

“The reports must be published not more than three years apart.”

The Minister knows that the Bill has been long awaited, and we need to see real, meaningful change and updates now. Will he tell us why it contains a three-year provision?

--- Later in debate ---
Alex Davies-Jones

I agree with the hon. Member and welcome her intervention. We will be discussing these issues time and again during our proceedings. What is becoming even more apparent is the need to include women and girls in the Bill, call out violence against women and girls online for what it is, and demand that the Government go further to protect women and girls. This is yet another example of where action needs to happen. I hope the Minister is hearing our pleas and that this will happen at some point as we make progress through the Bill.

More needs to be done to tackle this problem. Pornography websites need to verify that every individual in pornographic videos published on their site is an adult and gave their permission for the video to be published, and enable individuals to withdraw their consent for pornography of them to remain on the site. These are rock-bottom safety measures for preventing the most appalling abuses on pornography websites.

Kim Leadbeater

I add my voice to the arguments made by my hon. Friend and the hon. Member for Aberdeen North. Violence against women and girls is a fundamental issue that the Bill needs to address. We keep coming back to that, and I too hope that the Minister hears that point. My hon. Friend has described some of the most horrific harms. Surely, this is one area where we have to be really clear. If we are to achieve anything with the Bill, this is an area that we should be working on.

Alex Davies-Jones

I wholeheartedly agree with my hon. Friend. As I have said, the amendments would put in place rock-bottom safety measures that could prevent the most appalling abuses on pornography websites, and it is a scandal that, hitherto, they have not been implemented. We have the opportunity to change that today by voting for the amendments and ensuring that these measures are in place. I urge the Minister and Conservative Members to do the right thing.

Online Safety Bill (Third sitting)

Debate between Kim Leadbeater and Alex Davies-Jones
Committee stage & Committee Debate - 3rd sitting
Thursday 26th May 2022

Public Bill Committees
Alex Davies-Jones

Q Thank you. The Antisemitism Policy Trust has made the case that search services should be eligible for inclusion as a high-risk category. Is that still your position? What is the danger, currently, of excluding them from that provision?

Danny Stone: Very much so. You heard earlier about the problems with advertising. I recognise that search services are not the same as user-to-user services, so there does need to be some different thinking. However, at present, they are not required to address legal harms, and the harms are there.

I appeared before the Joint Committee on the draft Bill and talked about Microsoft Bing, which, in its search bar, was prompting people with “Jews are” and then a rude word. You look at “Gays are”, today, and it is prompting people with “Gays are using windmills to waft homosexual mists into your home”. That is from the search bar. The first return is a harmful article. Do the same in Google, for what it’s worth, and you get “10 anti-gay myths debunked.” They have seen this stuff. I have talked to them about it. They are not doing the work to try to address it.

Last night, using Amazon Alexa, I searched “Is George Soros evil?” and the response was, “Yes, he is. According to an Alexa Answers contributor, every corrupt political event.” “Are the White Helmets fake?” “Yes, they are set up by an ex-intelligence officer.” The search prompts—the things that you are being directed to; the systems here—are problematic, because one person could give an answer to Amazon and that prompts the response. The second one, about the White Helmets, was a comment on a website that led Alexa to give that answer.

Search returns are not necessarily covered because, as I say, they are not the responsibility of the internet companies. However, the systems that those companies design—how those things are indexed, and the systems to prevent users being directed to harmful sites by default—are their responsibility, and at present the Bill does not address that. Something that forces those search companies to have appropriate risk assessments in place for the priority harms that Parliament sets, and to enforce those terms and conditions consistently, would be very wise.

Kim Leadbeater

Q Thank you to the witnesses for joining us today. The Bill contains duties to protect content of “democratic importance” and “journalistic content”. What is your view of these measures and their likely effectiveness?

Liron Velleman: These are both pretty dangerous clauses. We are very concerned about what I would probably be kind and call their unintended consequences. They are loopholes that could allow some of the most harmful and hateful actors to spread harm on social media. I will take “journalistic” first and then move on to “democratic”.

A number of companies mentioned in the previous evidence session are outlets that could be media publications just by adding a complaints system to their website. There is a far-right outlet called Urban Scoop that is run by Tommy Robinson. They just need to add a complaints system to their website and then they would be included as a journalist. There are a number of citizen journalists who specifically go to our borders to harass people who are seeking refuge in this country. They call themselves journalists; Tommy Robinson himself calls himself a journalist. These people have been specifically taken off platforms because they have repeatedly broken the terms of service of those platforms, and we see this as a potential avenue for them to make the case that they should return.

We also see mainstream publications falling foul of the terms of service of social media companies. If I take the example of the Christchurch massacre, social media companies spent a lot of time trying to take down both the livestream of the attack in New Zealand and the manifesto of the terrorist, but the manifesto was then put on the Daily Mail website—you could download the manifesto straight from the Daily Mail website—and the livestream was on the Daily Mirror and The Sun’s websites. We would be in a situation where social media companies could take that down from anyone else, but they would not be able to take it down from those news media organisations. I do not see why we should allow harmful content to exist on the platform just because it comes from a journalist.

On “democratic”, it is still pretty unclear what the definition of democratic speech is within the Bill. If we take it to be pretty narrow and just talk about elected officials and candidates, we know that far-right organisations that have been de-platformed from social media companies for repeatedly breaking the terms of service—groups such as Britain First and, again, Tommy Robinson—are registered with the Electoral Commission. Britain First ran candidates in the local elections in 2022 and they are running in the Wakefield by-election, so, by any measure, they are potentially of “democratic importance”, but I do not see why they should be allowed to break terms of service just because they happen to have candidates in elections.

If we take it on a wider scale and say that it is anything of “democratic importance”, anyone who is looking to cause harm could say, “A live political issue is hatred of the Muslim community.” Someone could argue that that or the political debate around the trans community in the UK is a live political debate, and that would allow anyone to go on the platform and say, “I’ve got 60 users and I’ve got something to say on this live political issue, and therefore I should be on the platform,” in order to cause that harm. To us, that is unacceptable and should be removed from the Bill. We do not want a two-tier internet where some people have the right to be racist online, so we think those two clauses should be removed.

Stephen Kinsella: At Clean up the Internet this is not our focus, although the proposals we have made, which we have been very pleased to see taken up in the Bill, will certainly introduce friction. We keep coming back to friction being one of the solutions. I am not wearing this hat today, but I am on the board of Hacked Off, and if Hacked Off were here, I think they would say that the solution—although not a perfect solution—might be to say that a journalist, or a journalistic outlet, will be one that has subjected itself to proper press regulation by a recognised press regulator. We could then possibly take quite a lot of this out of the scope of social media regulation and leave it where I think it might belong, with proper, responsible press regulation. That would, though, lead on to a different conversation about whether we have independent press regulation at the moment.