Online Safety Bill (Sixteenth sitting)

Committee stage
Tuesday 28th June 2022

Public Bill Committees

Amendment Paper: Public Bill Committee Amendments as at 28 June 2022
Brought up, and read the First time.
Alex Davies-Jones (Pontypridd) (Lab)

I beg to move, That the clause be read a Second time.

Good morning, Sir Roger. As my hon. Friend the Member for Worsley and Eccles South mentioned when speaking to new clause 11, Labour has genuine concerns about supply chain risk assessment duties. That is why we have tabled new clause 13, which, drawing on existing legislation, seeks to ensure enforcement of liability for supply chain failures that amount to a breach of one of the specified duties.

As we know, platforms, particularly those supporting user-to-user generated content, often employ services from third parties. At our evidence sessions we heard from Danny Stone of the Antisemitism Policy Trust that this has included Twitter explaining that racist GIFs were not its own but were provided by another service. The hands-off approach that platforms have managed to get away with for far too long is exactly what the Bill is trying to fix, yet without this important new clause we fear there will be very little change.

We have already raised issues with the reliance on third party providers more widely, particularly content moderators, but the same problems also apply to some types of content. Labour fears a scenario in which a company captured by the regulatory regime established by the Bill will argue that an element of its service is not within the ambit of the regulator simply because it is part of a supply chain, represented by, but not necessarily the responsibility of, the regulated services.

The contracted element, supported by an entirely separate company, would argue that it is providing business-to-business services. That is not user-to-user generated content per se but content designed and delivered at arm’s length, provided to the user-to-user service to deploy to its users. The result would likely be a lengthy, costly and unhelpful legal process during which systems could not be effectively regulated. The same may apply in relation to moderators, where complex contract law would need to be invoked.

We recognise that in UK legislation there are concerns and issues around supply chains. The Bribery Act 2010, for example, says that a company is liable if anyone performing services for or on the company’s behalf is found culpable of specific actions. We therefore strongly urge the Minister to consider this new clause. We hope he will see the extremely compelling reasons why liability should be introduced for platforms failing to ensure that associated parties, considered to be a part of a regulated service, help to fulfil and abide by relevant duties.

Chris Philp

The new clause seeks to impose liability on a provider where a company providing regulated services on its behalf does not comply with the duties in the Bill. The provider would be liable regardless of whether it has any control over the service in question. We take the view that this would impose an unreasonable burden on businesses and cause confusion over which companies are required to comply with the duties in the Bill.

As drafted, the Bill ensures legal certainty and clarity over which companies are subject to duties. Clause 180 makes it clear that the Bill’s duties fall on companies with control over the regulated service. The point about who is in control is very important, because the liability should follow the control. These companies are responsible for ensuring that any third parties, such as contractors or individuals involved in running the service, are complying with the Bill’s safety duties, so that they cannot evade their duties in that way.

Companies with control over the regulated service are best placed to keep users safe online, assess risk, and put in place systems and processes to minimise harm, and therefore bear the liability if there is a transgression under the Bill as drafted. Further, the Bill already contains robust provisions in clause 161 and schedule 14 that allow Ofcom to hold parent and subsidiary companies jointly liable for the actions of other companies in a group structure. These existing mechanisms promote strong compliance within groups of companies and ensure that the entities responsible for breaches are the ones held responsible. That is why we feel the Bill as drafted achieves the relevant objectives.

Question put, That the clause be read a Second time.

--- Later in debate ---
Brought up, and read the First time.
Alex Davies-Jones

I beg to move, That the clause be read a Second time.

The Chair

With this it will be convenient to discuss the following:

New clause 15—Media literacy strategy

“(1) OFCOM must prepare a strategy which sets out how they intend to undertake their duty to promote media literacy in relation to regulated user-to-user services and regulated search services under section (Duty to promote media literacy: regulated user-to-user services and search services).

(2) The strategy must—

(a) set out the steps OFCOM propose to take to achieve the pursuit of the objectives set out in section (Duty to promote media literacy: regulated user-to-user services and search services),

(b) set out the organisations, or types of organisations, that OFCOM propose to work with in undertaking the duty;

(c) explain why OFCOM considers that the steps it proposes to take will be effective;

(d) explain how OFCOM will assess the extent of the progress that is being made under the strategy.

(3) In preparing the strategy OFCOM must have regard to the need to allocate adequate resources for implementing the strategy.

(4) OFCOM must publish the strategy within the period of 6 months beginning with the day on which this section comes into force.

(5) Before publishing the strategy (or publishing a revised strategy), OFCOM must consult—

(a) persons with experience in or knowledge of the formulation, implementation and evaluation of policies and programmes intended to improve media literacy;

(b) the advisory committee on disinformation and misinformation, and

(c) any other person that OFCOM consider appropriate.

(6) If OFCOM have not revised the strategy within the period of 3 years beginning with the day on which the strategy was last published, they must either—

(a) revise the strategy, or

(b) publish an explanation of why they have decided not to revise it.

(7) If OFCOM decides to revise the strategy they must—

(a) consult in accordance with subsection (3), and

(b) publish the revised strategy.”

This new clause requires Ofcom to publish a strategy related to their duty to promote media literacy of the public in relation to regulated user-to-user services and search services.

New clause 16—Media literacy strategy: progress report

“(1) OFCOM must report annually on the delivery of the strategy required under section (Duty to promote media literacy: regulated user-to-user services and search services).

(2) The report must include—

(a) a description of the steps taken in accordance with the strategy during the year to which the report relates; and

(b) an assessment of the extent to which those steps have had an effect on the media literacy of the public in that year.

(3) The assessment referred to in subsection (2)(b) must be made in accordance with the approach set out by OFCOM in the strategy (see section (Duty to promote media literacy: regulated user-to-user services and search services) (2)(d)).

(4) OFCOM must—

(a) publish the progress report in such manner as they consider appropriate; and

(b) send a copy of the report to the Secretary of State who must lay the copy before Parliament.”

This new clause is contingent on NC15.

Alex Davies-Jones

The UK has a vast media literacy skills and knowledge gap, which leaves the population at risk of harm. Indeed, research from Ofcom found that a third of internet users are unaware of the potential for inaccurate or biased information. Similarly, about 61% of social media users who say they are confident in judging whether online content is true or false actually lack the skills to do so.

Good media literacy is our first line of defence against bad information online. It can make the difference between decisions based on sound evidence and decisions based on poorly informed opinions that can harm health and wellbeing, social cohesion and democracy. Clause 103 of the draft Bill proposed a new media literacy duty for Ofcom to replace the one in section 11 of the Communications Act 2003, but sadly the Government scrapped it from the final Bill.

Media literacy initiatives in the Online Safety Bill are now mentioned only in the context of risk assessments, but there is no active requirement for internet companies to promote media literacy. The draft Bill’s media literacy provision needed to be strengthened, not cut. New clauses 14, 15 and 16 would introduce a new, stronger media literacy duty on Ofcom, with specific objectives. They would require the regulator to produce a statutory strategy for delivering on it and then to report on progress made towards increasing media literacy under the strategy. There is no logical reason for the Minister not to accept these important new clauses or work with Labour on them.

Over the past few weeks, we have debated a huge range of issues that are being perpetuated online as we speak, from vile, misogynistic content about women and girls to state-sponsored disinformation. It is clear that the lessons have not been learned from the past few years, when misinformation was able to significantly undermine public health, most notably throughout the pandemic. Harmful and, more importantly, false statistics were circulated online, which caused significant issues in encouraging the uptake of the vaccine. We have concerns that, without a robust media literacy strategy, the consequences of misinformation and disinformation could go further.

The issues that Labour has raised about the responsibility of those at the top—the Government—have been well documented. Only a few weeks ago, we spoke about the Secretary of State actually contributing to the misinformation discourse by sharing a picture of the Labour leader that was completely out of context. How can we be in a position where those at the top are contributing to this harmful discourse? The Minister must be living in a parallel universe if he cannot see the importance of curbing these harmful behaviours online as soon as possible. He must know that media literacy is at the very heart of the Bill’s success more widely. We genuinely feel that a strengthened media literacy policy would be a huge step forward, and I sincerely hope that the Minister will therefore accept the justification behind these important new clauses.

Kirsty Blackman

I agree entirely on these new clauses. Although the Bill will make things safer, it will do that properly only if supported by proper media literacy and the upskilling of everybody who spends any portion of their lives online. They all need better media literacy, and I am not excluding myself from that. Everybody, no matter how much time they have spent online, can learn more about better ways to fact-check and assess risk, and about how services use our data.

I pay tribute to all those involved in media literacy—all the educators at all levels, including school teachers delivering it as part of the curriculum, school teachers delivering it not as part of the curriculum, and organisations such as CyberSafe Scotland in my constituency, which is working incredibly hard to upskill parents and children about the internet. They also include organisations such as the Silver City Surfers in Aberdeen, where a group of young people teaches groups of elderly people how to use the internet. All those things are incredibly helpful and useful, but we need to ensure that Ofcom is at the top of that, producing materials and taking its duties seriously. It must produce the best possible information and assistance for people so that up-to-date media literacy training can be provided.

As we have discussed before, Ofcom’s key role is to ensure that when threats emerge, it is clear and tells people, “This is a new threat that you need to be aware of,” because the internet will grow and change all the time, and Ofcom is absolutely the best placed organisation to be recognising the new threats. Obviously, it would do that much better with a user advocacy panel on it, but given its oversight and the way it will be regulating all the providers, Ofcom really needs to take this issue as seriously as it can. It is impossible to overstate the importance of media literacy, so I give my wholehearted backing to the three new clauses.

--- Later in debate ---
Chris Philp

The Government obviously recognise and support the intent behind the new clause, which is to make sure that work is undertaken by Ofcom specifically, and the Government more widely, on media literacy. That is important for the reasons laid out by the hon. Members for Aberdeen North and for Batley and Spen.

Ofcom already has a statutory duty to promote media literacy in relation to electronic media, which includes everything in scope of the Bill and more beyond. That is set out in the Communications Act 2003, so the statutory duty exists already. The duty proposed in new clause 14 is actually narrower in scope than the existing statutory duty on Ofcom, and I do not think it would be a very good idea to give Ofcom an online literacy duty with a narrower scope than the one it has already. For that reason, I will resist the amendment, because it narrows the duties rather than widens them.

I would also point out that a number of pieces of work are being done non-legislatively. The campaigns that the hon. Member for Batley and Spen mentioned—dating often, I think, back to the 1980s—were of course done on a non-legislative basis and were just as effective for it. In that spirit, Ofcom published “Ofcom’s approach to online media literacy” at the end of last year, which sets out how Ofcom plans to expand, and is expanding, its media literacy programmes, which cover many of the objectives specified in the new clause. Therefore, Ofcom itself has acted already—just recently—via that document.

Finally, I have two points about what the Government are doing. First, about a year ago the Government published their own online media literacy strategy, which has been backed with funding and is being rolled out as we speak. When it comes to disinformation more widely, which we have debated previously, we also have the counter-disinformation unit working actively on that area.

Therefore, through the Communications Act 2003, the statutory basis exists already, and on a wider basis than in these new clauses; and, through the online media literacy strategy and Ofcom’s own approach, as recently set out, this important area is well covered already.

Alex Davies-Jones

We feel that we cannot have an online safety Bill without a core digital media literacy strategy. We are disappointed that clause 103 of the draft Bill was removed. We do not feel that the current regime, under the Communications Act 2003, is robust enough. Clearly, the Government do not think it is robust enough either, which is why they tried to replace it in the first place. We are sad to see that replacement now scrapped altogether. We fully support these new clauses.

Question put, That the clause be read a Second time.

--- Later in debate ---
Kirsty Blackman

My hon. Friend the Member for Ochil and South Perthshire is not present and he had intended to move this new clause. If the Committee does not mind, I will do more reading and look at my notes more than I would normally when giving a speech.

Misinformation and disinformation arise during periods of uncertainty, either acutely, such as during a terror attack, or over a long period, as with the pandemic. That often includes information gaps and a proliferation of inaccurate claims that spread quickly. Where there is a vacuum of information, we can have bad actors or the ill-informed filling it with false information.

Information incidents are not dealt with effectively enough in the Bill, which is focused on regulating the day-to-day online environment. I accept that clause 146 gives the Secretary of State powers of direction in certain special circumstances, but their effectiveness in real time would be questionable. The Secretary of State would have to ask Ofcom to prioritise its media literacy function or to make internet companies report on what they are doing in response to a crisis. That is just too slow, given the speed at which such incidents can spread.

The new clause might involve Ofcom introducing a system whereby emerging incidents could be reported publicly and different actors could request the regulator to convene a response group. The provision would allow Ofcom to be more proactive in its approach and, in what I hope would be rare moments, to provide clear guidance. That is why the new clause is a necessary addition to the Bill.

Many times, we have seen horrendous incidents unfold on the internet, in a very different way from how they ever unfolded in newspapers, on news websites or among people talking. We have seen the untold and extreme harm that such information incidents can cause, as significant, horrific events can be spread very quickly. We could end up in a situation where an incident happens and, for example, a report spreads that a Muslim group was responsible when there is absolutely no basis of truth to that. A vacuum can be created and bad actors step into it in order to spread discrimination and lies, often about minority groups who are already struggling. That is why we move the new clause.

For the avoidance of doubt, new clause 45, which was tabled by Labour, is also to be debated in this group. I am more than happy to support it.

Alex Davies-Jones

As we know, the new clause would give Ofcom a proactive role in identifying and responding to misinformation incidents that can occur in a moment of crisis. As we have discussed, there are huge gaps in the Bill’s ability to sufficiently arm Ofcom with the tools it will likely need to tackle information incidents in real time. It is all very well that the Bill will ensure that things such as risk assessments are completed, but, ultimately, if Ofcom is not able to proactively identify and respond to incidents in a crisis, I have genuine concerns about how effective this regulatory regime will be in the wider sense. Labour is therefore pleased to support the new clause, which is fundamental to ensuring that Ofcom can be the proactive regulator that the online space clearly needs.

The Government’s methods of tackling disinformation are opaque, unaccountable and may not even work. New clause 45, which would require reporting to Parliament, may begin to address this issue. When Ministers are asked how they tackle misinformation or disinformation harms, they refer to some unaccountable civil service team involved in state-based interference in online media.

I thank those at Carnegie UK Trust for their support when researching the following list, and for supporting my team and me to make sense of the Bill. First, we have the counter-disinformation unit, which is based in the Department for Digital, Culture, Media and Sport and intends to address mainly covid issues that breach companies’ terms of service and, recently, the Russia-Ukraine conflict. In addition, the Government information cell, which is based in the Foreign, Commonwealth and Development Office, focuses on war and national security issues, including mainly Russia and Ukraine. Thirdly, there is the so-called rapid response unit, which is based in the Cabinet Office, and mainly tackles proactive counter-messaging.

Those teams appear to nudge service providers in different ways where there are threats to national security or the democratic process, or risks to public health, yet we have zero record of their effectiveness. The groups do not publish logs of action to any external authority for oversight of what they raise with companies using the privilege authority of Her Majesty’s Government, nor do they publish the effectiveness of their actions. As far as we know, they are not rooted in expert independent external advisers. That direct state interference in the media is very worrying.

In our recent debate on amendment 83, which calls on the Government to include health misinformation and disinformation in the Bill, the Minister clearly set out why he thinks the situation is problematic. He said,

“We have established a counter-disinformation unit within DCMS whose remit is to identify misinformation and work with social media firms to get it taken down. The principal focus of that unit during the pandemic was, of course, covid. In the past three months, it has focused more on the Russia-Ukraine conflict, for obvious reasons.

In some cases, Ministers have engaged directly with social media firms to encourage them to remove content that is clearly inappropriate. For example, in the Russia-Ukraine context, I have had conversations with social media companies that have left up clearly flagrant Russian disinformation. This is, therefore, an area that the Government are concerned about and have been acting on operationally already.”––[Official Report, Online Safety Public Bill Committee, 14 June 2022; c. 408.]

Until we know more about those units, the boundary between their actions and that of a press office remains unclear. In the new regulatory regime, Ofcom needs to be kept up to date on the issues they are raising. The Government should reform the system and bring those units out into the open. We support Carnegie’s longer term strategic goal to set up a new external oversight body and move the current Government functions under Ofcom’s independent supervision. The forthcoming National Security Bill may tackle that, but I will leave that for the Minister to consider.

There must be a reporting system that requires the Government to set out their operational involvement with social media companies to address misinformation and disinformation, which is why we have tabled new clause 45. I hope the Minister will see that the current efforts in these units are hugely lacking in the transparency that we all want, and which we have learned is fundamental to keeping us all safe online.

Chris Philp

We agree that it is important that the Bill contains measures to tackle disinformation and misinformation that may emerge during serious information incidents, but the Bill already contains measures to address those, including the powers vested in the Secretary of State under clause 146, which, when debated, provoked some controversy. Under that clause, the Secretary of State will have the power to direct Ofcom when exercising its media literacy functions in the context of an issue of public health or safety or national security.

Moreover, Ofcom will be able to require platforms to issue a public statement about the steps they are taking to respond to a threat to public health or safety or to national security. As we discussed, it is appropriate that the Secretary of State will make those directions, given that the Government have the access to intelligence around national security and the relevant health information. Ofcom, as a telecoms regulator, obviously does not have access to that information, hence the need for the Secretary of State’s involvement.

--- Later in debate ---
Kirsty Blackman

Given that clarification, I will not press the new clause. The Minister has made the case strongly enough and has clarified clause 85(1) to my satisfaction. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Clause 23

Priority illegal content: violence against women and girls

“(1) For the purposes of this Act, any provision applied to priority illegal content should also be applied to any content which—

(a) constitutes,

(b) encourages, or

(c) promotes

violence against women or girls.

(2) ‘Violence against women and girls’ is defined by Article 3 of the Council of Europe Convention on Preventing Violence Against Women and Domestic Violence (‘the Istanbul Convention’).” —(Alex Davies-Jones.)

This new clause applies the provisions applicable to priority illegal content to content which constitutes, encourages or promotes violence against women and girls.

Brought up, and read the First time.

Alex Davies-Jones

I beg to move, That the clause be read a Second time.

This new clause would apply provisions applied to priority illegal content also to content that constitutes, encourages or promotes violence against women and girls. As it stands, the Bill is failing women and girls. In an attempt to tackle that alarming gap, the new clause uses the Istanbul convention definition of VAWG, given that the Home Secretary has so recently agreed to ratify the convention—just a decade after it was signed.

The Minister might also be aware that GREVIO—the Group of Experts on Action against Violence against Women and Domestic Violence—which monitors the implementation of the Istanbul convention, published a report in October 2021 on the digital dimension of violence against women and girls. It stated that domestic laws are failing to place the abuse of women and girls online

“in the context of a continuum of violence against women that women and girls are exposed to in all spheres of life, including in the digital sphere.”

The purpose of naming VAWG in the Bill is to require tech companies to be responsible for preventing and addressing VAWG as a whole, rather than limiting their obligations only to specific criminal offences listed in schedule 7 and other illegal content. It is also important to note that the schedule 7 priority list was decided on without any consultation with the VAWG sector. Naming violence against women and girls will also ensure that tech companies are held to account for addressing emerging forms of online hate, which legislation is often unable to keep up with.

We only need to consider accounts from survivors of online violence against women and girls, as outlined in “VAWG Principles for the Online Safety Bill”, published in September last year, to really see the profound impact that the issue is having on people’s lives. Ellesha, a survivor of image-based sexual abuse, was a victim of voyeurism at the hands of her ex-partner. She was filmed without her consent and was later notified by someone else that he had uploaded videos of her to Pornhub. She recently spoke at an event that I contributed to—I believe the right hon. Member for Basingstoke and others also did—on the launch of the “Violence Against Women and Girls Code of Practice”. I am sure we will come to that code of practice more specifically on Report. Her account was genuinely difficult to listen to.

This is an issue that Ellesha, with the support of EVAW, Glitch, and a huge range of other organisations, has campaigned on for some time. She says:

“Going through all of this has had a profound impact on my life. I will never have the ability to trust people in the same way and will always second guess their intentions towards me. My self confidence is at an all time low and although I have put a brave face on throughout this, it has had a detrimental effect on my mental health.”

Ellesha was informed by the police that they could not access the websites where her ex-partner had uploaded the videos, so she was forced to spend an immense amount of time trawling through all of the videos uploaded to simply identify herself. I can only imagine how distressing that must have been for her.

Pornhub’s response to the police inquiries was very vague in the first instance, and it later ignored every piece of following correspondence. Eventually the videos were taken down, likely by the ex-partner himself when he was released from the police station. Ellesha was told that Pornhub had only six moderators at the time—just six for the entire website—and it and her ex-partner ultimately got away with allowing the damaging content to remain, even though the account was under his name and easily traced back to his IP address. That just is not good enough, and the Minister must surely recognise that the Bill fails women in its current form.

If the Minister needs any further impetus to genuinely consider the amendment, I point him to a BBC report from last week that highlighted how much obscene material of women and girls is shared online without their consent. The BBC’s Angus Crawford investigated Facebook accounts and groups that were seen to be posting pictures and videos of upskirting. Naturally, Meta—Facebook’s owner—said that it had a grip on the problem and that those accounts and groups had all been removed, yet the BBC was able to find thousands of users sharing material. Indeed, one man who posted videos of himself stalking schoolgirls in New York is now being investigated by the police. This is the reality of the internet; it can be a powerful, creative tool for good, but far too often it seeks to do the complete opposite.

I hate to make this a gendered argument, but there is a genuine difference between the experiences of men and women online. Last week the Minister came close to admitting that when I queried whether he had ever received an unsolicited indecent picture. I am struggling to understand why he has failed to consider these issues in a Bill proposed by his Department.

The steps that the Government are taking to tackle violence against women and girls offline are broadly to be commended, and I welcome a lot of the initiatives. The Minister must see sense and do the right thing by also addressing the harms faced online. We have a genuine opportunity in the Bill to prevent violence against women and girls online, or at least to diminish some of the harms they face. Will he please do the right thing?

Chris Philp

The shadow Minister is right to raise the issue of women and girls being disproportionately—one might say overwhelmingly—the victims of certain kinds of abuse online. We heard my right hon. Friend the Member for Basingstoke, the shadow Minister and others set that out in a previous debate. The shadow Minister is right to raise the issue.

Tackling violence against women and girls has been a long-standing priority of the Government. Indeed, a number of important new offences have already been and are being created, with protecting women principally in mind—the offence of controlling or coercive behaviour, set out in the Serious Crime Act 2015 and amended in the Domestic Abuse Act 2021; the creation of a new stalking offence in 2012; a revenge porn offence in 2015; and an upskirting offence in 2019. All of those offences are clearly designed principally to protect women and girls, who are overwhelmingly the victims of those offences. Indeed, the cyber-flashing offence created by clause 156—the first time we have ever had such an offence in this jurisdiction—will, again, overwhelmingly benefit women and girls who are the victims of that offence.

All of the criminal offences I have mentioned—even if they are not mentioned in schedule 7, which I will come to in a moment—will automatically flow into the Bill via the provisions of clause 52(4)(d). Criminal offences where the victim is an individual, which these clearly all are, automatically flow into the provisions of the Bill, including the offences I just listed, which have been created particularly with women in mind.

--- Later in debate ---
Chris Philp

I hope I have made very clear in everything I have said, which I do not propose to repeat, that the way the Bill operates, in several different areas, and the way the criminal law has been constructed over the past 10 years, building on the work of previous Governments, is that it is designed to make sure that the crimes committed overwhelmingly against women and girls are prioritised. I think the Bill does achieve the objective of providing that protection, which every member of this Committee wishes to see delivered. I have gone through it in some detail. It is woven throughout the fabric of the Bill, in multiple places. The objective of new clause 23 is more than delivered.

In conclusion, we will be publishing a list of harms, including priority harms for children and adults, which will then be legislated for in secondary legislation. The list will be constructed with the vulnerability of women and girls particularly in mind. When Committee members see that list, they will find it reassuring on this topic. I respectfully resist the new clause, because the Bill is already incredibly strong in this important area as it has been constructed.

Alex Davies-Jones

The Bill is strong, but it could be stronger. It could be, and should be, a world-leading piece of legislation. We want it to be world-leading and we feel that new clause 23 would go some way to achieving that aim. We have cross-party support for tackling violence against women and girls online. Placing it on the face of the Bill would put it at the core of the Bill—at its heart—which is what we all want to achieve. With that in mind, I wish to press the new clause to a vote.

Question put, That the clause be read a Second time.

--- Later in debate ---
Brought up, and read the First time.
Alex Davies-Jones

I beg to move, That the clause be read a Second time.

This new clause would require the Secretary of State to publish and lay before Parliament a report on the harms caused to users by synthetic media content, also known as deepfakes. The report must contain particular reference to the harms caused to those working in the entertainment industry.

The Government define artificial intelligence as

“technologies with the ability to perform tasks that would otherwise require human intelligence, such as visual perception, speech recognition, and language translation”.

That kind of technology has advanced rapidly in recent years, and commercial AI companies can be found across all areas of the entertainment industries, including voice, modelling, music, dance, journalism and gaming—the list goes on.

One key area of development is AI-made performance synthetisation, which is the process of creating a synthetic performance. That has a wide range of applications, including automated audiobooks, interactive digital avatars and “deepfake” technology, which often, sadly, has more sinister implications. Innovation for the entertainment industry is welcome and, when used ethically and responsibly, can have various benefits. For example, AI systems can create vital sources of income for performers and creative workers. From an equalities perspective, it can be used to increase accessibility for disabled workers.

However, deepfake technology has received significant attention globally due to its often-malicious application. Deepfakes have been defined as,

“realistic digital forgeries of videos or audio created with cutting-edge machine learning techniques.”

An amalgamation of artificial intelligence, falsification and automation, deepfakes use deep learning to replicate the likeness and actions of real people. Over the past few years, deepfake technology has become increasingly sophisticated and accessible. Various apps can be downloaded for free, or a low cost, to utilise deepfake technology.

Deepfakes can cause short-term and long-term social harms to individuals working in the entertainment industry, and to society more broadly. Currently, deepfakes are mostly used in pornography, inflicting emotional and reputational damage, and in some cases violence towards the individual—mainly women. The US entertainment union, the Screen Actors Guild, estimates that 96% of deepfakes are pornographic and depict women, and 99% of deepfake subjects are from the entertainment industry.

However, deepfakes used without consent pose a threat in other key areas. For example, deepfake technology has the power to alter the democratic discourse. False information about institutions, policies, and public leaders, powered by a deepfake, can be exploited to spin information and manipulate belief. For example, deepfakes have the potential to sabotage the image and reputation of a political candidate and may alter the course of an election. They could be used to impersonate the identities of business leaders and executives to facilitate fraud, and also have the potential to accelerate the already declining trust in the media.

Alongside the challenges presented by deepfakes, there are issues around consent for performers and creative workers. In a famous case, the Canadian voiceover artist Bev Standing won a settlement after TikTok synthesised her voice without her consent and used it for its first ever text-to-speech voice function. Many artists in the UK are also having their image, voice or likeness used without their permission. AI systems have also started to replace jobs for skilled professional performers because using them is often perceived to be a cheaper and more convenient way of doing things.

Audio artists are particularly concerned by the development of digital voice technology for automated audiobooks, using the same technology used for digital voice assistants such as Siri and Alexa. It is estimated that within one or two years, high-end synthetic voices will have reached human levels. Equity recently conducted a survey on this topic, which found that 65% of performers responding thought that the development of AI technology poses a threat to employment opportunities in the performing arts sector. That figure rose to 93% for audio artists. Pay is another key issue; it is common for artists to not be compensated fairly, and sometimes not be paid at all, when engaging with AI. Many artists have also been asked to sign non-disclosure agreements without being provided with the full information about the job they are taking part in.

Government policy making is non-existent in this space. In September 2021 the Government published their national AI strategy, outlining a 10-year plan to make Britain a global AI superpower. In line with that strategy, the Government have delivered two separate consultations looking at our intellectual property system in relation to AI.

The Chair

Order. I am sorry, but I must interrupt the hon. Lady to adjourn the sitting until this afternoon, when Ms Rees will be in the Chair.

Before we leave the room, my understanding is that it is hoped that the Bill will report this afternoon. That is a matter for the usual channels; it is nothing to do with the Chair. However, of course, it is an open-ended session, so if you are getting close to the mark, you may choose to go on. If that poses a problem for Ms Rees, I am prepared to take the Chair again to see it through if we have to. On the assumption that I do not, thank you all very much indeed for the courtesy you have shown throughout this session, which has been exemplary. I also thank the staff; thank you very much.