Online Safety Bill Debate
Lord Parkinson of Whitley Bay (Conservative - Life peer)
Department for Digital, Culture, Media & Sport
Lords Chamber
My Lords, the Government are supportive of improving data sharing and encouraging greater collaboration between companies and researchers, subject to the appropriate safeguards. However, the data that companies hold about users can, of course, be sensitive; as such, mandating access to data that are not publicly available would be a complex matter, as noble Lords noted in their contributions. The issue must be fully thought through to ensure that the risks have been considered appropriately. I am grateful for the consideration that the Committee has given this matter.
It is because of this complexity that we have given Ofcom the task of undertaking a report on researchers’ access to information. Ofcom will conduct an in-depth assessment of how researchers can currently access data. To the point raised by the noble Lord, Lord Knight, and my noble friend Lord Bethell, let me provide reassurance that Ofcom will assess the impact of platforms’ policies that restrict access to data in this report, including where companies charge for such access. The report will also cover the challenges that constrain access to data and how such challenges might be addressed. These insights will provide an evidence base for any guidance that Ofcom may issue to help improve data access for researchers in a safe and secure way.
Amendments 230 and 231 seek to require Ofcom to publish a report into researchers’ access to data more rapidly than within the currently proposed two years. I share noble Lords’ desire to develop the evidence base on this issue swiftly, but care must be taken to balance Ofcom’s timelines to ensure that it can deliver its key priorities in establishing the core parts of the regulatory framework that the Bill will bring in; for example, the illegal content and child safety duties. Implementing these duties must be the immediate priority for Ofcom to ensure that the Bill meets its objective of protecting people from harm. It is crucial that we do not divert attention away from these areas and that we allow Ofcom to carry out this work as soon as is practicable.
Further to this, considering the complex matter of researchers’ access to data will involve consultation with interested parties, such as the Information Commissioner’s Office, the Centre for Data Ethics and Innovation, UK Research and Innovation, representatives of regulated services and others—including some of those parties mentioned by noble Lords today—as set out in Clause 146(3). This is an extremely important issue that we need to get right. Ofcom must be given adequate time to consult as it sees necessary and undertake the appropriate research.
Before the Minister succeeds in disappointing us, can he clarify something for us? Once Ofcom has published the report, it has the power to issue guidance. What requirement is there for platforms to abide by that guidance? We want there to be some teeth at the end of all this. There is a concern that a report will be issued, followed by some guidance, but that nothing much else will happen.
It is guidance rather than direction, but it will be done openly and transparently. Users will be able to see the guidance which Ofcom has issued, to see whether companies have responded to it as they see fit and, through the rest of the framework of the Bill, be empowered to make their decisions about their experiences online. This being done openly and transparently, and informed by Ofcom’s research, will mean that everyone is better informed.
We are sympathetic to the amendment. It is complex, and this has been a useful debate—
I wonder whether the Minister has an answer to the academic community, who now see their European colleagues getting ahead through being able to access data through other legislation in other parts of the world. Also, we have a lot of faith in Ofcom, but it seems a mistake to let it be the only arbiter of what needs to be seen.
We are very aware that we are not the only jurisdiction looking at the important issues the Bill addresses. The Government and, I am sure, academic researchers will observe the implementation of the European Union’s Digital Services Act with interest, including the provisions about researchers’ access. We will carefully consider any implications for our own online safety regime. As noble Lords know, the Secretary of State will be required to undertake a review of the framework between two and five years after the Bill comes into force. We expect that to include an assessment of how the Bill’s existing transparency provisions facilitate researcher access.
I do not expect the Minister to have an answer to this today, but it will be useful to get this on the record as it is quite important. Can he let us know the Government’s thinking on the other piece of the equation? We are getting the platforms to disclose the data, and an important regulatory element is the research organisations that receive it. In the EU, that is being addressed with a code of conduct, which is a mechanism enabled by the general data protection regulation that has been approved by the European Data Protection Board and creates this legal framework. I am not aware of equivalent work having been done in the UK, but that is an essential element. We do not want to find that we have the teeth to persuade the companies to disclose the data, but not the other piece we need—probably overseen by the Information Commissioner’s Office rather than Ofcom—which is a mechanism for approving researchers to receive and then use the data.
We are watching with interest what is happening in other jurisdictions. If I can furnish the Committee with any information in the area the noble Lord mentions, I will certainly follow up in writing.
I have a question, in that case, in respect of the jurisdictions. Why should we have weaker powers for our regulator than others?
I do not think that we do. We are doing things differently. Of course, Ofcom will be looking at all these matters in its report, and I am sure that Parliament will have an ongoing interest in them. As jurisdictions around the world continue to grapple with these issues, I am sure that your Lordships’ House and Parliament more broadly will want to take note of those developments.
But surely, there is no backstop power. There is the review but there is no backstop which would come into effect on an Ofcom recommendation, is there?
We will know once Ofcom has completed its research and examination of these complex issues; we would not want to pre-judge its conclusions.
With that, if there are no further questions, I invite the noble Lord to withdraw his amendment.
My Lords, this was a short but important debate with some interesting exchanges at the end. The noble Baroness, Lady Harding, mentioned the rapidly changing environment generated by generative AI. That points to the need for independent, ecosystem-level research wider than we fear we might get as things stand, and certainly wider than the work of the skilled persons for whom we are already legislating. The noble Lord, Lord Bethell, referred to the access that advertisers already have to insight. It seems a shame that we run the risk, as the noble Baroness, Lady Kidron, pointed out, of researchers in other jurisdictions having more privileged access than researchers in this country, and of our therefore becoming dependent on those researchers and on whistleblowers to give us that wider view. We could proceed with a report and guidance as set out in the Bill but add in some reserved powers in order to take action if the report suggests that Ofcom might need and want that. The Minister may want to reflect on that, having listened to the debate. On that basis, I am happy to beg leave to withdraw the amendment.
My Lords, this has been a broad and mixed group of amendments. I will be moving the amendments in my name, which are part of it. These introduce the new offence of encouraging or assisting serious self-harm and make technical changes to the communications offences. If there can be a statement covering the group and the debate we have had, which I agree has been well informed and useful, it is that this Bill will modernise criminal law for communications online and offline. The new offences will criminalise the most damaging communications while protecting freedom of expression.
Amendments 264A, 266 and 267, tabled by the noble Lord, Lord Clement-Jones, and my noble friend Lady Buscombe, would expand the scope of the false communications offence to add identity theft and financial harm to third parties. I am very grateful to them for raising these issues, and in particular to my noble friend Lady Buscombe for raising the importance of financial harm from fake reviews. This will be addressed through the Digital Markets, Competition and Consumers Bill, which was recently introduced to Parliament. That Bill proposes new powers to address fake and misleading reviews. This will provide greater legal clarity to businesses and consumers. Where fake reviews are posted, it will allow the regulator to take action quickly. The noble Baroness is right to point out the specific scenarios about which she has concern. I hope she will look at that Bill and return to this issue in that context if she feels it does not address her points to her satisfaction.
Identity theft is dealt with by the Fraud Act 2006, which captures those using false identities for their own benefit. It also covers people selling or using stolen personal information, such as banking information and national insurance numbers. Adding identity theft to the communications offences here would duplicate existing law and expand the scope of the offences too broadly. Identity theft, as the noble Lord, Lord Clement-Jones, noted, is better covered by targeted offences rather than communications offences designed to protect victims from psychological and physical harm. The Fraud Act is more targeted and therefore more appropriate for tackling these issues. If we were to add identity theft to Clause 160, we would risk creating confusion for the courts when interpreting the law in these areas—so I hope the noble Lord will be inclined to side with clarity and simplicity.
Amendment 265, tabled by my noble friend Lord Moylan, gives me a second chance to consider his concerns about Clause 160. The Government believe that the clause is necessary and that the threshold of harm strikes the right balance, robustly protecting victims of false communications while maintaining people’s freedom of expression. Removing “psychological” harm from Clause 160 would make the offence too narrow and risk excluding communications that can have a lasting and serious effect on people’s mental well-being.
But psychological harm is only one aspect of Clause 160; all elements of the offence must be met. This includes a person sending a knowingly false message with an intention to cause non-trivial harm, and without reasonable excuse. The threshold of harm was also tested extensively as part of the Law Commission’s report, Modernising Communications Offences. The offence thus sets a high bar for prosecution: a person cannot be prosecuted solely on the basis of a message causing psychological harm.
The noble Lord, Lord Allan, rightly recalled Section 127 of the Communications Act and the importance of probing issues such as this. I am glad he mentioned the Twitter joke trial—a good friend of mine acted as junior counsel in that case, so I remember it well. I shall spare the blushes of the noble Baroness, Lady Merron, in recalling who the Director of Public Prosecutions was at the time. But it is important that we look at these issues, and I am happy to speak further with my noble friend Lord Moylan and the noble Baroness, Lady Fox, about this and their broader concerns about freedom of expression between now and Report, if they would welcome that.
My noble friend Lord Moylan said that it would be unusual, or novel, to criminalise lying. The offence of fraud by false representation already makes it an offence dishonestly to make a false representation—to breach the ninth commandment—with the intention of making a gain or causing someone else a loss. So, as my noble and learned friend Lord Garnier pointed out, there is a precedent for lies with malicious and harmful intent being criminalised.
Amendments 267AA, 267AB and 268, tabled by my noble friend Lady Buscombe and the noble Baroness, Lady Kennedy of The Shaws, take the opposite approach to those I have just discussed, as they significantly lower and expand the threshold of harm in the false and threatening communications offences. The first of these would specify that a threatening communications offence is committed even if someone encountering the message did not fear that the sender specifically would carry out the threat. I am grateful to the noble Baroness for her correspondence on this issue, informed by her work in Scotland. The test here is not whether a message makes a direct threat but whether it conveys a threat—which can certainly cover indirect or implied threats.
I reassure the noble Baroness and other noble Lords that Clause 162 already captures threats of “death or serious harm”, including rape and disfigurement, as well as messages that convey a threat of serious harm, including rape and death threats, or threats of serious injury amounting to grievous bodily harm. If a sender has the relevant intention or recklessness, the message will meet the required threshold. But I was grateful to see my right honourable friend Edward Argar watching our debates earlier, in his capacity as Justice Minister. I mentioned the matter to him and will ensure that his officials have the opportunity to speak to officials in Scotland to look at the work being done with regard to Scots law, and to follow the points that the noble Baroness, Lady Bennett, made about pictures—
I am grateful to the Minister. I was not imagining that the formulations that I played with fulfilled all of the requirements. Of course, as a practising lawyer, I am anxious that we do not diminish standards. I thank the noble Baroness, Lady Fox, for raising concerns about freedom of speech, but this is not about telling people that they are unattractive or ugly, which is hurtful enough to many women and can have very deleterious effects on their self-confidence and willingness to be public figures. Actually, I put the bar reasonably high in describing the acts that I was talking about: threats that somebody would kill, rape, bugger or disfigure you, or do whatever to you. That was the shocking thing: the evidence showed that it was often at that high level. It is happening not just to well-known public figures, who can become somewhat inured to this because they can find a way to deal with it; it is happening to schoolgirls and young women in universities, who get these pile-ons as well. We should reckon with the fact that it is happening on a much wider basis than many people understand.
Yes, we will ensure that, in looking at this in the context of Scots law, we have the opportunity to see what is being done there and that we are satisfied that all the scenarios are covered. In relation to the noble Baroness’s Amendment 268, the intentional encouragement or assistance of a criminal offence is already captured under Sections 44 to 46 of the Serious Crime Act 2007, so I hope that that satisfies her that that element is covered—but we will certainly look at all of this.
I turn to government Amendment 268AZA, which introduces the new serious self-harm offence, and Amendments 268AZB and 268AZC, tabled by the noble Lords, Lord Allan and Lord Clement-Jones. The Government recognise that there is a gap in the law in relation to the encouragement of non-fatal self-harm. The new offence will apply to anyone carrying out an act which intends to, and is capable of, encouraging or assisting another person seriously to self-harm by means of verbal or electronic communications, publications or correspondence.
I say to the noble Baroness, Lady Finlay of Llandaff, that the new clause inserted by Amendment 268AZA is clear that, when a person sends or publishes such a communication, that is an offence, and that, when a person forwards on another person’s communication, that will be an offence too. The new offence will capture only the most serious behaviour and avoid criminalising vulnerable people who share their experiences of self-harm. The preparation of these clauses was informed by extensive consultation with interested groups and campaign bodies. The new offence includes two key elements that constrain the offence to the most culpable offending; namely, that a person’s act must be intended to encourage or assist the serious self-harm of another person and that serious self-harm should amount to grievous bodily harm. If a person does not intend to encourage or assist serious self-harm, as will likely be the case with recovery and supportive material, no offence will be committed. The Law Commission looked at this issue carefully, following evidence from the Samaritans and others, and the implementation will be informed by an ongoing consultation as well.
I am sorry to interrupt the Minister, but the Law Commission recommended that the DPP’s consent should be required. The case that the Minister has made on previous occasions in some of the consultations that he has had with us is that this offence that the Government have proposed is different from the Law Commission one, and that is why they have not included the DPP’s consent. I am rather baffled by that, because the Law Commission was talking about a high threshold in the first place, and the Minister is talking about a high threshold of intent. Even if he cannot do so now, it would be extremely helpful to tie that down. As the noble Baroness and my noble friend said, 130 organisations are really concerned about the impact of this.
The Law Commission recommended that the consent, but not the personal consent, of the Director of Public Prosecutions should be required. We believe, however, that, because the offence already has tight parameters due to the requirement for an intention to cause serious self-harm amounting to grievous bodily harm, as I have just outlined, the additional safeguard of obtaining the consent of the Director of Public Prosecutions is not necessary. We would expect the usual prosecutorial discretion and guidance to provide sufficient safeguards against inappropriate prosecutions in this area. As I say, we will continue to engage with those groups that have helped to inform the drafting of these clauses as they are implemented to make sure that that assessment is indeed borne out.
I will follow up in writing on that point.
Before I conclude, I will mention briefly the further government amendments in my name, which make technical and consequential amendments to ensure that the communications offences, including the self-harm offence, have the appropriate territorial extent. They also set out the respective penalties for the communications offences in Northern Ireland, alongside a minor adjustment to the epilepsy trolling offence, to ensure that its description is more accurate.
I hope that noble Lords will agree that the new criminal laws that we will make through this Bill are a marked improvement on the status quo. I hope that they will continue to support the government amendments. I express my gratitude to the Law Commission and to all noble Lords—
Just before the Minister sits down—I assume that he has finished his brief on the self-harm amendments; I have been waiting—I have two questions relating to what he said. First, if I heard him right, he said that the person forwarding on is also committing an offence. Does that also apply to those who set up algorithms that disseminate, as opposed to one individual forwarding on to another individual? Those are two very different scenarios. We can see how one individual forwarding to another could be quite targeted and malicious, and we can see how disseminating through an algorithm could have very widespread harms across a lot of people in a lot of different groups—all types of groups—but I am not clear from what he said that that has been caught in his wording.
Secondly—I will ask both questions while I can—I asked the Minister previously why there have been no prosecutions under the Suicide Act. I understood from officials that this amendment creating an offence was to reflect the Suicide Act and that suicide was not included in the Bill because it was already covered as an offence by the Suicide Act. Yet there have been no prosecutions and we have had deaths, so I do not quite understand why I have not had an answer to that.
I will have to write on the second point to try to set that out in further detail. On the question of algorithms, the brief answer is no, algorithms would not be covered in the way a person forwarding on a communication is covered unless the algorithm has been developed with the intention of causing serious self-harm; it is the intention that is part of the test. If somebody creates an algorithm intending people to self-harm, that could be captured, but if it is an algorithm generally passing it on without that specific intention, it may not be. I am happy to write to the noble Baroness further on this, because it is a good question but quite a technical one.
It needs to be addressed, because these very small websites already alluded to are providing some extremely nasty stuff. They are not providing support to people and helping decrease the amount of harm to those self-harming but seem to be enjoying the spectacle of it. We need to differentiate and make sure that we do not inadvertently let one group get away with disseminating very harmful material simply because it has a small website somewhere else. I hope that will be included in the Minister’s letter; I do not expect him to reply now.
Some of us are slightly disappointed that my noble friend did not respond to my point on the interaction of Clause 160 with the illegal content duty. Essentially, what appears to be creating a criminal offence could simply be a channel for hyperactive censorship on the part of the platforms to prevent the criminal offence taking place. He has not explained that interaction. He may say that there is no interaction and that we would not expect the platforms to take any action against offences under Clause 160, or that we expect a large amount of action, but nothing was said.
If my noble friend will forgive me, I had better refresh my memory of what he said—it was some time ago—and follow up in writing.
My Lords, I will be extremely brief. There is much to chew on in the Minister’s speech and this was a very useful debate. Some of us will be happier than others; the noble Baroness, Lady Buscombe, will no doubt look forward to the digital markets Bill and I will just have to keep pressing the Minister on the Data Protection and Digital Information Bill.
There is a fundamental misunderstanding about digital identity theft. It will not necessarily always be fraud that is demonstrated—the very theft of the identity is designed to be the crime, and it is not covered by the Fraud Act 2006. I am delighted that the Minister has agreed to talk further with the noble Baroness, Lady Kennedy, because that is a really important area. I am not sure that my noble friend will be that happy with the response, but he will no doubt follow up with the Minister on his amendments.
The Minister made a very clear statement on the substantive aspect of the group, the new crime of encouraging self-harm, but further clarification is still needed. We will look very carefully at what he said in relation to what the Law Commission recommended, because it is really important that we get this right. I know that the Minister will talk further with the noble Baroness, Lady Finlay, who is very well versed in this area. In the meantime, I beg leave to withdraw my amendment.
My Lords, this is a real hit-and-run operation from the noble Lord, Lord Stevenson. He has put down an amendment on my favourite subject in the last knockings of the Bill. It is totally impossible to deal with this now—I have been thinking and talking about the whole area of AI governance and ethics for the past seven years—so I am not going to try. It is important, and the advisory committee under Clause 139 should take it into account. Actually, this is much more a question of authenticity and verification than of content. Trying to work out whether something is ChatGPT or GPT-4 content is a hopeless task; you are much more likely to be able to identify whether these are automated users such as chatbots than you are to know about the content itself.
I will leave it there. I missed the future-proofing debate, which I would have loved to have been part of. I look forward to further debates with the noble Viscount, Lord Camrose, on the deficiencies in the White Paper and to the Prime Minister’s much more muscular approach to AI regulation in future.
I am sure that the noble Lord, Lord Stevenson of Balmacara, is smiling over a sherry somewhere about the debate he has facilitated. His is a useful probing amendment and we have had a useful discussion.
The Government certainly recognise the potential challenges posed by artificial intelligence and digitally manipulated content such as deepfakes. As we have heard in previous debates, the Bill ensures that machine-generated content on user-to-user services created by automated tools or machine bots will be regulated where appropriate. Clause 49(4)(b) means that machine-generated content is regulated unless the bot or automated tool producing the content is controlled by the provider of the service.
The labelling of this content via draft legislation is not something to which I can commit today. The Government’s AI regulation White Paper sets out the principles for the responsible development of artificial intelligence in the UK. These principles, such as safety, transparency and accountability, are at the heart of our approach to ensuring the responsible development and use of AI. As set out in the White Paper, we are building an agile approach that is designed to be adaptable in response to emerging developments. We do not wish to introduce a rigid, inflexible form of legislation for what is a flexible and fast-moving technology.
The public consultation on these proposals closed yesterday so I cannot pre-empt our response to it. The Government’s response will provide an update. I am joined on the Front Bench by the Minister for Artificial Intelligence and Intellectual Property, who is happy to meet with the noble Baroness, Lady Kidron, and others before the next stage of the Bill if they wish.
Beyond labelling such content, I can say a bit to make it clear how the Bill will address the risks coming from machine-generated content. The Bill already deals with many of the most serious and illegal forms of manipulated media, including deepfakes, when they fall within scope of services’ safety duties regarding illegal content or content that is potentially harmful to children. Ofcom will recommend measures in its code of practice to tackle such content, which could include labelling where appropriate. In addition, the intimate image abuse amendments that the Government will bring forward will make it a criminal offence to send deepfake images.
In addition to ensuring that companies take action to keep users safe online, we are taking steps to empower users with the skills they need to make safer choices through our work on media literacy. Ofcom, for example, has an ambitious programme of work through which it is funding several initiatives to build people’s resilience to harm online, including initiatives designed to equip people with the skills to identify disinformation. We are keen to continue our discussions with noble Lords on media literacy and will keep an open mind on how it might be a tool for raising awareness of the threats of disinformation and inauthentic content.
With gratitude to the noble Lords, Lord Stevenson and Lord Knight, and everyone else, I hope that the noble Lord, Lord Knight, will be content to withdraw his noble friend’s amendment.
My Lords, I am grateful to everyone for that interesting and quick debate. It is occasionally one’s lot that somebody else tables an amendment but is unavoidably detained in Jerez, drinking sherry, and monitoring things in Hansard while I move the amendment. I am perhaps more persuaded than my noble friend might have been by the arguments that have been made.
We will return to this in other fora in response to the need to regulate AI. However, in the meantime, I enjoyed in particular the John Booth quote from the noble Baroness, Lady Bennett. In respect of this Bill and any of the potential harms around generative AI, if we have a Minister who is mindful of the need for safety by design when we have concluded this Bill then we will have dealt with the bits that we needed to deal with as far as this Bill is concerned.
My Lords, what more can I say than that I wish to be associated with the comments made by the noble Baroness and then by the noble Lord, Lord Clement-Jones? I look forward to the Minister’s reply.
I am very grateful to the noble Baroness for her amendment, which is a useful opportunity for us to state publicly and share with the Committee the progress we have been making in our helpful discussions on these issues in relation to these amendments. I am very grateful to her and to my noble friends Lord Bethell and Lady Harding for speaking as one on this, including, as is well illustrated, in this short debate this evening.
As the noble Baroness knows, discussions continue on the precise wording of these definitions. I share her optimism that we will be able to reach agreement on a suitable way forward, and I look forward to working with her, my noble friends and others as we do so.
The Bill already includes a definition of age assurance in Clause 207, which is
“measures designed to estimate or verify the age or age-range of users of a service”.
As we look at these issues, we want to avoid using words such as “checking”, which suggests that providers need to take a proactive approach to checking age, as that may inadvertently preclude the use of technologies which determine age through other means, such as profiling. It is also important that any definition of age assurance does not restrict the current and future use of innovative and accurate technologies. I agree that it is important that there should be robust definitions for terms which are not currently defined in the Bill, such as age verification, and recommit to the discussions we continue to have on what terms need to be defined and the best way to define them.
This has been a very helpful short debate with which to end our deliberations in Committee. I am very grateful to noble Lords for all the points that have been raised over the past 10 days, and I am very glad to be ending in this collaborative spirit. There is much for us still to do, and even more for the Office of the Parliamentary Counsel to do, before we return on Report, and I am grateful to it and to the officials working on the Bill. I urge the noble Baroness to withdraw her amendment.