Committee (6th Day)
15:17
Relevant document: 28th Report from the Delegated Powers Committee
Clause 14: Duties to protect news publisher content
Amendment 50B
Moved by
50B: Clause 14, page 15, line 30, leave out “subsection (2)(a)” and insert “this section”
Member’s explanatory statement
This is a technical amendment to make it clear that clause 14(9), which sets out circumstances which do not count as a provider “taking action” in relation to news publisher content, applies for the purposes of the whole clause.
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, His Majesty’s Government are committed to defending the invaluable role of our free media. We are clear that our online safety legislation must protect the vital role of the press in providing people with reliable and accurate information. That is why this Bill includes strong protections for recognised news publishers. The Bill does not impose new duties on news publishers’ content, which is exempt from the Bill’s safety duties. In addition, the Bill includes strong safeguards for news publisher content, set out in Clause 14. In order to benefit from these protections, publishers will have to meet a set of stringent criteria, set out in Clause 50.

I am aware of concerns in your Lordships’ House and another place that the definition of news publishers is too broad and that these protections could therefore create a loophole to be exploited. That is why the Government are bringing forward amendments to the definition of “recognised news publisher” to ensure that sanctioned entities cannot benefit from these protections. I will shortly explain these protections in detail but I would like to be clear that narrowing the definition any further would pose a critical risk to our commitment to self-regulation of the press. We do not want to create requirements which would in effect put Ofcom in the position of a press regulator. We believe that the criteria set out in Clause 50 are already strong, and we have taken significant care to ensure that established news publishers are captured, while limiting the opportunity for bad actors to benefit. 

Government Amendments 126A and 127A propose changes to the criteria for recognised news publishers. These criteria already exclude any entity that is a proscribed organisation under the Terrorism Act 2000 or the purpose of which is to support a proscribed organisation under that Act. We are clear that sanctioned news outlets such as RT, formerly Russia Today, must not benefit from these protections either. The amendments we are tabling today will therefore tighten the recognised news publisher criteria further by excluding entities that have been designated for sanctions imposed by both His Majesty’s Government and the United Nations Security Council. I hope noble Lords will accept these amendments, in order to ensure that content from publishers which pose a security threat to this country cannot benefit from protections designed to defend a free press.

The Government have also tabled Amendments 50B, 50C, 50D, 127B, 127C and 283A, which are aimed at ensuring that the protections for news publishers in Clause 14 are workable and do not have unforeseen consequences for the operation of category 1 services. Clause 14 gives category 1 platforms a duty to notify recognised news publishers and offer a right of appeal before taking action against any of their content or accounts.

Clause 14 sets out the circumstances in which companies must offer news publishers an appeal. As drafted, it states that platforms must offer this before they take down news publisher content, before they restrict users’ access to such content or where they propose to “take any other action” in relation to publisher content. Platforms must also offer an appeal if they propose to take action against a recognised news publisher’s account by giving them a warning, suspending or banning them from using a service or in any way restricting their ability to use a service.

These amendments provide greater clarity about what constitutes “taking action” in relation to news publisher content, and therefore when category 1 services must offer an appeal. They make it clear that a platform must offer this before they take down such content, add a warning label or take any other action against content in line with any terms of service that allow or prohibit content. This will ensure that platforms are not required to offer publishers a right of appeal every time they propose to carry out routine content curation and similar routine actions. That would be unworkable for platforms and would be likely to inhibit the effectiveness of the appeal process.

As noble Lords know, the Bill has a strong focus on user empowerment and enabling users to take control of their online experience. The Government have therefore tabled amendments to Clause 52 to ensure that providers are required only to offer publishers a right of appeal in relation to their own moderation decisions, not where a user has voluntarily chosen not to view certain types of content. For example, if a user has epilepsy and has opted not to view photo-sensitive content, platforms will not be required to offer publishers a right of appeal before restricting that content for the user in question.

In addition, to ensure that the Bill maintains strong protections for children, the amendments make it clear that platforms are not required to offer news publishers an appeal before applying warning labels to content viewed by children. Platforms would, however, be in breach of the legislation if they applied warning labels to content encountered by adults without first offering news publishers an appeal. I beg to move.

Baroness Stowell of Beeston (Con)

My Lords, I welcome the amendments the Government have tabled, but I ask the Minister to clarify the effect of Amendment 50E. I declare an interest as chair of the Communications and Digital Select Committee, which has discussed Amendment 50E and the labelling of content for children with the news media organisations. This is a very technical issue, but from what my noble friend was just saying, it seems that content that would qualify for labelling for child protection purposes, and which therefore does not qualify for a right of appeal before the content is so labelled, is not content that would normally be encountered by adults but might happen to appeal to children. I would like to be clear that we are not giving the platforms scope for adding labels to content that they ought not to be adding labels to. That aside, as I say, I am grateful to my noble friend for these amendments.

Lord Clement-Jones (LD)

My Lords, like the noble Baroness, Lady Stowell, I have no major objection and support the Government’s amendments. In a sense the Minister got his retaliation in first, because we will have a much more substantial debate on the scope of Clause 14. At this point I welcome any restriction on Clause 14 in the way that the Minister has stated.

Yet to come we have the whole issue of whether a recognised news publisher that is effectively unregulated under the PRP’s arrangements should be entitled to complete freedom in terms of below-the-line content, where there is no moderation and it does not have what qualifies as independent regulation. Some debates are coming down the track and—just kicking the tyres on the Minister’s amendments—I think the noble Baroness, Lady Stowell, made a fair point, which I hope the Minister will answer.

Lord Stevenson of Balmacara (Lab)

My Lords, I thank the Minister for his very clear and precise introduction of these amendments. As the noble Lord, Lord Clement-Jones, said, we will return to some of the underlying issues in future debates. It may be that this is just an aperitif to give us a chance to get our minds around these things, as the noble Baroness, Lady Stowell, said.

It is sometimes a bit difficult to understand exactly what issue is being addressed by some of these amendments. Even trying to say them got us into a bit of trouble. I think I follow the logic of where we are in the amendments that deal with the difference between adult material and children’s material, but it would benefit us all if the Minister could repeat it, perhaps a little slower this time, and we will see if we can agree that that is the way forward.

Broadly speaking, we accept the arrangements. They clarify the circumstances in which the takedown and appeal mechanisms will work. They interface with the question of how the Bill deals with legal but harmful material, particularly for those who might wish not to see such material: they will not be warned about it under any process currently in the Bill, but they will have a toggle to turn to. The arrangements also safeguard children who would not otherwise be covered. That is a fair balance to be struck.

Having said that, we will be returning to this. The noble Lord, Lord Clement-Jones, made the good point that we have a rather ironic situation where a press regulation structure set up and agreed by Parliament is not in operation across the whole of the press, but we do not seem to make any accommodation for that. This is perhaps something we should return to at a later date.

Baroness Fox of Buckley (Non-Afl)

My Lords, I want very briefly to probe something. I may have got the wrong end of the stick, but I want to just ask about the recognised news publishers. The Minister’s explanation about what these amendments are trying to do was very clear, but I have some concerns.

I want to know how this will affect how we understand what a recognised news publisher is in a world in which we have many citizen journalists, blogs and online publications. One of the democratising effects of the internet has been in opening up spaces for marginalised voices, campaign journalism and so on. I am worried that we may inadvertently put them into a category of being not recognised; maybe the Minister can just explain that.

I am also concerned that, because this is an area of some contention, this could be a recipe for all sorts of litigious disputes with platforms about content removal, what constitutes those carve-outs and what is a recognised news, journalism or publishing outlet.

I know we will come on to this, but for now I am opposed to Amendment 127 in this group—or certainly concerned that it is an attempt to coerce publishers into a post-Leveson regulatory structure by denying them the protections that the Bill will give news publishers, unless they sign up in certain ways. I see that as blackmail and bullying, which I am concerned about. Much of the national press and many publishers have refused to join that kind of regulatory regime post Leveson, as is their right; I support them in the name of press freedom. Any comments or clarifications would be helpful.

15:30
Lord Parkinson of Whitley Bay (Con)

My Lords, I am sorry; in my enthusiasm to get this day of Committee off to a swift start, I perhaps rattled through that rather quickly. On Amendment 50E, which my noble friend Lady Stowell asked about, I make clear that platforms will be in breach of their duties if, without applying the protection, they add warning labels to news publishers’ content that they know will be seen by adult users, regardless of whether that content particularly appeals to children.

As the noble Lord, Lord Clement-Jones, and others noted, we will return to some of the underlying principles later on, but the Government have laid these amendments to clarify category 1 platforms’ duties to protect recognised news publishers’ content. They take some publishers out of scope of the protections and make it clearer that category 1 platforms will have to offer news publishers an appeal only before taking punitive actions against their content.

The noble Baroness, Lady Fox, asked about how we define “recognised news publisher”. I am conscious that we will debate this more in later groups, but Clause 50 sets out a range of criteria that an organisation must meet to qualify as a recognised news publisher. These include the organisation’s “principal purpose” being the publication of news, it being subject to a “standards code” and its content being “created by different persons”. The protections for organisations are focused on publishers whose primary purpose is reporting on news and current affairs, recognising the importance of that in a democratic society. I am grateful to noble Lords for their support.

Baroness Stowell of Beeston (Con)

What my noble friend said is absolutely fine with me, and I thank him very much for it. It might be worth letting the noble Baroness, Lady Fox, know that Amendment 127 has now been moved to the group that the noble Lord, Lord Clement-Jones, referred to. I thought it was worth offering that comfort to the noble Baroness.

Amendment 50B agreed.
Amendments 50C to 50E
Moved by
50C: Clause 14, page 15, line 44, leave out subsection (11)
Member’s explanatory statement
This amendment omits a provision about OFCOM’s guidance under clause 171, as that provision is now to be made in clause 171 itself.
50D: Clause 14, page 16, line 3, leave out paragraph (b)
Member’s explanatory statement
This amendment omits the definition of “taking action” in relation to content, as that is now dealt with by the amendment in the Minister’s name below.
50E: Clause 14, page 16, line 10, at end insert—
“(13A) In this section references to “taking action” in relation to content are to—
(a) taking down content,
(b) restricting users’ access to content, or
(c) adding warning labels to content, except warning labels normally encountered only by child users,
and also include references to taking any other action in relation to content on the grounds that it is content of a kind which is the subject of a relevant term of service (but not otherwise).
(13B) A “relevant term of service” means a term of service which indicates to users (in whatever words) that the presence of a particular kind of content, from the time it is generated, uploaded or shared on the service, is not tolerated on the service or is tolerated but liable to result in the provider treating it in a way that makes it less likely that other users will encounter it.”
Member’s explanatory statement
This amendment provides a revised definition of what it means to “take action” in relation to news publisher content, to ensure that the clause only applies to actions other than those set out in subsection (13A)(a), (b) or (c) in the circumstances set out in subsection (13B).
Amendments 50C to 50E agreed.
Clause 14, as amended, agreed.
Clause 15: Duties to protect journalistic content
Amendment 50F
Moved by
50F: Clause 15, page 17, line 14, at end insert—
“(8A) In determining what is proportionate for the purposes of subsection (2), the size and capacity of the provider of a service, in particular, is relevant.”
Member’s explanatory statement
This amendment indicates that the size and capacity of a provider is important in construing the reference to “proportionate systems and processes” in clause 15 (duties to protect journalistic content).
Amendment 50F agreed.
Amendment 51 not moved.
Clause 15, as amended, agreed.
Amendment 52
Moved by
52: After Clause 15, insert the following new Clause—
“Health disinformation and misinformation
(1) This section sets out the duties about harmful health disinformation and misinformation which apply in relation to Category 1 services.
The duties
(2) A duty to carry out and keep up to date a risk assessment of the risks presented by harmful health disinformation and misinformation that is present on the service.
(3) A duty to develop and maintain a policy setting out the service’s approach to the treatment of harmful health disinformation and misinformation on the service.
(4) A duty to explain in the policy how the service’s approach to the treatment of harmful disinformation and misinformation is designed to mitigate or manage any risks identified in the latest risk assessment.
(5) A duty to summarise the policy in the terms of service, and to include provisions in the terms of service about how that content is to be treated on the service.
(6) A duty to ensure that the policy, and any related terms of service, are—
(a) clear and accessible, and
(b) applied consistently.
(7) In this section, “harmful health disinformation and misinformation” means content which contains information which—
(a) is false or misleading in a material respect; and
(b) presents a material risk of significant harm to the health of an appreciable number of persons in the United Kingdom.”
Member’s explanatory statement
This new Clause would introduce a variety of duties on Category 1 platforms, in relation to their treatment of content which represents harmful health misinformation and disinformation.
Baroness Merron (Lab)

My Lords, I shall speak to this group which includes Amendments 52, 99 and 222 in my name. These are complemented by Amendments 223 and 224 in the name of my noble friend Lord Knight. I am grateful to the noble Lords, Lord Clement-Jones and Lord Bethell, and to the noble Baroness, Lady Bennett, for putting their names to the amendments in this group. I am also grateful to the noble Lord, Lord Moylan, for tabling Amendments 59, 107 and 264. I appreciate also the work done by the APPG on Digital Regulation and Responsibility and by Full Fact on this group, as well as on many others in our deliberations.

These amendments would ensure that platforms were required to undertake a health misinformation and disinformation risk assessment. They would also require that they have a clear policy in their terms of service on dealing with harmful, false and misleading health information, and that there are mechanisms to support and monitor this, including through the effective operation of an advisory committee which Ofcom would be required to consult. I appreciate that the Minister may wish to refer to the false communication offence in Clause 160 as a reason why these amendments are not required. In order to pre-empt this suggestion, I put it to him that the provision does not do the job, as it covers only a user sending a knowingly false communication with the intention of causing harm, which does not cover most of the online health misinformation and disinformation about which these amendments are concerned.

Why does all this matter? The stakes are high. False claims about miracle cures, unproven treatments and dangerous remedies can and do spread rapidly, leading people to make the poorest of health decisions, with dire consequences. We do not have to go far back in time to draw on the lessons of our experience. It is therefore disappointing that the Government have not demonstrated, through this Bill, that they have learned the lessons of the Covid-19 pandemic. This is of concern to many health practitioners and representatives, as well as to Members of your Lordships’ House. We all remember the absolute horror of seeing false theories being spread quickly online, threatening to undermine the life-saving vaccine rollout. In recent years, the rising anti-vaccine sentiment has certainly contributed to outbreaks of preventable diseases that had previously been eradicated. This is a step backwards.

In 2020, an estimated 5,800 people globally were admitted to hospital because of false information online relating to Covid-19, with at least 800 people believed to have died because they followed this misinformation or disinformation. In 2021, the Royal College of Obstetricians and Gynaecologists found that only 40% of women offered the vaccine against Covid-19 had accepted it, with many waiting for more evidence that it would be safe. It is shocking to recall that, in October 2021, one in five of the most critically ill Covid patients was an unvaccinated, pregnant woman.

If we look beyond Covid-19, we see misinformation and disinformation affecting many other aspects of health. I will give a few examples. There are false claims about cancer treatment—for example, lemons treat cancer better than chemotherapy; tumours are there to save your life; cannabis oil cures cancer; rubbing hydrogen peroxide on your skin will treat cancer. Just last year, the lack of publicly available information about Mpox fuelled misinformation online. There is an issue about the Government’s responsibility for ensuring that there is publicly available information about health risks. In this respect, the lack of it—the void—led to a varied interpretation and acceptance of the public health information that was available, limited though it was. UNAIDS also expressed concern that public messaging on Mpox used language and imagery that reinforced homophobic and racist stereotypes.

For children, harmful misinformation has linked the nasal flu vaccine to an increase in Strep A infections. In late 2022, nearly half of all parents falsely believed these claims, such that the uptake of the flu vaccine among two- and three-year-olds dropped by around 11%. It is not just that misinformation and disinformation may bombard us online and affect us; there are also opportunities for large language model AIs such as ChatGPT to spread misinformation.

The Government had originally promised to include protections from harmful false health content in their indicative list of harmful content that companies would have been required to address under the now removed adult safety duties, yet we find that the Bill maintains the status quo, whereby platforms are left to their own devices as to how they tackle health misinformation and disinformation, without the appropriate regulatory oversight. It is currently up to them, so they can remove it at scale or leave it completely unchecked, as we recently saw when Twitter stopped enforcing its Covid-19 misinformation policy. This threatens not just people’s health but their freedom of expression and ability to make proper informed decisions. With that in mind, I look forward to amendments relating to media literacy in the next group that the Committee will consider.

I turn to the specific amendments. The new clause proposed in Amendment 52 would place a duty on category 1 platforms to undertake a health misinformation risk assessment and set out a policy on their treatment of health misinformation content. It would also require that the policy and related terms of service are consistently applied and clear and accessible—something that we have previously debated in this Committee. It also defines what is meant by “harmful health disinformation and misinformation”—and, again, on that we have discussed the need for clarity and definition.

Amendment 99 would require Ofcom to consult an advisory committee on disinformation and misinformation when preparing draft codes of practice or amendments to such codes. Amendment 222 is a probing amendment and relates to the steps, if any, that Ofcom will be expected to take to avoid the advisory committee being dominated by representatives of regulated services. It is important to look at how the advisory committee is constructed, as that will be key not just to the confidence that it commands but to its effectiveness.

Amendment 223, in the name of my noble friend Lord Knight, addresses the matter of timeliness in respect of the establishment of the advisory committee, which should be within six months of the Bill being passed. Amendment 224, also in the name of my noble friend Lord Knight, would require the advisory committee to consider as part of its first report whether a dedicated Ofcom code of practice in this area would be effective in the public interest. This would check that we have the right building blocks in place. With that in mind, I beg to move.

Lord Bethell (Con)

My Lords, it is a great honour to rise after the noble Baroness, Lady Merron, who spoke so clearly about Amendment 52 and the group of amendments connected with health misinformation, some of which stand also in my name.

As the noble Baroness rightly pointed out, we have known for a long time about the negative impact of social media, with all its doomscrolling, algorithms and rabbit holes, on vaccine uptake. In 2018, the University of Southampton did a study of pregnant women and found that those who reported using social media to research antenatal vaccinations were 58% less likely to accept the whooping cough vaccine. Since then, things have only got worse.

15:45
As a junior Health Minister during the pandemic, I saw how the successful vaccine rollout was at severe risk of being undermined by misinformation, amplified by foreign actors and monetised by cynical commercial interests. The challenge was enormous. The internet, as we know, is a highly curated environment that pushes content, functions and services that create an emotional response and retain our attention. Social media algorithms are absolutely the perfect tool for conspiracy theorists, and a pandemic necessarily raises everyone’s concerns. It was unsurprising that a lot of people went down various rabbit holes on health information.
The trust between our clinical professionals and their patients relies on a shared commitment to evidence-based science. That can quickly go out of the window if the algorithms are pushing rousing content that deliberately plays into people’s worst fears and anxieties, thereby displacing complex and nuanced analysis with simplistic attention-seeking hooks, based sometimes on complete nonsense. The noble Baroness, Lady Merron, mentioned lemons for cancer as a vivid example of that.
At the beginning of the vaccine programme, a thorough report by King’s College London, funded by the NIHR health protection research unit, found that 14% of British adults believed the real purpose of mass vaccination against coronavirus was to track and control the population. That rose to an astonishing 42% among those who got their information from WhatsApp, 39% for YouTubers, 29% from the Twitterati and 28% from Facebookers. I remember that, when those statistics came through, it put this important way out of the pandemic in jeopardy.
I remind the Committee that a great many people make money out of such fear. I highly recommend the Oxford University Journal of Communication article on digital profiteering for a fulsome and nuanced guide to the economics of the health misinformation industry. I also remind noble Lords that foreign actors and states are causing severe trouble in this area. “Foreign disinformation” social media campaigns are linked to falling vaccination rates, according to an international time-trend analysis published by BMJ Global Health.
As it happens, in the pandemic, the DHSC, the Cabinet Office and a wide group throughout government worked incredibly thoughtfully on a communications strategy that sought to answer people’s questions and apply the sunlight of transparency to the vaccine process. It balanced the rights to freedom of expression with protecting our central strategy for emerging from the pandemic through the vaccine rollout. I express considerable thanks to those officials, and the social media industry, who leant into the issue more out of a sense of good will than any legal obligation. I was aware of some of the legal ambiguities around those times.
Since then, things have gone backwards, not forwards. Hesitancy in the UK has risen, with a big impact on vaccine take-up rates. We are behind on 13 out of the 14 routine vaccine programmes, well behind the 95% target set by the World Health Organization. The results are clear: measles is rising because of vaccine uptake falling, and that is true of many common, avoidable diseases. As for the platforms, Twitter’s recent decision at the end of last year to suddenly stop enforcing its Covid-19 misinformation policy was a retrograde step and possibly the beginning of a worrying trend that we should all be conscious of, and is one of the motivating reasons for this amendment.
Unfortunately, the Government’s decision to remove from the Bill the provisions on content harmful to adults, and with that the scope to include harmful health content, has had unintended consequences and left a big gap. We will have learned nothing from the pandemic if we do not act to plug that gap. The amendment and associated amendments in the group seek to address this by introducing three duties, as the noble Baroness, Lady Merron, explained.
The first requirement is an assessment of the risks presented by harmful health disinformation and misinformation. Anyone who has been listening to these debates will recognise that this very much runs with the grain of the Bill’s approach and is consistent with many of the good things already in the Bill. Risk assessments are a very valuable tool in our approach to misinformation. I remind noble Lords that, for this Bill, “content” has a broad meaning that includes services and functions of a site, including the financial exploitation of that content. Secondly, the amendment would require large platforms to publish a policy setting out their approach to health misinformation. Each policy would have to explain how it is designed to mitigate or manage risks and should be kept up to date and maintained. That kind of transparency is at the heart of how we hold platforms to account. Lastly, platforms would be required to summarise their health misinformation policy in terms that consumers can properly understand.
This approach is consistent with the spirit of the Bill’s treatment of many harms: we are seeking transparency and we are creating accountability, but we are not mandating protocols. The consequences are clear. Users, health researchers and internet analysts would be able to see clearly how a platform proposes to deal with health misinformation that they may encounter on a particular service and make informed decisions as a result. The regulator would be able to see clearly what the nature of these risks is.
May I briefly tackle some natural concerns? On the question of protection of freedom of expression, my noble friend Lord Moylan rightly reminded us on Tuesday of Article 19 of the UN Universal Declaration of Human Rights: everyone has the freedom to express opinions and speech. On this point, I make it clear that this amendment would not require platforms to remove health misinformation from their service or to prescribe particular responses. In fact, I would go further. I recognise that it is important to have a full debate about the efficacy, safety and financial wisdom of treatments, cures and vaccines. This amendment would do nothing to close down that debate. It is about clarity. The purpose of the amendment is to prevent providers ducking the question about how they handle health misinformation. To that extent, it would help both those who are worried about health misinformation and those who are worried about being branded as sharing health misinformation to know where the platforms are coming from. It would ensure that providers establish what is happening on their service, what the associated risks to their users are, and then to shine a light on how they intend to deal with it.
I also make it clear that this is not just about videos, articles and tweets. We should also be considering whether back-end payment mechanisms, including payment intermediaries, donation collection services and storefront support, should be used to monetise health misinformation and enable bad actors. During the pandemic, the platforms endorsed the principle that no company should be profiting from Covid-19 vaccine misinformation, for instance. It is vital that this is considered as part of the platforms’ response to health misinformation. We should have transparency about whether platforms such as PayPal and Google are accepting donations, membership or merchandise payments from known misinformation businesses. Is Amazon, for instance, removing products that are used to disseminate health misinformation? Are crowdfunding websites hosting health misinformation campaigns from bad actors?
To anticipate my noble friend the Minister, I say that he will likely remind us that there are measures already in place in the Bill if the content is criminal or likely to be viewed by children, and I welcome those provisions. However, as the Bill stands, the actual policies on misinformation and the financial exploitation of that content will be a matter of platform discretion, with no clarity for users or the regulator. It will be out of sight of clear regulatory oversight. This is a mistake, just as Twitter has just shown, and that is why we need this change.
Senior clinicians including Sir Jeremy Farrar, Professor John Bell and the noble Lord, Lord Darzi, have written to the Secretary of State to raise their concerns. These are serious players voicing serious concerns. The approach in Amendment 52 is, in my view, the best and most proportionate way to protect those who are most vulnerable to false and misleading information.
Lord Moylan (Con)

My Lords, I shall speak to Amendments 59, 107 and 264 in this group, all of which are in my name. Like the noble Baroness, Lady Merron, I express gratitude to Full Fact for its advice and support in preparing them.

My noble friend Lord Bethell has just reminded us of the very large degree of discretion that the legislation gives to platforms in how they respond to information that we might all agree, or might not agree, is harmful, misinformation or disinformation. We might disagree about what falls into those categories, but we all agree that they exist, and the discretion given to the providers in how to handle such material is large. My amendments do not deal specifically with health-related misinformation or disinformation but are broader.

The first two, Amendments 59 and 107—I am grateful to my noble friend Lord Strathcarron for his support of Amendment 59—try to probe what the Government think platforms should do when harmful material, misinformation and disinformation appear on their platforms. As things stand, the Government require that the platforms should decide what content is not allowed on their platforms; then they should display this in their terms of service; and they should apply a consistent approach in how they manage content that is in breach of their terms of service. The only requirement is for consistency. I have no objection to their being required to behave consistently, but that is the principal requirement.

What Amendments 59 and 107 do—they have similar effects in different parts of the Bill; one directly on the platforms; the other in relation to codes of practice—is require them also to act proportionately. Here, it might be worth articulating briefly the fact that there are two views about platforms and how they respond, both legitimate. One is that some noble Lords may fear that platforms will not respond at all: in other words, they will leave harmful material on their site and will not properly respond.

The other fear, which is what I want to emphasise, is that platforms will be overzealous in removing material, because they will have written their terms of service, as I said on a previous day in Committee, not only for their commercial advantage but also for their legal advantage. They will have wanted to give themselves a wide latitude to remove material, or to close accounts, because that will help cover their backs legally. Of course, once they have granted themselves those powers, the fear is that they will use them overzealously, even in cases where that would be an overreaction. These two amendments seek to oblige the platforms to respond proportionately, to consider alternative approaches to cancellation and removal of accounts and to be obliged to look at those as well.

There are alternative approaches that they could consider. Some companies already set out to promote good information, if you like, and indeed we saw that in the Covid-19 pandemic. My noble friend Lord Bethell said that they did so, and they did so voluntarily. This amendment would not explicitly but implicitly encourage that sort of behaviour as a first resort, rather than cancellation, blocking and removal of material as a first resort. They would still have the powers to cancel, block and remove; it is a question of priority and proportionality.

There are also labels that providers can put on material that they think is dubious, saying, “Be careful before you read this”, or before you retweet it; “This is dubious material”. Those practices should also be encouraged. These amendments are intended to do that, but they are intended, first and foremost, to probe what the Government’s attitude is to this, whether they believe they have any role in giving guidance on this point and how they are going to do so, whether through legislation or in some other way, because many of us would like to know.

Amendment 264, supported by my noble friend Lord Strathcarron and the noble Lord, Lord Clement-Jones, deals with quite a different matter, although it falls under the general category of misinformation and disinformation: the role the Government take directly in seeking to correct misinformation and disinformation on the internet. We know that No. 10 has a unit with this explicit purpose and that during the Covid pandemic it deployed military resources to assist it in doing so. Nothing in this amendment would prevent that continuing; nothing in it is intended to create scare stories in people’s minds about an overweening Government manipulating us. It is intended to bring transparency to that process.

16:00
Amendment 264 requires that once a year, within six months of the enactment of the Bill and annually thereafter, the Government would be required to produce a report setting out relevant representations they had made to providers during that previous year. It specifies the relevant representations: trying to persuade platforms to modify their terms of service, to restrict or remove a particular user’s access or to take down, reduce the visibility of or restrict access to content. The Secretary of State would be required to present a new report to Parliament once a year so that we understood what was happening. As I say, it would not inhibit the Government doing it—there may well be good reasons for their doing so—but in this age people feel entitled to know.
Concerns might be expressed that, in doing so, national security might be compromised in some way because of the involvement of the Army or whatever. However, as drafted, this amendment gives the Secretary of State the power, simply if he considers something to be harmful to national security, not to publish it and to withhold it, so I think no national security argument can be made against this. Instead, he would be required to summarise it in a report to the Intelligence and Security Committee of Parliament. It would not enter the public domain. That is a grown-up thing to ask for. I am sustained in that view by the support for the amendment from at least one opposition spokesman.
Those are the two things I am trying to achieve, which in many ways speak for themselves. I hope my noble friend will feel able to support them.
Baroness Fox of Buckley (Non-Afl)

My Lords, I have given notice in this group that I believe Clause 139 should not stand part of the Bill. I want to remove the idea of Ofcom having any kind of advisory committee on misinformation and disinformation, at least as it has been understood. I welcome the fact that the Government have in general steered clear of putting disinformation and misinformation into the Bill, because the whole narrative around it has become politicised and even weaponised, often to delegitimise opinions that do not fit into a narrow set of official opinions or simply to shout abuse at opponents. We all want the truth—if only it was as simple as hiring fact-checkers or setting up a committee.

I am particularly opposed to Amendment 52 from the noble Baroness, Lady Merron, and the noble Lord, Lord Bethell. They have both spoken very eloquently of their concerns, focusing on harmful health misinformation and disinformation. I oppose it because it precisely illustrates my point about the danger of these terms being used as propaganda.

There was an interesting and important investigative report brought out in January this year by Big Brother Watch entitled Inside Whitehall’s Ministry of Truth—How Secretive “Anti-Misinformation” Teams Conducted Mass Political Monitoring. It was rather a dramatic title. We now know that the DCMS had a counter-disinformation unit that had a special relationship with social media companies, and it used to recommend that content was removed. Interestingly, in relation to other groups we have discussed, it used third-party contractors to trawl through Twitter looking for perceived terms of service violations as a reason for content to be removed. This information warfare tactic, as we might call it, was used to target politicians and high-profile journalists who raised doubts or asked awkward questions about the official pandemic response. Dissenting views were reported to No.10 and then often denounced as misinformation, with Ministers pushing social media platforms to remove posts and promote Government-sponsored lines.

It has been revealed that a similar fake news unit was in the Cabinet Office. It got Whitehall departments to attack newspapers for publishing articles that analysed Covid-19 modelling, not because it was accurate—it was not accurate in many instances—but because it feared that any scepticism would affect compliance with the rules. David Davis MP appeared in an internal report on vaccine hesitancy, and his crime was arguing against vaccine passports as discriminatory, which was a valid civil liberties opposition but was characterised as health misinformation. A similar approach was taken to vaccine mandates, which led to tens of thousands of front-line care workers being sacked even though, by the time this happened, the facts were known: the vaccine was absolutely invaluable in protecting individual health, but it did not stop transmission, so there was no need for vaccine mandates to be implemented. The fact that this was not discussed is a real example of misinformation, but we did not have it in the public sphere.

Professor Carl Heneghan’s Spectator article that questioned whether the rule of six was an arbitrary number was also flagged across Whitehall as misinformation, but we now know that the rule of six was arbitrary. Anyone who has read the former Health Secretary Matt Hancock’s WhatsApp messages, which were leaked to the Telegraph and which many of us read with interest, will know that many things posed as “the science” and factual were driven by politics more than anything else. Covid policies were not all based on fact, yet it was others who were accused of misinformation.

Beyond health, the Twitter files leaked by Elon Musk, when he became its new owner, show the dangers of using the terms misinformation and disinformation to pressure big tech platforms into becoming tools of political censorship. In the run-up to the 2020 election, Joe Biden’s presidential campaign team routinely flagged tweets and accounts it wanted to be censored, and we have all seen the screengrab of email exchanges between executives as evidence of that. Twitter suppressed the New York Post’s infamous Hunter Biden laptop exposé on the spurious grounds that it was “planted Russian misinformation”. The Post was even locked out of its own account. It took 18 months for the Washington Post and the New York Times to get hold of, and investigate, Hunter Biden’s emails, and both determined that the New York Post’s original report was indeed legitimate and factually accurate, but it was suppressed as misinformation when it might have made some political difference in an election.

We might say that all is fair in love and war and elections but, to make us think about what we mean by “misinformation” and why it is not so simple, was the Labour Party attack ad that claimed Rishi Sunak did not believe that paedophiles should go to jail fair comment or disinformation, and who decides? I know that Tobias Ellwood MP called for a cross-party inquiry on the issue, calling on social media platforms to do more to combat “malicious political campaigns”. I am not saying that I have a view one way or another on this, but my question is: in that instance, who gets to label information as “malicious” or “fake” or “misinformation”? Who gets the final say? Is it a black and white issue? How can we avoid it becoming partisan?

Yesterday, at the Second Reading of the Illegal Migration Bill, I listened very carefully to the many contributions. Huge numbers of noble Lords continually claimed that all those in the small boats crossing the channel were fleeing war and persecution—fleeing for their lives. Factually that was inaccurate, according to detailed statistics and evidence, yet no one called those contributors “peddlers of misinformation”, because those speaking are considered to be compassionate and on the righteous side of the angels—at least in the case of the most reverend Primate the Archbishop of Canterbury—and, as defined by this House, they were seen to be saying the truth, regardless of the evidence. My point is that it was a political argument, yet here we are focusing on this notion that the public are being duped by misinformation.

What about those who tell children that there are 140 genders to choose from, or that biological sex is mutable? I would say that is dangerous misinformation or disinformation; others would say that me saying that is bigoted. There is at least an argument to be had, but it illustrates that the labelling process will always be contentious, and therefore I have to ask: who is qualified to decide?

A number of amendments in this group put forward a variety of “experts” who should be, for example, on the advisory committee—those who should decide and those who should not—and I want to look at this notion of expertise in truth. For example, in the report by the Communications and Digital Committee in relation to an incident where Facebook marked as “false” a post on Covid by a professor of evidence-based medicine at Oxford University, the committee asked Facebook about the qualifications of those who made that judgment—of the fact-checkers. It was told that they were

“certified by the International Fact-Checking Network”.

Now, you know, who are they? The professor of evidence-based medicine at Oxford University might have a bit more expertise here, and I do not want a Gradgrind version of truth in relation to facts, and so on.

If it were easy to determine the truth, we would be able to wipe out centuries of philosophy, but if we are going to have a committee determining the truth, could we also have some experts in civil liberties—maybe the Free Speech Union, Big Brother Watch, and the Index on Censorship—on a committee to ensure that we do not take down accurate information under the auspices of “misinformation”? Are private tech companies, or professional fact-checkers, or specially selected experts, best placed to judge the reliability of all sorts of information and of the truth, which I would say requires judgement, analysis and competing perspectives?

Too promiscuous a use of the terms “misinformation” and “disinformation” can also cause problems, and often whole swathes of opinion are lumped together. Those who raised civil liberties objections to lockdown were denounced as “Covidiots”, conspiracy theorists peddling misinformation and Covid deniers, on a par with those who suggested that the virus was linked to everything from 5G masts to a conscious “plandemic”.

Those who now raise queries about suppressing any reference to vaccine harms, or who are concerned that people who have suffered proven vaccine-related harms are not being shown due support, are often lumped in with those who claim the vaccine was a crime against humanity. All are accused of misinformation, with no nuance and no attempt at distinguishing very different perspectives. Therefore, with such wide-ranging views labelled as “misinformation” as a means of censorship, those good intentions can backfire—and I do believe that there are good intentions behind many of these amendments.

16:15
To conclude, banning inaccurate ideas—if they are actually censored as misinformation or disinformation—can push them underground and allow them to fester unchallenged in echo chambers. It can also create martyrs. How often do we hear those who have embraced full-blown conspiracy theories, often peddling cranky and scaremongering theories, say, “They’re trying to silence me because they know that what I’m saying is true. What are they afraid of?” Historically, I think the best solution to bad speech is more speech and more argument; the fullest debate, discussion, scholarship, investigation and research—yes, googling, using Wikipedia or reading the odd book—and, of course, judgment and common sense to figure it out.
We should also remember from our history that what is labelled as false by a minority of people can be invaluable scepticism, challenging a consensus and eventually allowing truth to emerge. The fact—the truth—was once that the world was flat. Luckily, the fact-checkers were not around to ban the minority who challenged that view, and now we know a different truth.
Baroness Bennett of Manor Castle (GP)

My Lords, I have attached my name to Amendments 52 and 99 in the name of the noble Baroness, Lady Merron, respectively signed by the noble Lords, Lord Bethell and Lord Clement-Jones, and Amendment 222 in her name. I entirely agree with what both the noble Baroness, Lady Merron, and the noble Lord, Lord Bethell, said. The noble Lord in particular gave us a huge amount of very well-evidenced information on the damage done during the Covid pandemic—and continuing to be done—by disinformation and misinformation. I will not repeat what they said about the damage done by the spread of conspiracy theories and anti-vaccination falsehoods and the kind of malicious bots, often driven by state actors, that have caused such damage.

I want to come from a different angle. I think we were—until time prevented it, unfortunately—going to hear from the noble Baroness, Lady Finlay of Llandaff, which would have been a valuable contribution to this debate. Her expert medical perspective would have been very useful. I think that she and I were the only two Members in the Committee who took part in the passage of the Medicines and Medical Devices Act. I think it was before the time of the noble Lord, Lord Bethell—he is shaking his head; I apologise. He took part in that as well. I also want to make reference to discussions and debates I had with him over changes to regulations on medical testing.

The additional point I want to make about disinformation and misinformation—this applies in particular to Amendment 222 about the independence of the advisory committee on disinformation and misinformation—is that we are now seeing in our medical system a huge rise in the number of private actors. These are companies seeking to encourage consumers or patients to take tests outside the NHS system and to get involved in a whole set of private provision. We are seeing a huge amount of advertising of foreign medical provision, given the pressures that our NHS is under. In the UK we have had traditionally, and still have, rules that place severe restrictions on the direct advertising of medicines and medical devices to patients— unlike, for example, the United States, where it is very much open slather, with some disastrous and very visible impacts.

We need to think about the fact that the internet, for better or for worse, is now a part of our medical system. If people feel ill, the first place they go—before they call the NHS, visit their pharmacist or whatever—is very often the internet, through these providers. We need to think about this in the round and as part of the medical system. We need to think about how our entire medical ecology is working, and that is why I believe we need amendments like these.

Lord Bethell (Con)

The noble Baroness makes two incredibly important points. We are seeking to give people greater agency over their own health, and the internet has been an enormous bonus in doing that, but of course that environment needs to be curated extremely well. We are also seeking to make use of health tech—non-traditional clinical interventions, some of which do not pierce the skin and therefore fall outside the normal conversation with GPs—and giving people the power to make decisions about the use of these new technologies for themselves. That is why curation of the health information environment is so important. Does the noble Baroness have any reflections on that?

Baroness Bennett of Manor Castle (GP)

I thank the noble Lord for his intervention. He has made me think of the fact that a particular area where this may be of grave concern is cosmetic procedures, which I think we debated during the passage of the Health and Care Act. These things are all interrelated, and it is important that we see them in an interrelated way as part of what is now the health system.

Baroness Kidron (CB)

My Lords, I will speak to a number of amendments in this group. I want to make the point that misinformation and disinformation was probably the issue we struggled with the most in the pre-legislative committee. We recognised the extraordinary harm it did, but also—as the noble Baroness, Lady Fox, said—that there is no one great truth. However, algorithmic spread and the drip, drip, drip of material that is not based on any search criteria or expression of an opinion but simply gives you more of the same, particularly the most shocking, moves very marginal views into the mainstream.

I am concerned that our debates over the last five days have concentrated so much on content, and that the freedom we seek does not take enough account of the way in which companies currently exercise control over the information we see. Correlations such as “Men who like barbecues are also susceptible to conspiracy theories” are then exploited to spread toxic theories that end in real-world harm or political tricks that show, for example, the Democrats as a paedophile group. Only last week I saw a series of pictures, presented as “evidence”, of President Biden caught in a compromising situation that gave truth to that lie. As Maria Ressa, the Nobel Peace Prize winner for her contribution to the freedom of expression, said in her acceptance speech:

“Tech sucked up our personal experiences and data, organized it with artificial intelligence, manipulated us with it, and created behavior at a scale that brought out the worst in humanity”.


That is the background to this set of amendments that we must take seriously.

As the noble Lord, Lord Bethell, said, Amendment 52 will ensure that platforms undertake a health misinformation risk assessment and provide a clear policy on dealing with harmful, false and misleading information. I put it to the Committee that, without this requirement, we will keep the status quo in which clicks are king, not health information.

It is a particular pleasure to support the noble Lord, Lord Moylan, on his Amendments 59 and 107. Like him, I am instinctively against taking material down. There are content-neutral ways of marking or questioning material, offering alternatives and signposting to diverse sources—not only true but diverse. These can break this toxic drip feed for long enough for people to think before they share, post and make personal decisions about the health information that they are receiving.

I am not incredibly thrilled by a committee for every occasion, but since the Bill is silent on the issue of misinformation and disinformation—which clearly will be supercharged by the rise of large language models—it would be good to give a formal role to this advisory committee, so that it can make a meaningful and formal contribution to Ofcom as it develops not only this code of conduct but all codes of conduct.

Likewise, I am very supportive of Amendment 222, which seeks independence for the chair of the advisory body. I have seen at first hand how a combination of regulatory capture and a very litigious sector with deep pockets slows down progress and transparency. While the independence of the chair should be a given, our collective lived experience would suggest otherwise. This amendment would make that requirement clear.

Finally, and in a way most importantly, Amendment 224 would allow Ofcom to consider after the fact whether the code of conduct is necessary. This strikes a balance between adding to its current workload, which we are trying not to do, and tying one hand behind its back in the future. I would be grateful to hear from the Minister why we would not give Ofcom this option as a reasonable piece of future-proofing, given that this issue will be ever more important as AI creates layers of misinformation and disinformation at scale.

Baroness Healy of Primrose Hill (Lab)

My Lords, I support Amendment 52, tabled by my noble friend Lady Merron. This is an important issue which must be addressed in the Bill if we are to make real progress in making the internet a safer space, not just for children but for vulnerable adults.

We have the opportunity to learn lessons from the pandemic, where misinformation had a devastating impact, spreading rapidly online like the virus and threatening to undermine the vaccine rollout. If the Government had kept their earlier promise to include protection from harmful false health content in their indicative list of harmful content that companies would have been required to address under the now removed adult safety duties, these amendments would not be necessary.

It is naive to think that platforms will behave responsibly. Currently, they are left to their own devices in how they tackle health misinformation, without appropriate regulatory oversight. They can remove it at scale or leave it completely unchecked, as illustrated by Twitter’s decision to stop enforcing its Covid-19 misinformation policies, as other noble Lords have pointed out.

It is not a question of maintaining free speech, as some might argue. It was the most vulnerable groups who suffered from the spread of misinformation online—pregnant women and the BAME community, who had higher illness rates. Studies have shown that, proportionately, more of them died, not just because they were front-line workers but because of rumours spread in the community which resulted in vaccine hesitancy, with devastating consequences. As other noble Lords have pointed out, in 2021 the Royal College of Obstetricians and Gynaecologists found that only 42% of women who had been offered the vaccine accepted it, and in October that year one in five of the most critically ill Covid patients were unvaccinated pregnant women. That is a heartbreaking statistic.

Unfortunately, it is not just vaccine fears that are spread on the internet. Other harmful theories can affect patients with cancer, mental health issues and sexual health issues, and, most worryingly, can affect children’s health. Rumours and misinformation play on the minds of the most vulnerable. The Government have a duty to protect people, and by accepting this amendment they would go some way to addressing this.

Platforms must undertake a health misinformation risk assessment and have a clear policy on dealing with harmful, false and misleading health information in their terms of service. They have the money and the expertise to do this, and Parliament must insist. As my noble friend Lady Merron said, I do not think that the Minister can say that the false communications offence in Clause 160 will address the problem, as it covers only a user sending a knowingly false communication with the intention of causing harm. The charity Full Fact has stated that this offence will exclude most health misinformation that it monitors online.

Lord Clement-Jones (LD)

My Lords, this has been a very interesting debate. I absolutely agree with what the noble Baroness, Lady Kidron, said right at the beginning of her speech. This was one of the most difficult areas that the Joint Committee had to look at. I am not saying that anything that we said was particularly original. We tried to say that this issue could be partly addressed by greater media literacy, which, no doubt, we will be talking about later today; we talked about transparency of system design, and about better enforcement of service terms and conditions. But things have moved on. Clearly, many of us think that the way that the current Bill is drafted is inadequate. However, the Government did move towards proposing a committee to review misinformation and disinformation. That is welcome, but I believe that these amendments are taking the thinking and actions a step forward.

16:30
I do not agree with the noble Baroness, Lady Fox. What she has been saying is really a counsel of despair on not being able to deal with misinformation and disinformation. I was really interested to hear what the noble Lord, Lord Bethell, had to say about his experience—this is pretty difficult stuff to tackle when you are in a position of that sort. I support the noble Baronesses, Lady Bennett and Lady Healy, in what they had to say about this particular aspect. As the noble Baroness, Lady Kidron, said, it is about the system and the amplification that takes place, which brings out the worst in humanity.
The Puttnam report, by the Democracy and Digital Technologies Committee, also raised this. If Lord Puttnam had not retired from this House, he would be here today, saying that we need to do a lot more about this than we are proposing even in the amendments. In the report, the committee talked about a pandemic of misinformation. Nowhere is that more apparent than in health. The report was prescient; it came out in June 2020, some three years ago, well before we heard and saw all kinds of disinformation about vaccines.
We are seeing increasing numbers of commentators talking about the impact of misinformation and disinformation. We have had Ciaran Martin, former head of the National Cyber Security Centre, talking about the dangers to democracy. We have heard Sir Jeremy Fleming, head of GCHQ, saying that the main threat from AI is disinformation. We have had some really powerful statements, quite apart from seeing the impact of disinformation and misinformation on social media platforms.
On these Benches, we believe that the Government have a responsibility to intervene on misinformation and to support legislation to stop the spread of fake news. I believe that the public have an expectation that the Government do that and that the large social media companies address this issue on their platforms, hence my support for the amendments in these groups.
It has to be balanced. That is why I support the amendments by the noble Lord, Lord Moylan, as well. We have a common interest in trying to make sure that, while preventing misinformation and disinformation, we do it in a proportional way, as he described. That is of great importance.
The noble Lord, Lord Bethell, did not quote at length from the letter from Full Fact and all the health professionals, but, notably, it says:
“One key way that we can protect the future of our healthcare system is to ensure that internet companies have clear policies on how they identify the harmful health misinformation that appears on their platforms, as well as consistent approaches in dealing with it”.
It is powerful testimony from some very experienced and senior health professionals.
The focus of many of these amendments is on the way that the advisory committee will operate. Having an independent chair is of great importance, as is having a time limit within which there must be a report, along with other aspects.
The noble Lord, Lord Moylan, referred in one of the amendments to addressing the opacity of existing government methods for tackling disinformation. He mentioned one unit, but there are three units that I have been briefed about. There is the counter-disinformation unit in DCMS, which addresses mainly Covid issues that breach companies’ terms of service, and, recently, Russia/Ukraine issues. Then we have the Government Information Cell, which is based in the FCDO, and the rapid response unit, which I think he referred to, in the Cabinet Office. Ministers referred to these and said that the principal focus of the DCMS unit during the pandemic was Covid et cetera, but we do not know very much about what these units do or what their criteria are. Do they have any relationship with Ofcom? Will they have a relationship with Ofcom? It is important that we have something that reduces that level of opacity and opens up what those units do to a greater degree of scrutiny.
The only direct reference to misinformation in the Bill as it stands is to the advisory committee, so it is important that we know how it fits in with Ofcom’s wider regulatory functions, and that there is a duty to create a code of practice on misinformation and disinformation. The advisory committee should be creative in the way it operates. One of the difficult issues we found is that there is not a great deal of knowledge out there about how to tackle misinformation and disinformation in a systemic way.
Finally, I was very interested in the briefing that noble Lords probably all received from Adobe, which talked about the Content Authenticity Initiative. That is exactly the kind of thing the advisory committee should be exploring. Apparently, it has more than 1,000 members, including media and tech companies, NGOs and so on. Its ambition is to promote the adoption of an open industry standard for content authenticity and provenance. That may sound like the holy grail, but it is something we should be trying to work towards.
These amendments are a means of at least groping towards a better way of tackling misinformation and disinformation, which, as we have heard, can have a huge impact, particularly in health.
Lord Parkinson of Whitley Bay (Con)

My Lords, this debate has demonstrated the diversity of opinion regarding misinformation and disinformation—as the noble Lord said, the Joint Committee gave a lot of thought to this issue—as well as the difficulty of finding the truth of very complex issues while not shutting down legitimate debate. It is therefore important that we legislate in a way that takes a balanced approach to tackling this, keeping people safe online while protecting freedom of expression.

The Government take misinformation and disinformation very seriously. From Covid-19 to Russia’s use of disinformation as a tool in its illegal invasion of Ukraine, it is a pervasive threat, and I pay tribute to the work of my noble friend Lord Bethell and his colleagues in the Department of Health and Social Care during the pandemic to counter the cynical and exploitative forces that sought to undermine the heroic effort to get people vaccinated and to escape from the clutches of Covid-19.

We recognise that misinformation and disinformation come in many forms, and the Bill reflects this. Its focus is rightly on tackling the most egregious, illegal forms of misinformation and disinformation, such as content which amounts to the foreign interference offence or which is harmful to children—for instance, that which intersects with named categories of primary priority or priority content.

That is not the only way in which the Bill seeks to tackle it, however. The new terms of service duties for category 1 services will hold companies to account over how they say they treat misinformation and disinformation on their services. However, the Government are not in the business of telling companies what legal content they can and cannot allow online, and the Bill should not and will not prevent adults accessing legal content. In addition, the Bill will establish an advisory committee on misinformation and disinformation to provide advice to Ofcom on how they should be tackled online. Ofcom will be given the tools to understand how effectively misinformation and disinformation are being addressed by platforms through transparency reports and information-gathering powers.

Amendment 52 from the noble Baroness, Lady Merron, seeks to introduce a new duty on platforms in relation to health misinformation and disinformation for adult users, while Amendments 59 and 107 from my noble friend Lord Moylan aim to introduce new proportionality duties for platforms tackling misinformation and disinformation. The Bill already addresses the most egregious types of misinformation and disinformation in a proportionate way that respects freedom of expression by focusing on misinformation and disinformation that are illegal or harmful to children.

Baroness Kidron (CB)

I am curious as to what the Bill says about misinformation and disinformation in relation to children. My understanding of primary priority and priority harms is that they concern issues such as self-harm and pornography, but do they say anything specific about misinformation of the kind we have been discussing and whether children will be protected from it?

Lord Parkinson of Whitley Bay (Con)

I am sorry—I am not sure I follow the noble Baroness’s question.

Baroness Kidron (CB)

Twice so far in his reply, the Minister has said that this measure will protect children from misinformation and disinformation. I was just curious because I have not seen any sight of that, either in discussions or in the Bill. I was making a distinction regarding harmful content that we know the shape of—for example, pornography and self-harm, which are not, in themselves, misinformation or disinformation of the kind we are discussing now. It is news to me that children are going to be protected from this, and I am delighted, but I was just checking.

Lord Parkinson of Whitley Bay (Con)

Yes, that is what the measure does—for instance, where it intersects with the named categories of primary priority or priority content in the Bill, although that is not the only way the Bill does it. This will be covered by non-designated content that is harmful to children. As we have said, we will bring forward amendments on Report—which is perhaps why the noble Baroness has not seen them in the material in front of us—regarding material harms to children, and they will provide further detail and clarity.

Returning to the advisory committee that the Bill sets up and the amendments from the noble Baroness, Lady Merron, and my noble friend Lord Moylan, all regulated service providers will be forced to take action against illegal misinformation and disinformation in scope of the Bill. That includes the new false communication offences in the Bill that will capture communications where the sender knows the information to be false but sends it intending to cause harm—for example, hoax cures for a virus such as Covid-19. The noble Baroness is right to say that that is a slightly different approach from the one taken in her amendment, but we think it an appropriate and proportionate response to tackling damaging and illegal misinformation and disinformation. If a platform is likely to be accessed by children, it will have to protect them from encountering misinformation and disinformation content that meets the Bill’s threshold for content that is harmful to children. Again, that is an appropriate and proportionate response.

Turning to the points made by my noble friend Lord Moylan and the noble Baroness, Lady Fox, services will also need to have particular regard to freedom of expression when complying with their safety duties. Ofcom will be required to set out steps that providers can take when complying with their safety duties in the codes of practice, including what is proportionate for different providers and how freedom of expression can be protected.

16:45
My noble friend Lord Bethell and the noble Baroness, Lady Merron, are concerned that health misinformation and disinformation will not be adequately covered by this. Their amendment seeks to tackle that but, in doing so, mimics provisions on content harmful to adults previously included in the Bill which the Government consciously removed last year following debates in another place. The Government take concerns about health-related misinformation and disinformation very seriously. Our approach will serve a purpose of transparency and accountability by ensuring that platforms are transparent and accountable to their users about what they will and will not allow on their services.
Under the new terms of service for category 1 services, if certain types of misinformation and disinformation are prohibited in platforms’ terms of service, they will have to remove it. That will include anti-vaccination falsehoods and health-related misinformation and disinformation if it is prohibited in their terms of service. This is an appropriate response which prevents services from arbitrarily removing or restricting legal content, however controversial it may be, or suspending or banning users where it is not in accordance with their expressed terms of service.
The Bill will protect people from the most egregious types of health-related misinformation and disinformation while still protecting freedom of expression and allowing users to ask genuine questions about health-related matters. There are many examples from recent history—Primodos, Thalidomide and others—which point to the need for legitimate debate about health-related matters, sometimes against companies which have deep pockets to defend the status quo.
My noble friend Lord Bethell also raised concerns about the role that algorithms play in pushing content. I reassure him that all companies will face enforcement action if illegal content in scope of the Bill is being promoted to users via algorithms. Ofcom will have a range of powers to assess whether companies are fulfilling their regulatory requirements in relation to the operation of their algorithms.
In circumstances where there is a significant threat to public health, the Bill already provides additional powers for the Secretary of State to require Ofcom to prioritise specified objectives when carrying out its media literacy activity and to require that companies report on the action they are taking to address the threat. The advisory committee on misinformation and disinformation will also be given the flexibility and expertise to consider providing advice to Ofcom on this issue, should it choose to.
Amendments 99 and 222 from the noble Baroness, Lady Merron, and Amendments 223 and 224 from the noble Lord, Lord Knight of Weymouth, relate to the advisory committee. Disinformation is a pervasive and evolving threat. The Government believe that responding to the issue effectively requires a multifaceted, whole-of-society approach. That is what the advisory committee seeks to do by bringing together technology companies, civil society organisations and sector experts to advise Ofcom in building cross-sector understanding and technical knowledge of the challenges and how best to tackle them. The Government see this as an essential part of the Bill’s response to this issue.
I understand the desire of noble Lords to ensure that the committee is conducting its important work as quickly as possible, but it is imperative that Ofcom has the appropriate time and space to appoint the best possible committee and that its independence as a regulator is respected. Ofcom is well versed in setting up statutory committees and ensuring that committees established under statute meet their obligations while maintaining impartiality and integrity. To seek to prescribe timeframes or their composition risks impeding Ofcom’s ability to run a transparent process that finds the most suitable candidates. Considering the evolving nature of disinformation and the online realm, the advisory committee will also need the flexibility to adapt and respond. It would therefore not be appropriate for the Bill to be overly prescriptive about the role of the advisory committee or to mandate the things on which it must report.
The noble Baroness, Lady Fox of Buckley, asked whether the committee could include civil liberties representatives. It is for Ofcom to decide who is on the committee, but Ofcom must have regard to the desirability of including, among others, people representing the interests of UK users of regulated services, which could include civil liberties groups.
The noble Baroness, Lady Kidron, raised the challenges of artificial intelligence. Anything created by artificial intelligence and shared on an in-scope service by a user will qualify as user-generated content. It would therefore be covered by the Bill’s safety duties, including to protect children from harmful misinformation and disinformation, and to ensure that platforms properly enforce their terms of service for adults.
I turn to the points raised in my noble friend Lord Moylan’s Amendment 264. Alongside this strong legislative response, the Government will continue their operational response to tackling misinformation and disinformation. As part of this work, the Government meet social media companies on a regular basis to discuss a range of issues. These meetings are conducted in the same way that the Government would engage with any other external party, and in accordance with the well-established transparency processes and requirements.
The Government’s operational work also seeks to understand misinformation and disinformation narratives that are harmful to the UK, to build an assessment of their risk and threat. We uphold the same commitment to freedom of expression in our operational response as we do in our legislative response. As I said, we are not in the business of telling companies what legal content they can and cannot allow. Indeed, under the Bill, category 1 services must set clear terms of service that are easy for users to understand and are consistently enforced, ensuring new levels of transparency and accountability.
Our operational response will accompany our legislative response. The measures have been designed to provide a strong response to tackle misinformation and disinformation, ensuring users’ safety while promoting a thriving and lively democracy where freedom of expression is protected.
The noble Baroness, Lady Fox, and the noble Lord, Lord Clement-Jones, asked about the counter-disinformation unit run, or rather led, by the Department for Science, Innovation and Technology. That works to understand attempts to artificially manipulate the information environment, and to understand the scope, scale and reach of misinformation and disinformation. It responds to acute information incidents, such as Russian information operations during the war in Ukraine, those we saw during the pandemic and those around important events such as general elections. It does not monitor individuals; rather, its focus is on helping the Government understand online misinformation and disinformation narratives and threats.
When harmful narratives are identified, the unit works with departments across Whitehall to deploy the appropriate response, which could involve a direct rebuttal on social media or awareness-raising campaigns to promote the facts. Therefore, the primary purpose is not to monitor for harmful content to flag to social media companies—the noble Baroness raised this point—but the department may notify the relevant platform if, in the course of its work, it identifies content that potentially violates platforms’ terms of service, including co-ordinated, inauthentic or manipulative behaviour. It is then up to the platform to decide whether to take action against the content, based on its own assessment and terms of service.
Baroness Fox of Buckley (Non-Afl)

The Minister mentioned “acute” examples of misinformation and used the example of the pandemic. I tried to illustrate that perhaps, with hindsight, what were seen as acute examples of misinformation turned out to be rather more accurate than we were led to believe at the time. So my concern is that there is already an atmosphere of scepticism about official opinion, which is not the same as misinformation, as it is sometimes presented. I used the American example of the Hunter Biden laptop so we could take a step away.

Lord Moylan (Con)

This might be an appropriate moment for me to say—on the back of that—that, although my noble friend explained current government practice, he has not addressed my point on why there should not be an annual report to Parliament that describes what government has done on these various fronts. If the Government regularly meet newspaper publishers to discuss the quality of information in their newspapers, I for one would have entire confidence that the Government were doing so in the public interest, but I would still quite like—I think the Government would agree on this—a report on what was happening, making an exception for national security. That would still be a good thing to do. Will my noble friend explain why we cannot be told?

Lord Parkinson of Whitley Bay (Con)

While I am happy to elaborate on the work of the counter-disinformation unit in the way I just have, the Government cannot share operational details about its work, as that would give malign actors insight into the scope and scale of our capabilities. As my noble friend notes, this is not in the public interest. Moreover, reporting representations made to platforms by the unit would also be unnecessary as this would overlook both the existing processes that govern engagements with external parties and the new protections that are introduced through the Bill.

In the first intervention, the noble Baroness, Lady Fox, gave a number of examples, some of which are debatable, contestable facts. Companies may well choose to keep them on their platforms within their terms of service. We have also seen deliberate misinformation and disinformation during the pandemic, including from foreign actors promoting more harmful disinformation. It is right that we take action against this.

I hope that I have given noble Lords some reassurance on the points raised about the amendments in this group. I invite them not to press the amendments.

Baroness Merron (Lab)

My Lords, I am most grateful to noble Lords across the Committee for their consideration and for their contributions in this important area. As the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, both said, this was an area of struggle for the Joint Committee. The debate today shows exactly why that is so, but it is a struggle worth having.

The noble Lord, Lord Bethell, talked about there being a gap in the Bill as it stands. The amendments include the introduction of risk assessments and transparency and, fundamentally, explaining things in a way that people can actually understand. These are all tried and tested methods and can serve only to improve the Bill.

I am grateful to the Minister for his response and consideration of the amendments. I want to take us back to the words of the noble Baroness, Lady Kidron. She explained it beautifully—partly in response to the comments from the noble Baroness, Lady Fox. This is about tackling a system of amplification of misinformation and disinformation that moves the most marginal of views into the mainstream. It deals with restricting the damage that, as I said earlier, can produce the most dire circumstances. Amplification is the consideration that these amendments seek to tackle.

I am grateful to the noble Lord, Lord Moylan, for his comments, as well as for his amendments. I am sure the noble Lord has reflected that some of the previous amendments he brought before the House somewhat put the proverbial cat among the Committee pigeons. On this occasion, I think the noble Lord has nicely aligned the cats and the pigeons. He has managed to rally us all—with the exception of the Minister—behind these amendments.

Lord Bethell (Con)

The noble Baroness is entirely right to emphasise amplification. May I put into the mix the very important role of the commercialisation of health misinformation? The more you look at the issue of health misinformation, the more you realise that its adverse element is to do with making money out of people’s fears. I agree with the noble Baroness, Lady Fox, that there should be a really healthy discussion about the efficacy, safety and value for money of modern medicines. That debate is worth having. The Minister rightly pointed out some recent health scandals that should have been chased down much more. The commercialisation of people’s fears bears further scrutiny and is currently a gap in the Bill.

Baroness Merron (Lab)

I certainly agree with the noble Lord, Lord Bethell, on that point. It is absolutely right to talk about the danger of commercialisation and how it is such a driver of misinformation and disinformation; I thank him for drawing that to the Committee’s attention. I also thank my noble friend Lady Healy for her remarks, and her reflection that these amendments are not a question of restricting free speech and debate; they are actually about supporting free speech and debate but in a safe and managed way.

17:00
The Minister gave the Committee the assurance that the Bill in its current form tackles the most egregious forms of disinformation and misinformation. If only it were so, we would not have had cause to bring forward these amendments. I again refer to the point in the Minister’s response when, as I anticipated, he referred to the false communications offence in Clause 160. I repeat the point gently but firmly to the Minister that this just does not address the amplification point that we seek to focus on. One might argue that perhaps it is more liberal and proportionate to allow misinformation and disinformation but to focus on tackling their amplification. That is where our efforts should be.
With those comments, with thanks to the Minister and other noble Lords, and in the hope that the Minister will have the opportunity to reflect on the points raised in this debate, I beg leave to withdraw.
Amendment 52 withdrawn.
Amendment 52A
Moved by
52A: After Clause 15, insert the following new Clause—
“Duty to inform users about accuracy of content on a service
(1) This section sets out a duty to make available information to allow users to establish the reliability and accuracy of content which applies in relation to Category 1 services.
(2) A duty, where a service provides access to both journalistic and other forms of content, to make available to users such information that may be necessary to allow users to establish the reliability and accuracy of content encountered on the service.”
Member’s explanatory statement
This amendment is to probe what steps, if any, a carrier of journalistic content is expected to take to improve users’ media literacy skills.
Lord Knight of Weymouth (Lab)

I move this amendment in my name as part of a group of amendments on media literacy. I am grateful to Full Fact, among others, for some assistance around these issues, and to Lord Puttnam. He has retired from this House, of course, but it was my pleasure to serve on the committee that he chaired on democracy and digital technology. He remains in touch and is watching from his glorious retirement in the Republic of Ireland—and he is pressing that we should address issues around media literacy in particular.

The Committee has been discussing the triple shield. We are all aware of the magic of threes—the holy trinity. Three is certainly a magic number, but we also heard about the three-legged stool. There is more stability in four, and I put it to your Lordships that, having thought about “illegal” as the first leg, “terms of service” as the second and “user empowerment tools” as the third, we should now have, as a fourth leg underpinning a better and safer environment for the online world, “better media literacy”, so that users have confidence and competence online as a result.

To use the user empowerment tools effectively, we need to be able to understand the business models of the platforms, and how we are paying for their services with our data and our attention; how platforms use our data; our data rights as individuals; and the threat of scams, catfishing, phishing and fraud, which we will discuss shortly. Then there is the national cyber threat. I was really struck, when we were on that committee that Lord Puttnam chaired, by hearing how nations such as Finland and the Baltic states regard media literacy as a national mission to protect them particularly from the threat of cyberwarfare from Russia.

We have heard about misinformation and disinformation. There are issues of emerging technologies that we all need to be more literate about. I remember, some six or seven years ago, my wife was in a supermarket queue with her then four-year-old daughter, who turned to her and asked what an algorithm was. Could any of us then have confidently replied and given a good answer? I know that some would be happy to do so, but we equally need to be able to answer what machine learning is, what large-language models are, or what neural networks are in order to understand the emerging world of artificial intelligence.

Ofcom already has a duty under the Communications Act 2003. Incidentally, Lord Puttnam chaired the Joint Committee on that Act. It is worth asking ourselves: how is it going for Ofcom in the exercise of that duty? We can recall, I am sure, the comments last Tuesday in this Committee of the noble Baroness, Lady Buscombe, who said:

“I took the Communications Act 2003 through for Her Majesty’s Opposition, and we were doing our absolute best to future-proof the legislation. There was no mention of the internet in that piece of legislation”.—[Official Report, 9/5/23; col. 1709.]


There is no doubt in my mind that, as a result of all the changes that have taken place in the last 20 years, the duty in that Act needs updating, and that is what we are seeking to do.

It is also possible to look at the outcomes. What is the state of media literacy in the nation at the moment? I was lucky enough this weekend to share a platform at a conference with a young woman, Monica. She lives in Greenwich, goes to Alleyn’s School, is articulate and is studying computer science at A-level. When asked about the content of the computer science curriculum, which is often prayed in aid in terms of the digital and media literacy of our young people, she reminded the audience that she still has to learn about floppy disks because the curriculum struggles to keep up to date. She is not learning about artificial intelligence in school because of that very problem. The only way in which she could do so, and she did, was through an extended project qualification last year.

We then see Ofcom’s own reporting on levels of media literacy in adults. Among 16 to 24-year-olds, which would cover Monica, for example, according to the most recent report out earlier this year or at the end of last, only two-thirds are confident and able to recognise scam ads, compared to 76% of the population in England. Young people are less confident in recognising search-engine advertising than the majority: only 42% of young people are confident around differentiating between organic and advertising content on search. Of course, young people are better at thinking about the truthfulness of “factual” information online. For adults generally, the report showed that only 45% of us are confident and able to recognise search-engine advertising, and a quarter of us struggle to identify scam emails and to judge factual truthfulness online. You are less media literate and therefore more vulnerable if you are from the poorer parts of the population. If you are older, you are yet more vulnerable to scam emails, although above average on questioning online truth and spotting ads in search engines. Finally, in 2022, Ofcom also found that 61% of social media users who say they are confident in judging whether online content is true or false actually lack the skills to be able to do so. A lot of us are kidding ourselves in terms of how safe we are and how much we know about the online world.

So, much more is to be done. Hence, Amendment 52A probes what the duty on platforms should be to improve media literacy and thereby establish the reliability and accuracy of journalistic content. Amendment 91 in my name requires social media and search services to put in place measures to improve media literacy and thereby explain things like the business model that currently is too often skated over by the media literacy content provided by platforms to schools and others. The noble Lord, Lord Holmes, has Amendment 91A, which is similar in intent, and I look forward to hearing his comments on that.

Amendment 98 in my name would require a code of practice from Ofcom in support of these duties and Amendment 186 would ensure that Ofcom has sufficient funds for its media literacy duties. Amendment 188 would update the Communications Act to reflect the online world that we are addressing in this Bill. I look forward to the comments from the noble Baroness, Lady Prashar, in respect of her Amendment 236, which, she may argue, does a more comprehensive job than my amendment.

Finally, my Amendment 189 in this group states that Ofsted would have to collaborate with Ofcom in pursuance of its duties, so that Ofcom could have further influence into the quality of provision in schools. Even this afternoon, I was exchanging messages with an educator in Cornwall called Giles Hill, who said to me that it is truly dreadful for schools having to mop up problems caused by this unregulated mess.

This may not be the perfect package in respect of media literacy and the need to get this right and prop up the three-legged stool, but there is no doubt from Second Reading and other comments through the Bill’s passage that this is an area where the Bill needs to be amended to raise the priority and the impact of media literacy among both service providers and the regulator. I beg to move.

Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to take part in today’s proceedings. As it is my first contribution on this Bill, I declare my technology and financial services interests, as set out in the register. I also apologise for not being able to take part in the Second Reading deliberations.

It is a particular pleasure to follow my friend, the noble Lord, Lord Knight; I congratulate him on all the work that he has done in this area. Like other Members, I also say how delighted I was to be part of Lord Puttnam’s Democracy and Digital Technologies Committee. It is great to know that he is watching—hopefully on a wide movie screen from Skibbereen—because the contribution that he has made to this area over decades is without parallel. To that end, I ask my noble friend the Minister whether he has had a chance to remind himself of the recommendations in our 2020 report. Although it is coming up to three years old, so much of what is in that report is as pertinent today as it was on the date of publication.

I am in the happy position of supporting all the amendments in this group; they all have similar intent. I have been following the debate up to this point and have been in the Chamber for a number of previous sessions. Critically important issues have been raised in every group of amendments but, in so many ways, this group is perhaps particularly critical, because this is one of the groups that enables individuals, particularly young people, to have in their hands the tools that they—and we—need to grip this stuff, in all its positive and, indeed, all its less-positive elements.

My Amendment 91A covers much of the same ground as Amendment 91 from the noble Lord, Lord Knight. It is critical that, when we talk about media literacy, we go into some detail around the subsets of data literacy, data privacy, digital literacy and, as I will come on to in a moment, financial literacy. We need to ensure that every person has an understanding of how this online world works, how it is currently constructed and how there is no inevitability about that whatever. People need to understand how the algorithms are set up. As was mentioned on a previous group, it is not necessarily that much of a problem if somebody is spouting bile in the corner; it is not ideal, but it is not necessarily a huge problem. The problem in this world is the programmability, the focus, the targeting and the weaponising of algorithms to amplify such content for monetary return. Nothing is inevitable; it is all utterly determined by the models currently in play.

It is critical for young people, and all people, to understand how data is used and deployed. In that media literacy, perhaps the greatest understanding of all is that it is not “the data” but “our data”. It is for us, through media literacy, to determine how our data is deployed, for what purpose, to what intent and in what circumstances, rather than, all too often, it being sold on, and so on.

17:15
Does the Minister agree that it is critical that we include financial literacy in this broader media literacy group of amendments, because so much of what is currently online is designed as financial scams or inducements? It would not be overstating it to say that there is currently an epidemic of online scamming and fraud. Does he agree that the Bill needs to be very clear on this specific issue of literacy? Will he update the Committee on the work the Government have done on the Media Literacy Taskforce Fund and, indeed, the programme fund launched last October? What updates or plans are there to scale, to develop and to further partner on both those funds?
Finally, I quote the words of the Royal College of Psychiatrists, stating pretty clearly, in terms, why media literacy matters:
“media literacy … can equip young people with the tools they need to help protect themselves as new online harms develop”.
I agree but, matching like with like, I seek to amplify. More than tools, we need media literacy to be nothing short of the sword and the shield for young people in the online world—the sword and the shield for all people.
Baroness Fox of Buckley (Non-Afl)

My Lords, for once, I am not entirely hostile to all these amendments—hurrah. In fact, I would rather have media literacy and education than regulation; that seems to me the solution to so much of what we have been discussing. But guess what? I have a few anxieties and I shall just raise them so that those who have put forward the arguments can come back to me.

We usually associate media literacy with schools and young people in education. Noble Lords will be delighted to know that I once taught media literacy: that might explain where we are now. It was not a particularly enlightening course for anybody, but it was part of the communications A-level at the time. I am worried about mandating to schools how to teach media literacy. As the noble Lord, Lord Knight, will know, I worry about adding yet more to their already overcrowded curriculum, but I note that the amendments actually expand the notion of being taught literacy beyond children to adults as well. I suppose I just have some anxiety about Ofcom becoming the nation’s teacher, presenting users of digital services as though they are hapless and helpless. In other words, I am concerned about an overly paternalistic approach—that we should not be patronising.

The noble Baroness, Lady Kidron, keeps reminding us that content should not be our focus, and that it should be systems. In fact, in practically every discussion we have had, content has been the focus, because that is what will be removed, or not, by how we deal with the systems. That is one of the things that we are struggling with.

Literacy in the systems would certainly be very helpful for everybody. I have an idea—it is not an amendment—that we should send the noble Lord, Lord Allan of Hallam, on a UK tour so that he can explain it to us all; he is not here for this compliment, but every time he spoke in the first week of Committee, I think those of us who were struggling understood what he meant, as he explained complicated and technical matters in a way that was very clear. That is my constructive idea.

Amendment 52A from the noble Lord, Lord Knight of Weymouth, focuses on content, with its

“duty to make available information to allow users to establish the reliability and accuracy of content”.

That takes us back to the difficulties we were struggling with on how misinformation and disinformation will be settled and whether it is even feasible. I do not know whether any noble Lords have been following the “mask wars” that are going on. There are bodies of scientists on both sides on the efficacy of mask wearing—wielding scientific papers at dawn, as it were. These are well-informed, proper scientists who completely disagree on whether it was effective during lockdown. I say that because establishing reliability and accuracy is not that straightforward.

I like the idea of making available

“to users such information that may be necessary to allow users to establish the reliability and accuracy of content encountered on the service”.

I keep thinking that we need adults and young people to say that there is not one truth, such as “the science”, and to be equipped and given the tools to search around and compare and contrast different versions. I am involved in Debating Matters for 16 to 18 year-olds, which has topic guides that say, “Here is an argument, with four really good articles for it and four really good articles against, and here’s a load of background”. Then 16 to 18 year-olds will at least think that there is not just one answer. I feel that is the way forward.

The noble Lord, Lord Clement-Jones, said that I was preaching a counsel of despair; I like to think of myself as a person who has faith in the capacity and potential of people to overcome problems. I had a slight concern when reading the literature associated with online and digital literacy—not so much with the amendments—that it always says that we must teach people about the harms of the online world. I worry that this will reinforce a disempowering idea of feeling vulnerable and everything being negative. One of the amendments talks about a duty to promote users’ “safe use” of the service. I encourage a more positive outlook, incorporating into this literacy an approach that makes people aware that they can overcome and transcend insults and be robust and savvy enough to deal with algorithms—that they are not always victims but can take control over the choices they make. I would give them lessons on resilience, and possibly just get them all to read John Locke on toleration.

Baroness Prashar (CB)

My Lords, I will speak to Amendments 236, 237 and 238 in my name. I thank the noble Lord, Lord Storey, and the noble Baroness, Lady Bennett of Manor Castle, for supporting me. Like others, I thank Full Fact for its excellent briefings. I also thank the noble Lord, Lord Knight, for introducing this group of amendments, as it saves me having to make the case for why media literacy is a very important aspect of this work. It is the other side of regulation; they very much go hand in hand. If we do not take steps to promote media literacy, we could fall into a downward spiral of further and further regulation, so it is extremely important.

It is a sad fact that levels of media literacy are very low. Research from Ofcom has found that one-third of internet users are unaware of the potential for inaccurate and biased information. Further, 40% of UK adult internet users do not have the skills to critically assess information they see online, and only 2% of children have the skills to tell fact from fiction online. It will not be paternalistic, but a regulator should be proactively involved in developing media literacy programmes. Through the complaints it receives and from the work that it does, the regulator can identify and monitor where the gaps are in media literacy.

To date, the response to this problem has been for social media platforms to remove content deemed harmful. This is often done using technology that picks up on certain words and phrases. The result has been content being removed that should not have been. Examples of this include organisations such as Mumsnet having social media posts on sexual health issues taken down because the posts use certain words or phrases. At one stage, Facebook’s policy was to delete or censor posts expressing opinions that deviated from the norm, without defining what “norm” actually meant. The unintended consequences of the Bill could undermine free speech. Rather than censoring free speech through removing harmful content, we should give a lot more attention to media literacy.

During the Bill’s pre-legislative scrutiny, the Joint Committee recommended that the Government include provisions to ensure media literacy initiatives are of a high standard. The draft version of the Bill included Clause 103, which strengthened the media literacy provisions in the Communications Act 2003, as has already been mentioned. Regrettably, the Government later withdrew the enhanced media literacy clause, so the aim of my amendments is to reintroduce strong media literacy provisions. Doing so will both clarify and strengthen media literacy obligations on online media providers and Ofcom.

Amendment 236 would place a duty on Ofcom to take steps to improve the media literacy of the public in relation to regulated services. As part of this duty, Ofcom must try to reach audiences who are less engaged and harder to reach through traditional media literacy services. It must also address gaps in the current availability of media literacy provisions for vulnerable users. Many of the existing media literacy services are targeted at children but we need to include vulnerable adults too. The amendment would place a duty on Ofcom to promote availability and increase the effectiveness of media literacy initiatives in relation to regulated services. It seeks to ensure that providers of regulated services take appropriate measures to improve users’ media literacy through Ofcom’s online safety function. This proposed new clause makes provision for Ofcom to prepare guidance about media literacy matters, and such guidance must be published and kept under review.

Amendment 237 would place a duty on Ofcom to prepare a strategy on how it intends to undertake the duty to promote media literacy. This strategy should set out the steps Ofcom proposes to take to achieve its media literacy duties and identify organisations, or types of organisations, that Ofcom will work with to undertake these duties. It must also explain why Ofcom believes the proposed steps will be effective and how it will assess progress. This amendment would also place a duty on Ofcom to have regard to the need to allocate adequate resources for implementing this strategy. It would require Ofcom’s media literacy strategy to be published within six months of this provision coming into force, and to be revised within three years; in both cases this should be subject to consultation.

Amendment 238 would place a duty on Ofcom to report annually on the delivery of its media literacy strategy. This reporting must include steps taken in accordance with the strategy and assess the extent to which those steps have had an effect. This amendment goes further than the existing provisions in the Communications Act 2003, which do not include duties on Ofcom to produce a strategy or to measure progress; nor do they place a duty on Ofcom to reach hard-to-reach audiences who are the most vulnerable in our society to disinformation and misinformation.

17:30
The Government have previously responded by saying that there is no need to include media literacy provisions in the Bill, citing Ofcom’s Approach to Online Media Literacy, a document published in December 2021, and the Government’s own Online Media Literacy Strategy, published in July 2021. Both these documents make multiple references to the Online Safety Bill placing media literacy duties on Ofcom. The removal of media literacy provisions from the Bill risks this not being viewed as a priority area of the work of Ofcom or future Governments. Meta has said that it would prefer to have clear media literacy duties in the Bill, as this provides clarity. Without regulatory obligations, there is a risk that, in this important area of work, the regulator will not have the teeth it needs to monitor and regulate where there are gaps.
We need to equip society—children and adults—so that they can make knowledgeable and intelligent use of the internet. We have focused on the harm that the internet does, but the proper use of it can have a very positive impact. The previous debate that we had about misinformation and disinformation highlighted the importance of media literacy.
Baroness Bennett of Manor Castle (GP)

My Lords, it is a pleasure to follow the noble Baroness, Lady Prashar, and I join her in thanking the noble Lord, Lord Knight, for introducing this group very clearly.

In taking part in this debate, I declare a joint interest with the noble Baroness, Lady Fox, in that I was for a number of years a judge in the Debating Matters events to which she referred. Indeed, the noble Baroness was responsible for me ending up in Birmingham jail, when one such debate was conducted with its inmates. We have a common interest there.

I want to pick up a couple of additional points. Before I joined your Lordships’ Committee today I was involved in the final stages of the Committee debate on the economic crime Bill, where the noble Lord, Lord Sharpe of Epsom, provided a powerful argument—probably unintentionally—for the amendments we are debating here now. We were talking, as we have at great length in the economic crime Bill, about the issue of fraud. As the noble Lord, Lord Holmes of Richmond, highlighted, in the context of online harms, fraud is a huge aspect of people’s lives today and one that has been under-covered in this Committee, although it has very much been picked up in the economic crime Bill Committee. As we were talking about online fraud, the noble Lord, Lord Sharpe of Epsom, said that consumers have to be “appropriately savvy”. I think that is a description of the need for education and critical thinking online, equipping people with the tools to be, as he said, appropriately savvy when facing the risks of fraud and scams, and all the other risks that people face online.

I have attached my name to two amendments here: Amendment 91, which concerns the providers of category 1 and 2A services having a duty, and Amendment 236, which concerns an Ofcom duty. This joins together two aspects. The providers are making money out of the services they provide, which gives them a duty to make some contribution to combatting the potential harms that their services present to people. Ofcom as a regulator obviously has a role. I think it was the noble Lord, Lord Knight, who said that the education system also has a role, and there is some reference in here to Ofsted having a role.

What we need is a cross-society, cross-systems approach. This is where I also make the point that we need to think outside the scope of the Bill—it is part of the whole package—about how the education system works, because media literacy is not a stand-alone thing that you can separate out from the issues of critical thinking more broadly. We need to think about our education system which, far too often in schools in particular, gets pupils to learn and regurgitate a whole set of facts and then rewards them for that. We need to think about how our education system prepares children for the modern online world.

There is a great deal we can learn from the example—often cited but worth referring to—of Finland, which by various tests has been ranked as the country most resistant to fake news. A clearly built-in culture of questioning, scrutiny and challenge is encouraged among pupils, starting from the age of seven. That is something we need to transform our education system to achieve. However, of course, many people using the internet now are not part of our education system, so this needs to be across our society. A focus on the responsibilities of Ofcom and the providers has to be in the Bill.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, over the last decade, I have been in scores of schools, run dozens of workshops and spoken to literally thousands of children and young people. A lot of what I pass off as my own wisdom in this Chamber is, indeed, their wisdom. I have a couple of points, and I speak really from the perspective of children under 18 with regard to these amendments, which I fully support.

Media literacy—or digital literacy, as it is sometimes called—is not the same as e-safety. E-safety regimes concentrate on the behaviour of users. Very often, children say that what they learn in those lessons is focused on adult anxieties about predators and bullies, and when something goes wrong, they feel that they are to blame. It puts the responsibility on children. This response, which I have heard hundreds of times, normally comes up after a workshop in which we have discussed reward loops, privacy, algorithmic bias, profiling or—my own favourite—a game which reveals what is buried in terms and conditions; for example, that a company has a right to record the sound of a device or share their data with more than a thousand other companies. When young people understand the pressures that they are under and which are designed into the system, they feel much better about themselves and rather less enamoured of the services they are using. It is my experience that they then go on to make better choices for themselves.

Secondly, we have outsourced much of digital literacy to companies such as Google and Meta. They too concentrate on user behaviour, rather than looking at their own extractive policies focused on engagement and time spent. With many schools strapped for cash and expertise, this teaching is widespread. However, when I went to a Google-run assembly, children aged nine were being taught about features available only on services for those aged over 13—and nowhere was there a mention of age limits and why they are important. It cannot be right that the companies are grooming children towards their services without taking full responsibility for literacy, if that is the literacy that children are being given in school.

Thirdly, as the Government’s own 2021 media literacy strategy set out, good media literacy is one line of defence from harm. It could make a crucial difference in people making informed and safe decisions online and engaging in a more positive online debate, at the same time as understanding that online actions have consequences offline.

However, while digital literacy and, in particular, critical thinking are fundamental to a contemporary education and should be available throughout school and far beyond, they must not be used as a way of putting responsibility on the user for the company’s design decisions. I am specifically concerned that in the risk-assessment process, digital literacy is one of the ways that a company can say it has mitigated a potential risk or harm. I should like to hear from the Minister that that is an additional responsibility and not instead of responsibility.

Finally, over all these years I have always asked at the end of the session what the young people care about the most. The second most important thing is that the system should be less addictive—it should have less addiction built into it. Again, I point the Committee in the direction of the safety-by-design amendments in the name of my noble friend Lord Russell that try to get to the crux of that. They are not very exciting amendments in this debate but they get to the heart of it. However, the thing the young people most often say is, “Could you do something to get my parents to put down their phones?” I therefore ask the Minister whether he can slip something into the Bill, and indeed ask the noble Lord, Lord Grade, whether that could emerge somewhere in the guidance. That is what young people want.

Baroness Healy of Primrose Hill Portrait Baroness Healy of Primrose Hill (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I strongly support the amendments in the name of my noble friend Lord Knight and others in this group.

We cannot entirely contain harmful, misleading and dangerous content on the internet, no matter how much we strengthen the Bill. Therefore, it is imperative that we put a new duty on category 1 and category 2A services to require them to put in place measures to promote the media literacy of users so that they can use the service safely.

I know that Ofcom takes the issue of media literacy seriously, but it is regrettable that the Government have dropped their proposal for a new media literacy duty for Ofcom. So far, I see no evidence that the platforms take media literacy seriously, so they need to be made to understand that they have corporate social responsibilities towards their clients.

Good media literacy is the first line of defence from bad information and the kind of misinformation we have discussed in earlier groups. Schools are trying to prepare their pupils to understand that the internet can peddle falsehoods as well as useful facts, but they need support, as the noble Baroness, Lady Kidron, just said. We all need to increase our media literacy, especially with the increasing use of artificial intelligence, as it can make the difference between decisions based on sound evidence and decisions based on poorly informed opinions that can harm health and well-being, social cohesion and democracy.

In 2022, Ofcom found that a third of internet users are unaware of the potential for inaccurate or biased information online, and 61% of social media users who say they are confident in judging whether online content is true or false actually lack the skills to do so, as my noble friend Lord Knight has pointed out.

Amendment 91 would mean that platforms have to instigate measures to give users an awareness and understanding of the nature and characteristics of the content that may be on the service, its potential impact and how platforms operate. That is a sensible and practical request that is not beyond the ability of companies to provide, and it will be to everyone’s benefit.

Lord Russell of Liverpool Portrait Lord Russell of Liverpool (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I indicate my support in principle for what these amendments are trying to achieve.

I speak with a background that goes back nearly 40 years, being involved in health education initiatives, particularly in primary schools. For 24 years—not very good corporate governance—I was the chair of what is now the largest supplier of health education into primary schools in the United Kingdom, reaching about 500,000 children every year.

The principle of preventive health is not a million miles away from what we are talking about today. I take the point that was well made by the noble Baroness, Lady Fox, that piling more and more duties on Ofcom in a well-intentioned way may not have the effect that we want. What we are really looking for and talking about is a joined-up strategy—a challenge for any Government—between the Department for Education, the Department for Digital, Culture, Media and Sport, the Department for Science, Innovation and Technology, and probably the Department of Health and Social Care, because health education, as it has developed over the last 40 or 50 years, has a lot to teach us about how we think about creating effective preventive education.

17:45
It is not just about children; it is about adults. In the readers’ problem page of any newspaper, whether from the left or the right of the political spectrum, the number of people, including those whom most of us would regard as intellectual peers or cleverer than us, who have been scammed in different ways, particularly through online intrusion, shows that it is very prevalent. These are clever, university-educated people who are being taken for a ride.
Yesterday I cleaned out the spam folder in one of my email accounts, which I do fairly quickly. As of about five minutes ago, I have three spam emails. In two of them, a major retailer seems to be telling me that I am the fortunate winner of a Ninja air fryer—not an offer that I propose to take up. The third purports to be from the Post Office, telling me that I have an exciting parcel to open. I am sure that if I clicked on it, something quite unpleasant would happen.
We need to do something about this. The point made by the noble Baroness, Lady Kidron, about children saying that they would love this to be less addictive is a very moot point, because the companies know exactly what they are doing. Clearly, we want to encourage children to understand how those tools operate and how one can try to control, mitigate or avoid them, or point them out to others who may not be as savvy. As for the one that was most desirable, parents putting down their telephones, I confess that occasionally, when sitting as a Deputy Speaker in your Lordships’ House, I wish the Government Whips would spend slightly less time looking at their telephones, although I am sure that whatever they are doing is very important government business.
I do not expect the Minister to stand up and say that we have a solution. The tech companies need to be involved. We need to look at good or best practice around the world, which probably has a lot to teach us, but we can do this only if we do it together in a joined-up way. If we try to do it in a fragmented way, we will put all the onus on Ofcom and it ain’t going to work.
Lord Davies of Brixton Portrait Lord Davies of Brixton (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I spoke at Second Reading about the relationship between online safety and protecting people’s mental health, a theme that runs throughout the Bill. I have not followed the progress in Committee as diligently as I wish, but this group of amendments has caught the eye of the Mental Health Foundation, which has expressed support. It identified Amendment 188, but I think it is the general principle that it supports. The Mental Health Foundation understands the importance of education, because it asked young people what they thought should be done. It sponsored a crucial inquiry through its organisation YoungMinds, which produced a report earlier this year, Putting a Stop to the Endless Scroll.

One of the three major recommendations that emerged from that report, from the feelings of young people themselves, was the need for better education. It found that young people were frustrated at being presented with outdated information about keeping their details safe. They felt that they needed something far more advanced, more relevant to the online world as it is happening at the moment, on how to avoid the risks from such things as image-editing apps. They needed information on more sophisticated risks that they face, essentially what they described as design risks, where the website is designed to drag you in and make you addicted to these algorithms.

The Bill as a whole is designed to protect children and young people from harm, but it must also, as previous speakers have made clear, provide young people themselves with tools so that they can exercise their own judgment to protect themselves and ensure that they do not fall onto that well-worn path between being engaged on a website and ending up with problems with their mental health. Eating is the classic example: you click on a website about a recipe and, step by step, you get dragged into material designed to harm your health through its effect on your diet.

I very much welcome this group of amendments, what it is trying to achieve and the role that it will have by educating young people to protect themselves, recognising the nature of the internet as it is now, so that they do not run the risks of affecting their mental health.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, this has probably been the most constructive and inspiring debate that we have had on the Bill. In particular, I thank the noble Lord, Lord Knight, for introducing this debate. His passion for this kind of media literacy education absolutely shines through. I thank him for kicking off in such an interesting and constructive way. I am sorry that my noble friend Lord Storey is not here to contribute as well, with his educational background. He likewise has a passion for media literacy education and would otherwise have wanted to contribute to the debate today.

I am delighted that I have found some common ground with the noble Baroness, Lady Fox. The idea of sending my noble friend Lord Allan on tour has great attractions. I am not sure that he would find it quite so attractive. I am looking forward to him coming back before sending him off around the country. I agree that he has made a very constructive contribution. I agree with much of what the noble Baroness said, and the noble Baroness, Lady Prashar, had the same instinct: this is a way of better preserving freedom of speech. If we can have those critical thinking skills so that people can protect themselves from misinformation, disinformation and some of the harms online, we can have greater confidence that people are able to protect themselves against these harms at whatever age they may be.

I was very pleased to hear the references to Lord Puttnam, because I think that the Democracy and Digital Technologies Committee report was ground-breaking in the way it described the need for digital media literacy. This is about equipping not just young people but everybody with the critical thinking skills needed to differentiate fact from fiction—particularly, as we have talked through in Committee, on the way that digital platforms operate through their systems, algorithms and data.

The noble Lord, Lord Holmes, talked about the breadth and depth needed for media and digital literacy education; he had it absolutely right about people being appropriately savvy, and the noble Baroness, Lady Bennett, echoed what he said in that respect.

I think we have some excellent amendments here. If we can distil them into a single amendment in time for Report or a discussion with the Minister, I think we will find ourselves going forward constructively. There are many aspects of this. For instance, the DCMS Select Committee recommended that digital literacy becomes the fourth pillar of education, which seems to me a pretty important aspect alongside reading, writing and maths. That is the kind of age that we are in. I have quoted Parent Zone before. It acknowledges the usefulness of user empowerment tools and so on, but again it stressed the need for media literacy. What kind of media literacy? The noble Baroness, Lady Kidron, was extremely interesting when she said that what is important is not just user behaviour but making the right choices—that sort of critical thinking. The noble Lord, Lord Russell, provided an analogy with preventive health that was very important.

Our Joint Committee used a rather different phrase. It talked about a “whole of government” approach. When we look at all the different aspects, we see that it is something not just for Ofcom—I entirely agree with that—but that should involve a much broader range of stakeholders in government. We know that, out there, there are organisations such as the Good Things Foundation and CILIP, the library association, and I am sorry that the noble Baroness, Lady Lane-Fox, is not in her place to remind us about Doteveryone, an organisation that many of us admire a great deal for the work it carries out.

I think the “appropriately savvy” expression very much applies to the fraud prevention aspect, and it will be interesting when we come to the next group to talk about that as well. The Government have pointed to the DCMS online media strategy, but the noble Lord, Lord Holmes, is absolutely right to ask what its outcome has been, what its results have been, and what resources are being devoted towards it. We are often pointed to that by the Government, here in Committee and at Oral Questions whenever we ask how the media literacy strategy is going, so we need to kick the tyres on that as well as on the kind of priority and resources being devoted to media literacy.

As ever, I shall refer to the Government’s response to the Joint Committee, which I found rather extraordinary. The Government responded to the committee’s recommendation about minimum standards; there is an amendment today about minimum standards. They said:

“Ofcom has recently published a new approach to online media literacy … Clause 103 of the draft Bill”—


the noble Baroness, Lady Prashar, referred to the fact that in the draft Bill there was originally a new duty on Ofcom—

“did not grant Ofcom any additional powers. As such, it is … unnecessary regulation. It has therefore been removed”.

It did add to Ofcom’s duties. Will the Minister say whether he thinks all the amendments here today would constitute unnecessary regulation? As he can see, there is considerable appetite around the Committee for the kind of media literacy duty across the board that we have talked about today. He might make up for some of the disappointment that many of us feel about the Government’s having got rid of that clause by responding to that question.

18:00
The noble Lord, Lord Davies, made an important point about the mental health aspects of digital literacy. A survey run by the charity YoungMinds found that this was one of the main provisions it wanted included in the Bill. Again, on those grounds, we should see a minimum standard set by Ofcom under the terms of the Bill, as we are asking for in the amendment.
The All-Party Parliamentary Group on Media Literacy has done some really good work. Just saying, “This is cross-government”, “We need a holistic approach to this” and so on does not obviate the fact that our schools need to be much more vigorous in what they do in this area. Indeed, the group is advocating a media literacy education Bill, talking about upskilling teachers and talking, as does one of the amendments here, about Ofcom having a duty in this area. We need to take a much broader view of this and be much more vigorous in what we do on media literacy, as has been clear from all the contributions from around the House today.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - - - Excerpts

My Lords, this has been a good debate. I am glad that a number of noble Lords mentioned Lord Puttnam and the committee that he chaired for your Lordships’ House on democracy and digital technologies. I responded to the debate that we had on that; sadly, it was after he had already retired from your Lordships’ House, but he participated from the steps of the Throne. I am mindful of that report and the lessons learned in it in the context of the debate that we have had today.

We recognise the intent behind the amendments in this group to strengthen the UK’s approach to media literacy in so far as it relates to services that will be regulated by the Bill. Ofcom has a broad duty to promote media literacy under the Communications Act 2003. That is an important responsibility for Ofcom, and it is right that the regulator is able to adapt its approach to support people in meeting the evolving challenges of the digital age.

Amendments 52A and 91 from the noble Lord, Lord Knight, and Amendment 91A from the noble Lord, Lord Holmes of Richmond, seek to introduce duties on in-scope services, requiring them to put in place measures that promote users’ media literacy, while Amendment 98 tabled by the noble Lord, Lord Knight, would require Ofcom to issue a code of practice in relation to the new duty proposed in his Amendment 91. While we agree that the industry has a role to play in promoting media literacy, the Government believe that these amendments could lead to unintended, negative consequences.

I shall address the role of the industry in media literacy, which the noble Baroness, Lady Kidron, dwelt on in her remarks. We welcome the programmes that the industry runs in partnership with online safety experts such as Parent Zone and Internet Matters and hope they continue to thrive, with the added benefit of Ofcom’s recently published evaluation toolkit. However, we believe that platforms can go further to empower and educate their users. That is why media literacy has been included in the Bill’s risk assessment duties, meaning that regulated services will have to consider measures to promote media literacy to their users as part of the risk assessment process. Additionally, through work delivered under its existing media literacy duty, Ofcom is developing a set of best-practice design principles for platform-based media literacy measures. That work will build an evidence base of the most effective measures that platforms can take to build their users’ media literacy.

In response to the noble Baroness’s question, I say: no, platforms will not be able to avoid putting in place protections for children by using media literacy campaigns. Ofcom would be able to use its enforcement powers if a platform was not achieving appropriate safety outcomes. There are a range of ways in which platforms can mitigate risks, of which media literacy is but one, and Ofcom would expect platforms to consider them all in their risk assessments.

Let me say a bit about the unintended consequences we fear might arise from these amendments. First, the resource demands to create a code of practice and then to regulate firms’ compliance with this type of broad duty would place an undue burden on the regulator. It is also unclear how the proposed duties in Amendments 52A, 91 and 91A would interact with Ofcom’s existing media literacy duty. There is a risk, we fear, that these parallel duties could be discharged in conflicting ways. Amendment 91A is open to broad interpretation by platforms and could enable them to fulfil the duty in a way that lacked real impact on users’ media literacy.

The amendment in the name of my noble friend Lord Holmes proposes a duty to promote awareness of financial deception and fraud. The Government are already taking significant action to protect people from online fraud, including through their new fraud strategy and other provisions in this Bill. I know that my noble friends Lord Camrose, Lord Sharpe of Epsom and Lady Penn met noble Lords to talk about that earlier this week. We believe that measures such as prompts for users before they complete financial transactions sit more logically with financial service providers than with services in scope of this Bill.

Amendment 52A proposes a duty on carriers of journalistic content to promote media literacy to their users. We do not want to risk requiring platforms to act as de facto press regulators, assessing the quality of news publishers’ content. That would not be compatible with our commitment to press freedom. Under its existing media literacy duty, Ofcom is delivering positive work to support people to discern high-quality information online. It is also collaborating with the biggest platforms to design best practice principles for platform-based media literacy measures. It intends to publish these principles this year and will encourage platforms to adopt them.

It is right that Ofcom is given time to understand the benefits of these approaches. The Secretary of State’s post-implementation review will allow the Government and Parliament to establish the effectiveness of Ofcom’s current approach and to reconsider the role of platforms in enhancing users’ media literacy, if appropriate. In the meantime, the Bill introduces new transparency-reporting and information-gathering powers to enhance Ofcom’s visibility of platforms’ delivery and evaluation of media literacy activities. We would not want to see amendments that would inadvertently dissuade platforms from delivering these activities in favour of less costly and less effective measures.

My noble friend Lord Holmes asked about the Online Media Literacy Strategy, published in July 2021, which set out the Government’s vision for improving media literacy in the country. Alongside the strategy, we have committed to publishing annual action plans each financial year until 2024-25, setting out how we will meet the ambition of the strategy. In April 2022 we published the Year 2 Action Plan, which included: extending the reach of media literacy education to those who are currently disengaged, in consultation with the media literacy task force, a body of 17 cross-sector experts; expanding our grant funding programme to provide nearly £2.5 million across two years for organisations delivering innovative media literacy activities; and commissioning research to improve our understanding of the challenges faced by the sector. We intend to publish the research later this year, for the benefit of civil society organisations, technology platforms and policymakers.

The noble Lord, Lord Knight, in his Amendment 186, would stipulate that Ofcom must levy fees on regulated firms sufficient to fund the work of third parties involved in supporting it to meet its existing media literacy duties. The Bill already allows Ofcom to levy fees sufficient to fund the annual costs of exercising its online safety functions. This includes its existing media literacy duty as far as it relates to services regulated by this Bill. As such, the Bill already ensures that these media literacy activities, including those that Ofcom chooses to deliver through third parties, can be funded through fees levied on industry.

I turn to Amendments 188, 235, 236, 237 and 238. The Government recognise the intent behind these amendments, which is to help improve the media literacy of the general public. Ofcom already has a statutory duty to promote media literacy with regard to the publication of anything by means of electronic media, including services in scope of the Bill. These amendments propose rather prescriptive objectives, either as part of a new duty for Ofcom or through updating its existing duty. They reflect current challenges in the sector but run the risk of becoming obsolete over time, preventing Ofcom from adapting its work in response to emerging issues.

Ofcom has demonstrated flexibility in its existing duty through its renewed Approach to Online Media Literacy, launched in 2021. This presented an expanded media literacy programme, enabling it to achieve almost all the objectives specified in this group. The Government note the progress that Ofcom has already achieved under its renewed approach in the annual plan it produced last month. The Online Safety Bill strengthens Ofcom’s functions relating to media literacy, including through Ofcom’s new transparency-reporting and information-gathering powers, which will give it enhanced oversight of industry activity by enabling it to require regulated services to share or publish information about the work that they are doing on media literacy.

The noble Baroness, Lady Prashar, asked about the view expressed by the Joint Committee on minimum standards for media literacy training. We agree with the intention behind that, but, because of the broad and varied nature of media literacy, we do not believe that introducing minimum standards is the most effective way of achieving that outcome. Instead, we are focusing efforts on improving the evaluation practices of media literacy initiatives to identify which ones are most effective and to encourage their delivery. Ofcom has undertaken extensive work to produce a comprehensive toolkit to support practitioners to deliver robust evaluations of their programmes. This was published in February this year and has been met with praise from practitioners, including those who received grant funding from the Government’s non-legislative media literacy work programme. The post-implementation review of Ofcom’s online safety regime, which covers its existing media literacy duty in so far as it relates to regulated services, will provide a reasonable point at which to establish the effectiveness of Ofcom’s new work programme, after giving it time to take effect.

Noble Lords talked about the national curriculum and media literacy in schools. Media literacy is indeed a crucial skill for everyone in the digital age. Key media literacy skills are already taught through a number of compulsory subjects in the national curriculum. Digital literacy is included in the computing national curriculum in England, which equips pupils with the knowledge, understanding and skills to use information and communication technology creatively and purposefully. I can reassure noble Lords that people such as Monica are being taught not about historic things like floppy disks but about emerging and present challenges; the computing curriculum ensures that pupils are taught how to design and program systems and accomplish goals such as collecting, analysing, evaluating and presenting data.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

Does the Minister know how many children are on computing courses?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I do not know, but I shall find out from the Department for Education and write. But those who are on them benefit from a curriculum that includes topics such as programming and algorithms, the responsible and safe use of technology, and other foundational knowledge that may support future study in fields such as artificial intelligence and data science.

This is not the only subject in which media literacy and critical thinking are taught. In citizenship education, pupils are taught about critical thinking and the proper functioning of a democracy. They learn to distinguish fact from opinion, as well as exploring freedom of speech and the role and responsibility of the media in informing and shaping public opinion. As Minister for Arts and Heritage, I will say a bit about history, English and other arts subjects, in which pupils learn to ask questions about information, think critically and weigh up arguments, all of which are important skills for media literacy, as well as more broadly.

18:15
In the debate on the report of the committee led by Lord Puttnam I mentioned the work of Art UK and its programme, the Superpower of Looking. There are many other excellent examples, such as the National Gallery’s Take One Picture scheme, which works with schools to encourage pupils to look at just one work of art from that fabulous collection in order to promote critical thinking and to look beyond what is immediately apparent. My department is working with the Department for Education on a cultural education plan to ensure that these sorts of initiatives are shared across all schools in the state sector. Additionally, the Department for Education published its updated Teaching Online Safety in Schools non-statutory guidance in January 2023, which provides schools with advice on how to teach children to stay safe online.
There are many ways outside the curriculum in which schoolchildren and young people benefit. I had the pleasure of being a judge for Debating Matters, as did the noble Baroness, Lady Bennett—though not in my case behind bars. Schemes such as this, along with debating clubs in schools, all add to the importance of critical thinking and debate.
Amendment 189 in the name of the noble Lord, Lord Knight, seeks to place a requirement on all public bodies to assist Ofcom in relation to its duties under the regime set out by the Bill. The regulator will need to co-operate with a variety of organisations. Ofcom has existing powers to enable this and, where appropriate and proportionate, we have used the Bill to strengthen them. The Bill’s information-gathering powers will allow Ofcom to request information from any person, including public bodies, who appears to have information required by it in order to exercise its online safety function. Placing this broad duty on all public bodies would not be proportionate or effective. It would create an undefined requirement on public bodies and give Ofcom a disproportionate amount of power.
The noble Lord’s amendment uses Ofsted as an example of a public body that would be required to co-operate with Ofcom under the proposed duty. Ofsted already has the power to advise and assist other public authorities, including Ofcom, under Section 149 of the Education and Inspections Act 2006.
I hope noble Lords have been reassured by the points I have set out and will understand why the Government are not able to accept these amendments. I will reflect on the wider remarks made in this debate. With that, I invite the noble Lord to withdraw his amendment.
Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I am grateful to all Members of the Committee for their contributions to a good debate. I was particularly happy to hear the noble Lord, Lord Clement-Jones, describe it as “inspiring”. There were some great speeches.

I could go on at some length about the educational element to this, but I will constrain myself. In the last year, 1.4% of secondary school pupils in this country did computer science at GCSE. It is a constant source of frustration that computer science is prayed in aid by the Department for Education as a line for Ministers to take in the algorithm they are given to use. However, I understand that the Minister just has to deliver the message.

The noble Baroness was worried about adding to the curriculum. Like the noble Baroness, Lady Bennett, I favour a wider-scale reform of the education system to make it much more fit for purpose, but I will not go on.

I was the Minister responsible for the Education and Inspections Act 2006. I would be interested in further updates as to how it is going. For example, does Ofcom ever go with Ofsted into schools and look properly at media literacy delivery? That is what I am trying to tease out with the amendment.

The comments in the speech by the noble Baroness, Lady Prashar, were significant. She pointed out the weaknesses in the strategy and the difference between the duty as set out in the 2003 Act and the duties we now need, and the pressing case for these duties to be updated as we take this Bill through this House.

The noble Baroness, Lady Fox, had some misgivings about adding adults, which I think were perfectly answered by the noble Baroness, Lady Kidron, in respect of her plea on behalf of young people to help educate parents and give them better media literacy, particularly around the overuse of phones. We have a digital code of conduct in our own house to do with no phones being allowed at mealtimes or in bedrooms by any of us. All of that plays to the mental health issues referred to by my noble friend Lord Davies, and the preventive health aspect referred to by the noble Lord, Lord Russell.

As ever, I am grateful to the Minister for the thorough and comprehensive way in which he answered all the amendments. However, ultimately, the media literacy levels of adults and children in this country are simply not good enough. The existing duties that he referred to, and the way in which he referred to them in his speaking notes, suggest a certain amount of complacency about that. The duties are not working and need to be updated; we need clarity as to who owns the problem of that lack of media literacy, and we are not getting that. This is our opportunity to address that and to set out clearly what the responsibilities are of the companies and the regulator, and how the two work together so that we address the problem. I urge the Minister to work with those of us concerned about this and come forward with an amendment that he is happy with at Report, so that we can update this duty. On that basis, I am happy to withdraw the amendment for now.

Amendment 52A withdrawn.
Clause 16: Duty about content reporting
Amendment 53
Moved by
53: Clause 16, page 18, line 10, at end insert—
“(3A) Content that constitutes a fraudulent advertisement within the meaning of section 33.”
Member’s explanatory statement
This amendment, and others in the name of Baroness Morgan, would extend the current provisions on transparency reporting, user reporting and user complaints to fraudulent advertisements.
Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Con)
- Hansard - - - Excerpts

My Lords, I shall speak to Amendments 53 to 55, and Amendments 86, 87, 162 to 173, and 175 to 181 in my name and that of the noble Lord, Lord Clement-Jones. I declare my relevant interests in this group of amendments as a non-executive director of the Financial Services Compensation Scheme and Santander UK, and chair of the Association of British Insurers—although, as we have heard, fraud is prevalent across all sectors, so we are all interested in these issues.

This debate follows on well from that on the last group of amendments, as we were just hearing. Fraud is now being discussed so widely in this House and in Parliament that there are three Bills before your Lordships’ House at the moment in which fraud is a very real issue. I am sure that there are others, but there are three major Bills—this one, the Economic Crime and Corporate Transparency Bill, and the Financial Services and Markets Bill.

These amendments seek to fill a noticeable gap in the Bill concerning fraudulent advertisements—a gap that can be easily remedied. The Minister has done a very good job so far with all groups that we have debated, batting away amendments, but I hope that he might just say, “Yes, I see the point of the amendment that you are putting forward, and I shall go away and think about it”. I will see what attitude and response we get at the end of the debate.

I had the great privilege, as I said yesterday when asking a question, of chairing this House’s 2022 inquiry into the Fraud Act 2006 and digital fraud. As we have heard, fraud is currently the fastest growing crime and is being facilitated by online platforms. Coincidentally, just today, UK Finance, the trade body for the UK banking industry, has published its fraud figures for 2022. It has conducted analysis on more than 59,000 authorised push payment fraud cases to show the sources of fraud. Authorised push payment is where the customer—the victim, unfortunately—transfers money to the fraudster and authorises that transfer but has often, or usually, been socially engineered into doing so. UK Finance is now asking where those frauds originate from, and its analysis shows that 78% of APP fraud cases originated online and accounted for 36% of losses, and 18% of fraud cases originated via telecommunications and accounted for 44% of losses.

I will leave to one side the fact that the Bill does not touch on emails and telecoms, and I shall focus today on fraudulent advertisements and fraud. I should say that I welcome the fact that the Government changed the legislation from the draft Bill when the Bill was presented to the House of Commons, so that fraudulent advertisements and fraud were caught more fully in the Bill than had originally been anticipated.

As we have heard, victims of fraud suffer not just financially but emotionally and mentally, with bouts of anxiety and depression. They report feeling “embarrassed or depressed” about being scammed. Many lose a significant amount of money in a way that severely impacts their lives and, in the worst cases, people have been known to take their own lives. In the case of things such as romance scams or investment scams, people’s trust in any communication that they subsequently receive is severely undermined. I thank all of those victims of fraud who gave evidence to our inquiry and have done so to other inquiries in this House and in the House of Commons.

Fraud is a pretty broad term, as we set out in the report, and we should be clear that this Bill covers fraud facilitated by user-generated content or via search results and fraudulent advertisements on the largest social media and search services. My noble friend the Minister spoke about the meeting held earlier this week between Members of this House and Ministers, and officials produced a helpful briefing note that makes it clear that the Bill covers such fraud. However, as I said, emails, SMS and MMS messages, and internet service providers—web hosting services—are not covered by the Bill. There remains very much a gap that victims, sadly, can fall through.

The point of the amendments in the group, and the reason I hope that the Minister can at least say yes to some of them, is that they are pushing in the direction that the Government want to go too. At the moment, the Bill appears to exclude fraudulent advertisements from several key duties that apply to other priority illegal content, thereby leaving consumers with less protection. In particular, the duties, or lack of them, around transparency reporting, user reporting and complaints in relation to fraudulent advertisements are concerning. It does not make any sense. That is why I hope that the Minister can explain the drafting. It could be argued that fraudulent advertising is already included in transparency reporting as defined in the Bill, but that is limited to a description of platforms’ actions and does not include obligations to provide information on the incidence of fraudulent advertisements or other key details, as is required for other types of illegal content.

Transparency reporting, as I suspect we will hear from a number of noble Lords, is essential for the regulator to see how prevalent fraudulent advertisements are on a platform’s service and whether that platform is successfully mitigating the advertisements. It remains essential, too, that users can easily report fraudulent content when they come across it and that there is a procedure that allows users to complain if platforms are failing in their duty to keep users safe.

I should point my noble friend to the Government’s fraud strategy published last week. Paragraph 86 states:

“We want to make it as simple as possible for users to report fraud they see online. This includes scam adverts, false celebrity endorsements and fake user profiles. In discussion with government, many of the largest tech companies have committed to making this process as seamless and consistent as possible. This means, regardless of what social media platform or internet site you are on, you should be able to find the ‘report’ button within a single click, and then able to select ‘report fraud or scams’.”


The Government are saying that they want user reporting to be as simple as possible. These amendments suggest ways in which we can make user reporting as simple as possible as regards fraudulent advertisers.

The amendments address the gap in the Bill’s current drafting by inserting fraudulent advertising alongside other illegal content duties for social media reporting in Clause 16, complaints in Clause 17 and the equivalent clauses for search engines in Clauses 26 and 27. The amendments add fraudulent advertising alongside other illegal content into the description of the transparency reporting requirements in Schedule 8. Without these amendments, the regulator will struggle to understand the extent of the problem of fraudulent advertisements and platforms will probably fail to prevent this harmful content being posted.

This will, I hope, be a short debate, and I look forward to hearing what my noble friend the Minister has to say on this point. I beg to move.

18:30
Lord Lucas Portrait Lord Lucas (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I also have a pair of amendments in this group. I am patron of a charity called JobsAware, which specialises in dealing with fraudulent job advertisements. It is an excellent example of collaboration between government and industry in dealing with a problem such as this. Going forward, though, they will be much more effective if there is a decent flow of information and if this Bill provides the mechanism for that. I would be very grateful if my noble friend would agree to a meeting, between Committee and Report, to discuss how that might best be achieved within the construct of this Bill.

It is not just the authorities who are able to deter these sort of things from happening. If there is knowledge spread through reputable networks about who is doing these things, it becomes much easier for other people to stop them happening. At the moment, the experience in using the internet must bear some similarity to walking down a Victorian street in London with your purse open. It really is all our responsibility to try to do something about this, since we now live so much of our life online. I very much look forward to my noble friend’s response.

Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I had the great privilege of serving as a member of this House’s Fraud Act 2006 and Digital Fraud Committee under the excellent chairing of the noble Baroness, Lady Morgan. She has already told us of the ghastly effects that fraud has on individuals and indeed its adverse effects on businesses. We heard really dramatic statistics, such as when Action Fraud told us that 80% of fraud is cyber-enabled.

Many of us here will have been victims of fraud—I have been a victim—or know people who have been victims of fraud. I was therefore very pleased when the Government introduced the fraudulent advertising provisions into the Bill, which will go some way to reducing the prevalence of online fraud. It seems to me that this issue requires special attention, which is what these amendments would provide.

We heard in our inquiry about the problems that category 1 companies had in taking down fraudulent advertisements quickly. Philip Milton, the public policy manager at Meta, told us that it takes between 24 and 48 hours to review possibly harmful content after it has been flagged to the company. He recognised that, due to the deceptive nature of fraudulent advertising, Meta’s systems do not always recognise that advertising is fraudulent and, therefore, take-down rates would be variable. That is one of the most sophisticated tech platforms—if it has difficulties, just imagine the difficulty that other companies have in both recognising and taking down fraudulent advertising.

Again and again, the Bill recognises the difficulties that platforms have in systematising the protections provided in the Bill. Fraud has an ever-changing nature and is massively increasing—particularly so for fraudulent advertising. It is absolutely essential that the highest possible levels of transparency are placed upon the tech companies to report their response to fraudulent advertising. Both Ofcom and users need to be assured that not only do the companies have the most effective reporting systems but, just as importantly, they have the most effective transparency to check how well they are performing.

To do this, the obligations on platforms must go beyond the transparency reporting requirements in the Bill. These amendments would ensure that they include obligations to provide information on the incidence of fraudulent advertising, in line with other types of priority illegal content. These increased obligations are part of checking the effectiveness of the Bill when it comes to being implemented.

The noble Baroness, Lady Stowell, told us on the fifth day of Committee, when talking about the risk-assessment amendments she had tabled:

“They are about ensuring transparency to give all users confidence”.—[Official Report, 9/5/23; col. 1755.]


Across the Bill, noble Lords have repeatedly stated that there needs to be a range of ways to judge how effectively the protections provided are working. I suggest to noble Lords that these amendments are important attempts to help make the Bill more accountable and provide the data to future-proof the harms it is trying to deal with. As we said in the committee report:

“Without sufficient futureproofing, technology will most likely continue to create new opportunities for fraudsters to target victims”.


I ask the Minister to at least look at some of these amendments favourably.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I shall say very briefly in support of these amendments that in 2017, the 5Rights Foundation, of which I am the chair, published the Digital Childhood report, which in a way was the thing that put the organisation on the map. The report looked at the evolving capacity of children through childhood, what technology they were using, what happened to them and what the impact was. We are about to release the report again, in an updated version, and one of the things that is most striking is the introduction of fraud into children’s lives. At the point at which they are evolving into autonomous people, when they want to buy presents for their friends and parents on their own, they are experiencing what the noble Baroness, Lady Morgan, expressed as embarrassment, loss of trust and a sense of deserting confidence—I think that is probably the phrase. So I just want to put on the record that this is a problem for children also.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, this has been an interesting short debate and the noble Baroness, Lady Morgan, made a very simple proposition. I am very grateful to her for introducing this so clearly and comprehensively. Of course, it is all about the way that platforms will identify illegal, fraudulent advertising and attempt to align it with other user-to-user content in terms of transparency reporting, user reporting and user complaints. It is a very straightforward proposition.

First of all, however, we should thank the Government for acceding to what the Joint Committee suggested, which was that fraudulent advertising should be brought within the scope of the Bill. But, as ever, we want more. That is what it is all about and it is a very straightforward proposition which I very much hope the Minister will accede to.

We have heard from around the Committee about the growing problem and I will be very interested to read the report that the noble Baroness, Lady Kidron, was talking about, in terms of the introduction of fraud into children’s lives—that is really important. The noble Baroness, Lady Morgan, mentioned some of the statistics from Clean Up the Internet, Action Fraud and so on, as did the noble Viscount, Lord Colville. And, of course, it is now digital. Some 80% of fraud, as he said, is cyber-enabled, and 23% of all reported frauds are initiated on social media—so this is bang in the area of the Bill.

It has been very interesting to see how some of the trade organisations, the ABI and others, have talked about the impact of fraud, including digital fraud. The ABI said:

“Consumers’ confidence is being eroded by the ongoing proliferation of online financial scams, including those predicated on impersonation of financial service providers and facilitated through online advertising. Both the insurance and long-term savings sectors are impacted by financial scams perpetrated via online paid-for advertisements, which can deprive vulnerable consumers of their life savings and leave deep emotional scars”.


So, this is very much a cross-industry concern and very visible to the insurance industry and no doubt to other sectors as well.

I congratulate the noble Baroness, Lady Morgan, on her chairing of the fraud committee and on the way it came to its conclusions and scrutinised the Bill. Paragraphs 559, 560 and 561 all set out where the Bill needs to be aligned with the other content that it covers. As she described, there are two areas where the Bill can be improved. If these gaps are not cured, they will substantially undermine its ability to tackle online fraud effectively.

This has the backing of Which? As the Minister will notice, it is very much a cross-industry and consumer body set of amendments, supporting transparency reporting and making sure that those platforms with more fraudulent advertising make proportionately larger changes to their systems. That is why there is transparency reporting for all illegal harms that platforms are obliged to prevent. There is no reason why advertising should be exempt. On user reporting and complaints, it is currently unclear whether this applies only to illegal user-generated content and unpaid search content or if it also applies to illegal fraudulent advertisements. At the very least, I hope the Minister will clarify that today.

Elsewhere, the Bill requires platforms to allow users to complain if the platform fails to comply with its duties to protect users from illegal content and with regard to the content-reporting process. I very much hope the Minister will accede to including that as well.

Some very simple requests are being made in this group. I very much hope that the Minister will take them on board.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

It is the simple requests that always seem to evade the easy solutions. I will not go back over the very good introductory speech from the noble Baroness, which said it all; the figures are appalling and the range of fraud-inspired criminality is extraordinary. It plays back to a point we have been hammering today: if this Bill is about anything, it is the way the internet amplifies that which would be unpleasant anyway but will now reach epidemic proportions.

I wonder whether that is the clue to the problem the noble Baroness was commenting on—I think more in hope than in having any way to resolve it. It is great news that three Bills are doing all the stuff we want. We have talked a bit about three-legged stools; this is another one that might crash over. If we are not careful, it will slip through the cracks. I am mixing my metaphors again.

If the Minister would not mind a bit of advice, it seems to me that this Bill could do certain things and do them well. It should not hold back and wait for the others to catch up or do things differently. The noble Baroness made the point about the extraordinarily difficult to understand gap, in that what is happening to priority illegal content elsewhere in the Bill does not apply to this, even though it is clearly illegal activity. I understand that there is a logical line that it is not quite the same thing—that the Bill is primarily about certain restricted types of activity on social media and not the generality of fraud—but surely the scale of the problem and our difficulty in cracking down on it, by whatever routes and whatever size of stool we choose, suggest that we should do what we can in this Bill and do it hard, deeply and properly.

Secondly, we have amendments later in Committee on the role of the regulators and the possibility recommended by the Communications and Digital Committee that we should seek statutory backing for regulation in this area. Here is a classic example of more than two regulators working to achieve the same end that will probably bump into each other on the way. There is no doubt that the FCA has primary responsibility in this area, but the reality is that the damage is being done by the amplification effect within the social media companies.

18:45
It may or may not be right, in terms of what we are doing, to restrict what the Bill does to those aspects of user-to-user content and other areas. If something is illegal, surely the Bill should be quite clear that it should not be happening, and Ofcom should have the necessary powers, however we frame them, to make sure we follow this through to its logical conclusion. The powers most needed are for Ofcom to take the lead, if required, in relation to the other regulators who have an impact on this world (can we be sure that is in the Bill and can be exercised?) and to make sure that the transparency, user reporting and complaints issues that are so vital to cracking this in the medium term get sorted. I leave that with the Minister to take forward.
Lord Parkinson of Whitley Bay (Con)

I am grateful to my noble friends for their amendments in this group, and for the useful debate that we have had. I am grateful also to my noble friend Lady Morgan of Cotes and the members of her committee who have looked at fraud, and to the Joint Committee which scrutinised the Bill in its earlier form, for their recommendations on strengthening the way it tackles fraud online. As the noble Lord, Lord Clement-Jones, said, following those recommendations, the Government have brought in new measures to strengthen the Bill’s provisions to tackle fraudulent activity on in-scope services. I am glad he was somewhat satisfied by that.

All in-scope services will be required to take proactive action to tackle fraud facilitated through user-generated content. In addition, the largest and most popular platforms have a stand-alone duty to prevent fraudulent paid-for advertising appearing on their services. This represents a major step forward in ensuring that internet users are protected from scams, which have serious financial and psychological impacts, as noble Lords noted in our debate. Fully addressing the challenges of paid-for advertising is a wider task than is possible through the Bill alone. Advertising involves a broad range of actors not covered by the current legislative framework, such as advertising intermediaries. I am sympathetic to these concerns and the Government are taking action in this area. Through the online advertising programme, we will deliver a holistic review of the regulatory framework in relation to online advertising. The Government consulted on this work last year and aim to publish a response in due course. As the noble Lord, Lord Stevenson, and others noted, there are a number of Bills which look at this work. Earlier this week, there was a meeting hosted by my noble friends Lord Camrose, Lord Sharpe of Epsom and Lady Penn to try to avoid cracks opening up between the Bills. I am grateful to my noble friend Lady Morgan for attending; I hope it was a useful discussion.

I turn to the amendments tabled by my noble friend. The existing duties on user reporting and user complaints have been designed for user-generated content and search content and are not easily applicable to paid-for advertising. The duties on reporting and complaints mechanisms require platforms to take action in relation to individual complaints, but many in-scope services do not have control over the paid-for advertising on their services. These amendments would therefore be difficult for many in-scope services to operate and would create a substantial burden for small businesses. I assure her and other noble Lords that the larger services, which have strong levers over paid-for advertising, will have to ensure that they have processes in place to enable users to report fraudulent advertising.

On transparency reporting, let me assure my noble friend and others that Ofcom can already require information about how companies comply with their fraudulent advertising duties through transparency reports. In addition, Ofcom will have the power to gather any information it requires for the purpose of exercising its online safety functions. These powers are extensive and will allow Ofcom to assess compliance with the fraudulent advertising duties.

The noble Viscount, Lord Colville of Culross, asked about the difficulty of identifying fraudulent advertising. Clauses 170 and 171 give guidance, and place a duty on Ofcom, regarding providers making judgments about content, including fraudulent advertising. There will also be a code of practice on fraudulent advertising to provide further guidance on mechanisms to deal with this important issue.

My noble friend Lord Lucas’s Amendments 94 and 95 aim to require services to report information relating to fraudulent advertising to UK authorities. I am confident that the Bill’s duties will reduce the prevalence of online fraud, reducing the need for post hoc reporting in this way. If fraud does appear online, there are adequate systems in place for internet users to report this to the police.

People can report a scam to Action Fraud, the national reporting service for fraud and cybercrime. Reports submitted to Action Fraud are considered by the National Fraud Intelligence Bureau and can assist a police investigation. Additionally, the Advertising Standards Authority has a reporting service for reporting online scam adverts, and those reports are automatically shared with the National Cyber Security Centre.

The online advertising programme, which I mentioned earlier, builds on the Bill’s fraudulent advertising duty and looks at the wider online advertising system. That programme is considering measures to increase accountability and transparency across the supply chain, including proposals for all parties to enhance record keeping and information sharing.

My noble friend Lord Lucas was keen to meet to speak further. I will pass that request to my noble friend Lord Sharpe of Epsom, who I think would be the better person to talk to in relation to this on behalf of the Home Office—but I am sure that one of us will be very happy to talk with him.

I look forward to discussing this issue in more detail with my noble friend Lady Morgan and others between now and Report, but I hope that this provides sufficient reassurance on the work that the Government are doing in this Bill and in other ways. I invite my noble friends not to press their amendments.

Lord Lucas (Con)

My Lords, I am grateful to my noble friend for replying to my amendments and for his offer of a meeting, which I will certainly accept when it is arranged.

The Government are missing some opportunities here. I do not know whether he has tried reporting something to Action Fraud, but if you have not lost money you cannot do it; you need to have been gulled and lost money for any of the government systems to take you seriously. While you can report something to the other services, they do not tell you what they have done with it. There is no feedback or mechanism for encouraging and rewarding you for reporting—it is a deficient system.

When it comes to job adverts, by and large they go through job boards. There is a collection of people out there who are not direct internet providers but who have leverage, and a flow of data to them can make a huge difference; there may also be other areas. It is that flow of data that enables job scams to be clamped down on, and that is what the Bill needs to improve. Although the industry as a whole is willing, there just is not the impetus at the moment to make prevention nearly as good as it should be.

Baroness Morgan of Cotes (Con)

My Lords, I thank my noble friend the Minister very much indeed for his response. Although this has been a short debate, it is a good example of us all just trying to get the Bill to work as well as possible—in this case to protect consumers, but there will be other examples as well.

My noble friend said that the larger services in particular are the ones that are going to have to deal with fraudulent advertisements, so I think the issue about the burdens of user reporting does not apply. I remind him of the paragraph I read out from the Fraud Strategy, where the Government themselves say that they want to make the reporting of fraud online as easy as possible. I will read the record of what he said very carefully, but it might be helpful after that to have a further conversation, or perhaps for him to write, to reassure those outside this Committee who are looking for confirmation about how transparency reporting, user reporting and complaints will actually apply in relation to fraudulent advertisements, so that this can work as well as possible.

On that basis, I will withdraw my amendment for today, but I think we would all be grateful for further discussion and clarification so that this part of the Bill works as well as possible to protect people from any kind of fraudulent advertisement.

Amendment 53 withdrawn.
Clause 16 agreed.
Clause 17: Duties about complaints procedures
Amendments 53A to 55 not moved.
Clause 17 agreed.
House resumed.
House adjourned at 6.56 pm.