Online Safety Bill Debate
Baroness Kidron (Crossbench - Life peer)
Lords Chamber
I thank the noble Lord for his intervention. He has made me think that a particular area where this may be of grave concern is cosmetic procedures, which I think we debated during the passage of the Health and Care Act. These things are all interrelated, and it is important that we see them in an interrelated way as part of what is now the health system.
My Lords, I will speak to a number of amendments in this group. I want to make the point that misinformation and disinformation were probably the issue we struggled with most in the pre-legislative committee. We recognised the extraordinary harm they did, but also—as the noble Baroness, Lady Fox, said—that there is no one great truth. However, algorithmic spread and the drip, drip, drip of material that is not based on any search criteria or expression of an opinion but simply gives you more of the same, particularly the most shocking, move very marginal views into the mainstream.
I am concerned that our debates over the last five days have concentrated so much on content, and that the freedom we seek does not take enough account of the way in which companies currently exercise control over the information we see. Correlations such as “Men who like barbecues are also susceptible to conspiracy theories” are then exploited to spread toxic theories that end in real-world harm or political tricks that show, for example, the Democrats as a paedophile group. Only last week I saw a series of pictures, presented as “evidence”, of President Biden caught in a compromising situation that gave truth to that lie. As Maria Ressa, the Nobel Peace Prize winner for her contribution to the freedom of expression, said in her acceptance speech:
“Tech sucked up our personal experiences and data, organized it with artificial intelligence, manipulated us with it, and created behavior at a scale that brought out the worst in humanity”.
That is the background to this set of amendments that we must take seriously.
As the noble Lord, Lord Bethell, said, Amendment 52 will ensure that platforms undertake a health misinformation risk assessment and provide a clear policy on dealing with harmful, false and misleading information. I put it to the Committee that, without this requirement, we will keep the status quo in which clicks are king, not health information.
It is a particular pleasure to support the noble Lord, Lord Moylan, on his Amendments 59 and 107. Like him, I am instinctively against taking material down. There are content-neutral ways of marking or questioning material, offering alternatives and signposting to diverse sources—not only true but diverse. These can break this toxic drip feed for long enough for people to think before they share, post and make personal decisions about the health information that they are receiving.
I am not incredibly thrilled by a committee for every occasion, but since the Bill is silent on the issue of misinformation and disinformation—which clearly will be supercharged by the rise of large language models—it would be good to give a formal role to this advisory committee, so that it can make a meaningful and formal contribution to Ofcom as it develops not only this code of conduct but all codes of conduct.
Likewise, I am very supportive of Amendment 222, which seeks independence for the chair of the advisory body. I have seen at first hand how a combination of regulatory capture and a very litigious sector with deep pockets slows down progress and transparency. While the independence of the chair should be a given, our collective lived experience would suggest otherwise. This amendment would make that requirement clear.
Finally, and in a way most importantly, Amendment 224 would allow Ofcom to consider after the fact whether the code of conduct is necessary. This strikes a balance between adding to its current workload, which we are trying not to do, and tying one hand behind its back in the future. I would be grateful to hear from the Minister why we would not give Ofcom this option as a reasonable piece of future-proofing, given that this issue will be ever more important as AI creates layers of misinformation and disinformation at scale.
My Lords, I support Amendment 52, tabled by my noble friend Lady Merron. This is an important issue which must be addressed in the Bill if we are to make real progress in making the internet a safer space, not just for children but for vulnerable adults.
We have the opportunity to learn lessons from the pandemic, where misinformation had a devastating impact, spreading rapidly online like the virus and threatening to undermine the vaccine rollout. If the Government had kept their earlier promise to include protection from harmful false health content in their indicative list of harmful content that companies would have been required to address under the now removed adult safety duties, these amendments would not be necessary.
It is naive to think that platforms will behave responsibly. Currently, they are left to their own devices in how they tackle health misinformation, without appropriate regulatory oversight. They can remove it at scale or leave it completely unchecked, as illustrated by Twitter’s decision to stop enforcing its Covid-19 misinformation policies, as other noble Lords have pointed out.
It is not a question of maintaining free speech, as some might argue. It was the most vulnerable groups who suffered from the spread of misinformation online—pregnant women and the BAME community, who had higher illness rates. Studies have shown that, proportionately, more of them died, not just because they were front-line workers but because of rumours spread in the community which resulted in vaccine hesitancy, with devastating consequences. As other noble Lords have pointed out, in 2021 the Royal College of Obstetricians and Gynaecologists found that only 42% of women who had been offered the vaccine accepted it, and in October that year one in five of the most critically ill Covid patients were unvaccinated pregnant women. That is a heartbreaking statistic.
Unfortunately, it is not just vaccine fears that are spread on the internet. Other harmful theories can affect patients with cancer, mental health issues and sexual health issues, and, most worryingly, can affect children’s health. Rumours and misinformation play on the minds of the most vulnerable. The Government have a duty to protect people, and by accepting this amendment they would go some way to addressing this.
Platforms must undertake a health misinformation risk assessment and have a clear policy on dealing with harmful, false and misleading health information in their terms of service. They have the money and the expertise to do this, and Parliament must insist. As my noble friend Lady Merron said, I do not think that the Minister can say that the false communications offence in Clause 160 will address the problem, as it covers only a user sending a knowingly false communication with the intention of causing harm. The charity Full Fact has stated that this offence will exclude most health misinformation that it monitors online.
My Lords, this debate has demonstrated the diversity of opinion regarding misinformation and disinformation—as the noble Lord said, the Joint Committee gave a lot of thought to this issue—as well as the difficulty of finding the truth of very complex issues while not shutting down legitimate debate. It is therefore important that we legislate in a way that takes a balanced approach to tackling this, keeping people safe online while protecting freedom of expression.
The Government take misinformation and disinformation very seriously. From Covid-19 to Russia’s use of disinformation as a tool in its illegal invasion of Ukraine, it is a pervasive threat, and I pay tribute to the work of my noble friend Lord Bethell and his colleagues in the Department of Health and Social Care during the pandemic to counter the cynical and exploitative forces that sought to undermine the heroic effort to get people vaccinated and to escape from the clutches of Covid-19.
We recognise that misinformation and disinformation come in many forms, and the Bill reflects this. Its focus is rightly on tackling the most egregious, illegal forms of misinformation and disinformation, such as content which amounts to the foreign interference offence or which is harmful to children—for instance, that which intersects with named categories of primary priority or priority content.
That is not the only way in which the Bill seeks to tackle it, however. The new terms of service duties for category 1 services will hold companies to account over how they say they treat misinformation and disinformation on their services. However, the Government are not in the business of telling companies what legal content they can and cannot allow online, and the Bill should not and will not prevent adults accessing legal content. In addition, the Bill will establish an advisory committee on misinformation and disinformation to provide advice to Ofcom on how they should be tackled online. Ofcom will be given the tools to understand how effectively misinformation and disinformation are being addressed by platforms through transparency reports and information-gathering powers.
Amendment 52 from the noble Baroness, Lady Merron, seeks to introduce a new duty on platforms in relation to health misinformation and disinformation for adult users, while Amendments 59 and 107 from my noble friend Lord Moylan aim to introduce new proportionality duties for platforms tackling misinformation and disinformation. The Bill already addresses the most egregious types of misinformation and disinformation in a proportionate way that respects freedom of expression by focusing on misinformation and disinformation that are illegal or harmful to children.
I am curious as to what the Bill says about misinformation and disinformation in relation to children. My understanding of primary priority and priority harms is that they concern issues such as self-harm and pornography, but do they say anything specific about misinformation of the kind we have been discussing and whether children will be protected from it?
I am sorry—I am not sure I follow the noble Baroness’s question.
Twice so far in his reply, the Minister has said that this measure will protect children from misinformation and disinformation. I was just curious because I have not seen any sight of that, either in discussions or in the Bill. I was making a distinction regarding harmful content that we know the shape of—for example, pornography and self-harm, which are not, in themselves, misinformation or disinformation of the kind we are discussing now. It is news to me that children are going to be protected from this, and I am delighted, but I was just checking.
Yes, that is what the measure does—for instance, where it intersects with the named categories of primary priority or priority content in the Bill, although that is not the only way the Bill does it. This will be covered by non-designated content that is harmful to children. As we have said, we will bring forward amendments on Report—which is perhaps why the noble Baroness has not seen them in the material in front of us—regarding material that is harmful to children, and they will provide further detail and clarity.
Returning to the advisory committee that the Bill sets up and the amendments from the noble Baroness, Lady Merron, and my noble friend Lord Moylan, all regulated service providers will be forced to take action against illegal misinformation and disinformation in scope of the Bill. That includes the new false communication offences in the Bill that will capture communications where the sender knows the information to be false but sends it intending to cause harm—for example, hoax cures for a virus such as Covid-19. The noble Baroness is right to say that that is a slightly different approach from the one taken in her amendment, but we think it an appropriate and proportionate response to tackling damaging and illegal misinformation and disinformation. If a platform is likely to be accessed by children, it will have to protect them from encountering misinformation and disinformation content that meets the Bill’s threshold for content that is harmful to children. Again, that is an appropriate and proportionate response.
Turning to the points made by my noble friend Lord Moylan and the noble Baroness, Lady Fox, services will also need to have particular regard to freedom of expression when complying with their safety duties. Ofcom will be required to set out steps that providers can take when complying with their safety duties in the codes of practice, including what is proportionate for different providers and how freedom of expression can be protected.
My Lords, it is a pleasure to follow the noble Baroness, Lady Prashar, and I join her in thanking the noble Lord, Lord Knight, for introducing this group very clearly.
In taking part in this debate, I declare a joint interest with the noble Baroness, Lady Fox, in that I was for a number of years a judge in the Debating Matters events to which she referred. Indeed, the noble Baroness was responsible for me ending up in Birmingham jail, during the time that such a debate was conducted with the inmates of Birmingham jail. We have a common interest there.
I want to pick up a couple of additional points. Before I joined your Lordships’ Committee today I was involved in the final stages of the Committee debate on the economic crime Bill, where the noble Lord, Lord Sharpe of Epsom, provided a powerful argument—probably unintentionally—for the amendments we are debating here now. We were talking, as we have at great length in the economic crime Bill, about the issue of fraud. As the noble Lord, Lord Holmes of Richmond, highlighted, in the context of online harms, fraud is a huge aspect of people’s lives today and one that has been under-covered in this Committee, although it has very much been picked up in the economic crime Bill Committee. As we were talking about online fraud, the noble Lord, Lord Sharpe of Epsom, said that consumers have to be “appropriately savvy”. I think that is a description of the need for education and critical thinking online, equipping people with the tools to be, as he said, appropriately savvy when facing the risks of fraud and scams, and all the other risks that people face online.
I have attached my name to two amendments here: Amendment 91, which concerns the providers of category 1 and 2A services having a duty, and Amendment 236, which concerns an Ofcom duty. This joins together two aspects. The providers are making money out of the services they provide, which gives them a duty to make some contribution to combatting the potential harms that their services present to people. Ofcom as a regulator obviously has a role. I think it was the noble Lord, Lord Knight, who said that the education system also has a role, and there is some reference in here to Ofsted having a role.
What we need is a cross-society, cross-systems approach. This is where I also make the point that we need to think outside the scope of the Bill—it is part of the whole package—about how the education system works, because media literacy is not a stand-alone thing that you can separate out from the issues of critical thinking more broadly. We need to think about our education system, which far too often, in schools in particular, gets pupils to learn and regurgitate a whole set of facts and then rewards them for that. We need to think about how our education system prepares children for the modern online world.
There is a great deal we can learn from the example—often cited but worth referring to—of Finland, which by various tests has been ranked as the country most resistant to fake news. A clearly built-in idea of questioning, scrutiny and challenge is encouraged among pupils, starting from the age of seven. That is something we need to transform our education system to achieve. However, of course, many people using the internet now are not part of our education system, so this needs to be across our society. A focus on the responsibilities of Ofcom and the providers has to be in the Bill.
My Lords, over the last decade, I have been in scores of schools, run dozens of workshops and spoken to literally thousands of children and young people. A lot of what I pass off as my own wisdom in this Chamber is, indeed, their wisdom. I have a couple of points, and I speak really from the perspective of children under 18 with regard to these amendments, which I fully support.
Media literacy—or digital literacy, as it is sometimes called—is not the same as e-safety. E-safety regimes concentrate on the behaviour of users. Very often, children say that what they learn in those lessons is focused on adult anxieties about predators and bullies, and when something goes wrong, they feel that they are to blame. It puts the responsibility on children. This response, which I have heard hundreds of times, normally comes up after a workshop in which we have discussed reward loops, privacy, algorithmic bias, profiling or—my own favourite—a game which reveals what is buried in terms and conditions; for example, that a company has a right to record the sound of a device or share their data with more than a thousand other companies. When young people understand the pressures that they are under and which are designed into the system, they feel much better about themselves and rather less enamoured of the services they are using. It is my experience that they then go on to make better choices for themselves.
Secondly, we have outsourced much of digital literacy to companies such as Google and Meta. They too concentrate on user behaviour, rather than looking at their own extractive policies focused on engagement and time spent. With many schools strapped for cash and expertise, this teaching is widespread. However, when I went to a Google-run assembly, children aged nine were being taught about features available only on services for those aged over 13—and nowhere was there a mention of age limits and why they are important. It cannot be right that the companies are grooming children towards their services without taking full responsibility for literacy, if that is the literacy that children are being given in school.
Thirdly, as the Government’s own 2021 media literacy strategy set out, good media literacy is one line of defence from harm. It could make a crucial difference in people making informed and safe decisions online and engaging in a more positive online debate, at the same time as understanding that online actions have consequences offline.
However, while digital literacy and, in particular, critical thinking are fundamental to a contemporary education and should be available throughout school and far beyond, they must not be used as a way of putting responsibility on the user for the company’s design decisions. I am specifically concerned that, in the risk-assessment process, digital literacy is one of the ways that a company can say it has mitigated a potential risk or harm. I should like to hear from the Minister that it is an additional responsibility and not a substitute for the company’s own responsibility.
Finally, over all these years I have always asked at the end of the session what the young people care about the most. The second most important thing is that the system should be less addictive—it should have less addiction built into it. Again, I point the Committee in the direction of the safety-by-design amendments in the name of my noble friend Lord Russell that try to get to the crux of that. They are not very exciting amendments in this debate but they get to the heart of it. However, the thing the young people most often say is, “Could you do something to get my parents to put down their phones?” I therefore ask the Minister whether he can slip something into the Bill, and indeed ask the noble Lord, Lord Grade, whether that could emerge somewhere in the guidance. That is what young people want.
My Lords, I strongly support the amendments in the name of my noble friend Lord Knight and others in this group.
We cannot entirely contain harmful, misleading and dangerous content on the internet, no matter how much we strengthen the Bill. Therefore, it is imperative that we put a new duty on category 1 and category 2A services to require them to put in place measures to promote the media literacy of users so that they can use the service safely.
I know that Ofcom takes the issue of media literacy seriously, but it is regrettable that the Government have dropped their proposal for a new media literacy duty for Ofcom. So far, I see no evidence that the platforms take media literacy seriously, so they need to be made to understand that they have corporate social responsibilities towards their clients.
Good media literacy is the first line of defence from bad information and the kind of misinformation we have discussed in earlier groups. Schools are trying to prepare their pupils to understand that the internet can peddle falsehoods as well as useful facts, but they need support, as the noble Baroness, Lady Kidron, just said. We all need to increase our media literacy, especially with the increasing use of artificial intelligence, as it can make the difference between decisions based on sound evidence and decisions based on poorly informed opinions that can harm health and well-being, social cohesion and democracy.
In 2022, Ofcom found that a third of internet users are unaware of the potential for inaccurate or biased information online, and that 61% of social media users who say they are confident in judging whether online content is true or false actually lack the skills to do so, as my noble friend Lord Knight has pointed out.
Amendment 91 would mean that platforms have to instigate measures to give users an awareness and understanding of the nature and characteristics of the content that may be on the service, its potential impact and how platforms operate. That is a sensible and practical request that is not beyond the ability of companies to provide, and it will be to everyone’s benefit.
My Lords, this has been a good debate. I am glad that a number of noble Lords mentioned Lord Puttnam and the committee that he chaired for your Lordships’ House on democracy and digital technologies. I responded to the debate that we had on that; sadly, it was after he had already retired from your Lordships’ House, but he participated from the steps of the Throne. I am mindful of that report and the lessons learned in it in the context of the debate that we have had today.
We recognise the intent behind the amendments in this group to strengthen the UK’s approach to media literacy in so far as it relates to services that will be regulated by the Bill. Ofcom has a broad duty to promote media literacy under the Communications Act 2003. That is an important responsibility for Ofcom, and it is right that the regulator is able to adapt its approach to support people in meeting the evolving challenges of the digital age.
Amendments 52A and 91 from the noble Lord, Lord Knight, and Amendment 91A from the noble Lord, Lord Holmes of Richmond, seek to introduce duties on in-scope services, requiring them to put in place measures that promote users’ media literacy, while Amendment 98 tabled by the noble Lord, Lord Knight, would require Ofcom to issue a code of practice in relation to the new duty proposed in his Amendment 91. While we agree that the industry has a role to play in promoting media literacy, the Government believe that these amendments could lead to unintended, negative consequences.
I shall address the role of the industry and media literacy, which the noble Baroness, Lady Kidron, dwelt on in her remarks. We welcome the programmes that it runs in partnership with online safety experts such as Parent Zone and Internet Matters and hope they continue to thrive, with the added benefit of Ofcom’s recently published evaluation toolkit. However, we believe that platforms can go further to empower and educate their users. That is why media literacy has been included in the Bill’s risk assessment duties, meaning that regulated services will have to consider measures to promote media literacy to their users as part of the risk assessment process. Additionally, through work delivered under its existing media literacy duty, Ofcom is developing a set of best-practice design principles for platform-based media literacy measures. That work will build an evidence base of the most effective measures that platforms can take to build their users’ media literacy.
In response to the noble Baroness’s question, I say: no, platforms will not be able to avoid putting in place protections for children by using media literacy campaigns. Ofcom would be able to use its enforcement powers if a platform was not achieving appropriate safety outcomes. There are a range of ways in which platforms can mitigate risks, of which media literacy is but one, and Ofcom would expect platforms to consider them all in their risk assessments.
Let me say a bit about the unintended consequences we fear might arise from these amendments. First, the resource demands of creating a code of practice and then regulating firms’ compliance with this type of broad duty would place an undue burden on the regulator. It is also unclear how the proposed duties in Amendments 52A, 91 and 91A would interact with Ofcom’s existing media literacy duty. There is a risk, we fear, that these parallel duties could be discharged in conflicting ways. Amendment 91A is open to broad interpretation by platforms and could enable them to fulfil the duty in a way that lacked real impact on users’ media literacy.
The amendment in the name of my noble friend Lord Holmes proposes a duty to promote awareness of financial deception and fraud. The Government are already taking significant action to protect people from online fraud, including through their new fraud strategy and other provisions in this Bill. I know that my noble friends Lord Camrose, Lord Sharpe of Epsom and Lady Penn met noble Lords to talk about that earlier this week. We believe that measures such as prompts for users before they complete financial transactions sit more logically with financial service providers than with services in scope of this Bill.
Amendment 52A proposes a duty on carriers of journalistic content to promote media literacy to their users. We do not want to risk requiring platforms to act as de facto press regulators, assessing the quality of news publishers’ content. That would not be compatible with our commitment to press freedom. Under its existing media literacy duty, Ofcom is delivering positive work to support people to discern high-quality information online. It is also collaborating with the biggest platforms to design best practice principles for platform-based media literacy measures. It intends to publish these principles this year and will encourage platforms to adopt them.
It is right that Ofcom is given time to understand the benefits of these approaches. The Secretary of State’s post-implementation review will allow the Government and Parliament to establish the effectiveness of Ofcom’s current approach and to reconsider the role of platforms in enhancing users’ media literacy, if appropriate. In the meantime, the Bill introduces new transparency-reporting and information-gathering powers to enhance Ofcom’s visibility of platforms’ delivery and evaluation of media literacy activities. We would not want to see amendments that would inadvertently dissuade platforms from delivering these activities in favour of less costly and less effective measures.
My noble friend Lord Holmes asked about the Online Media Literacy Strategy, published in July 2021, which set out the Government’s vision for improving media literacy in the country. Alongside the strategy, we have committed to publishing annual action plans each financial year until 2024-25, setting out how we meet the ambition of the strategy. In April 2022 we published the Year 2 Action Plan, which included extending the reach of media literacy education to those who are currently disengaged, in consultation with the media literacy task force—a body of 17 cross-sector experts—expanding our grant funding programme to provide nearly £2.5 million across two years for organisations delivering innovative media literacy activities, and commissioning research to improve our understanding of the challenges faced by the sector. We intend to publish the research later this year, for the benefit of civil society organisations, technology platforms and policymakers.
The noble Lord, Lord Knight, in his Amendment 186, would stipulate that Ofcom must levy fees on regulated firms sufficient to fund the work of third parties involved in supporting it to meet its existing media literacy duties. The Bill already allows Ofcom to levy fees sufficient to fund the annual costs of exercising its online safety functions. This includes its existing media literacy duty as far as it relates to services regulated by this Bill. As such, the Bill already ensures that these media literacy activities, including those that Ofcom chooses to deliver through third parties, can be funded through fees levied on industry.
I turn to Amendments 188, 235, 236, 237 and 238. The Government recognise the intent behind these amendments, which is to help improve the media literacy of the general public. Ofcom already has a statutory duty to promote media literacy with regard to the publication of anything by means of electronic media, including services in scope of the Bill. These amendments propose rather prescriptive objectives, either as part of a new duty for Ofcom or through updating its existing duty. They reflect current challenges in the sector but run the risk of becoming obsolete over time, preventing Ofcom from adapting its work in response to emerging issues.
Ofcom has demonstrated flexibility in its existing duty through its renewed Approach to Online Media Literacy, launched in 2021. This presented an expanded media literacy programme, enabling it to achieve almost all the objectives specified in this group. The Government note the progress that Ofcom has already achieved under its renewed approach in the annual plan it produced last month. The Online Safety Bill strengthens Ofcom’s functions relating to media literacy, which is included in Ofcom’s new transparency-reporting and information-gathering powers; these will give it enhanced oversight of industry activity by enabling it to require regulated services to share or publish information about the work that they are doing on media literacy.
The noble Baroness, Lady Prashar, asked about the view expressed by the Joint Committee on minimum standards for media literacy training. We agree with the intention behind that, but, because of the broad and varied nature of media literacy, we do not believe that introducing minimum standards is the most effective way of achieving that outcome. Instead, we are focusing efforts on improving the evaluation practices of media literacy initiatives to identify which ones are most effective and to encourage their delivery. Ofcom has undertaken extensive work to produce a comprehensive toolkit to support practitioners to deliver robust evaluations of their programmes. This was published in February this year and has been met with praise from practitioners, including those who received grant funding from the Government’s non-legislative media literacy work programme. The post-implementation review of Ofcom’s online safety regime, which covers its existing media literacy duty in so far as it relates to regulated services, will provide a reasonable point at which to establish the effectiveness of Ofcom’s new work programme, after giving it time to take effect.
Noble Lords talked about the national curriculum and media literacy in schools. Media literacy is indeed a crucial skill for everyone in the digital age. Key media literacy skills are already taught through a number of compulsory subjects in the national curriculum. Digital literacy is included in the computing national curriculum in England, which equips pupils with the knowledge, understanding and skills to use information and communication technology creatively and purposefully. I can reassure noble Lords that people such as Monica are being taught not about historic things like floppy disks but about emerging and present challenges; the computing curriculum ensures that pupils are taught how to design and program systems and to accomplish goals such as collecting, analysing, evaluating and presenting data.
Does the Minister know how many children are on computing courses?
My Lords, I had the great privilege of serving as a member of this House’s Fraud Act 2006 and Digital Fraud Committee under the excellent chairing of the noble Baroness, Lady Morgan. She has already told us of the ghastly effects that fraud has on individuals and indeed its adverse effects on businesses. We heard really dramatic statistics, such as when Action Fraud told us that 80% of fraud is cyber-enabled.
Many of us here will have been victims of fraud—I have been a victim—or know people who have been victims of fraud. I was therefore very pleased when the Government introduced the fraudulent advertising provisions into the Bill, which will go some way to reducing the prevalence of online fraud. It seems to me that it requires special attention, which is what these amendments should do.
We heard in our inquiry about the problems that category 1 companies had in taking down fraudulent advertisements quickly. Philip Milton, the public policy manager at Meta, told us that it takes between 24 and 48 hours to review possibly harmful content after it has been flagged to the company. He recognised that, due to the deceptive nature of fraudulent advertising, Meta’s systems do not always recognise that advertising is fraudulent and, therefore, take-down rates would be variable. That is one of the most sophisticated tech platforms—if it has difficulties, just imagine the difficulty that other companies have in both recognising and taking down fraudulent advertising.
Again and again, the Bill recognises the difficulties that platforms have in systematising the protections provided in the Bill. Fraud has an ever-changing nature and is massively increasing—particularly so for fraudulent advertising. It is absolutely essential that the highest possible levels of transparency are placed upon the tech companies to report their response to fraudulent advertising. Both Ofcom and users need to be assured that not only do the companies have the most effective reporting systems but, just as importantly, they have the most effective transparency to check how well they are performing.
To do this, the obligations on platforms must go beyond the transparency reporting requirements in the Bill. These amendments would ensure that they include obligations to provide information on the incidence of fraudulent advertising, in line with other types of priority illegal content. These increased obligations are part of checking the effectiveness of the Bill when it comes to being implemented.
The noble Baroness, Lady Stowell, told us on the fifth day of Committee, when talking about the risk-assessment amendments she had tabled:
“They are about ensuring transparency to give all users confidence”.—[Official Report, 9/5/23; col. 1755.]
Across the Bill, noble Lords have repeatedly stated that there needs to be a range of ways to judge how effectively the protections provided are working. I suggest to noble Lords that these amendments are important attempts to help make the Bill more accountable and provide the data to future-proof the harms it is trying to deal with. As we said in the committee report:
“Without sufficient futureproofing, technology will most likely continue to create new opportunities for fraudsters to target victims”.
I ask the Minister to at least look at some of these amendments favourably.
My Lords, I shall say very briefly in support of these amendments that in 2017, the 5Rights Foundation, of which I am the chair, published the Digital Childhood report, which in a way was the thing that put the organisation on the map. The report looked at the evolving capacity of children through childhood, what technology they were using, what happened to them and what the impact was. We are about to release the report again, in an updated version, and one of the things that is most striking is the introduction of fraud into children’s lives. At the point at which they are evolving into autonomous people, when they want to buy presents for their friends and parents on their own, they are experiencing what the noble Baroness, Lady Morgan, expressed as embarrassment, loss of trust and a sense of deserting confidence—I think that is probably the phrase. So I just want to put on the record that this is a problem for children also.
My Lords, this has been an interesting short debate and the noble Baroness, Lady Morgan, made a very simple proposition. I am very grateful to her for introducing this so clearly and comprehensively. Of course, it is all about the way that platforms will identify illegal, fraudulent advertising and attempt to align it with other user-to-user content in terms of transparency, reporting, user reporting and user complaints. It is a very straightforward proposition.
First of all, however, we should thank the Government for acceding to what the Joint Committee suggested, which was that fraudulent advertising should be brought within the scope of the Bill. But, as ever, we want more. That is what it is all about and it is a very straightforward proposition which I very much hope the Minister will accede to.
We have heard from around the Committee about the growing problem and I will be very interested to read the report that the noble Baroness, Lady Kidron, was talking about, in terms of the introduction of fraud into children’s lives—that is really important. The noble Baroness, Lady Morgan, mentioned some of the statistics from Clean Up the Internet, Action Fraud and so on, as did the noble Viscount, Lord Colville. And, of course, it is now digital. Some 80% of fraud, as he said, is cyber-enabled, and 23% of all reported frauds are initiated on social media—so this is bang in the area of the Bill.
It has been very interesting to see how some of the trade organisations, the ABI and others, have talked about the impact of fraud, including digital fraud. The ABI said:
“Consumers’ confidence is being eroded by the ongoing proliferation of online financial scams, including those predicated on impersonation of financial service providers and facilitated through online advertising. Both the insurance and long-term savings sectors are impacted by financial scams perpetrated via online paid-for advertisements, which can deprive vulnerable consumers of their life savings and leave deep emotional scars”.
So, this is very much a cross-industry concern and very visible to the insurance industry and no doubt to other sectors as well.
I congratulate the noble Baroness, Lady Morgan, on her chairing of the fraud committee and on the way it came to its conclusions and scrutinised the Bill. Paragraphs 559, 560 and 561 all set out where the Bill needs to be aligned to the other content that it covers. As she described, there are two areas where the Bill can be improved. If they are not cured, they will substantially undermine its ability to tackle online fraud effectively.
This has the backing of Which? As the Minister will note, this is very much a cross-industry and consumer-body set of amendments, supporting transparency reporting and making sure that those platforms with more fraudulent advertising make proportionately larger changes to their systems. That is why there is transparency reporting for all illegal harms that platforms are obliged to prevent. There is no reason why advertising should be exempt. On user reporting and complaints, it is currently unclear whether this applies only to illegal user-generated content and unpaid search content or whether it also applies to illegal fraudulent advertisements. At the very least, I hope the Minister will clarify that today.
Elsewhere, the Bill requires platforms to allow users to complain if the platform fails to comply with its duties to protect users from illegal content and with regard to the content-reporting process. I very much hope the Minister will accede to including that as well.
Some very simple requests are being made in this group. I very much hope that the Minister will take them on board.