Online Safety Bill Debate
Baroness Kidron (Crossbench - Life peer)
(1 year, 10 months ago)
Lords Chamber

My Lords, I declare my interests as chair of 5Rights Foundation and the Digital Futures Commission, my positions at Oxford and LSE and at the UN Broadband Commission and the Institute for Ethics in AI, as deputy chair of the APPG on digital regulation and as a member of the Joint Committee on this Bill.
As has already been mentioned, on Monday I hosted the saddest of events, at which Ian Russell and Merry Varney, the Russell family’s solicitor, showed parliamentarians images and posts that had been algorithmically recommended to Molly in the lead-up to her death. These were images so horrible that they cannot be shown in the media, so numerous that we could see only a fraction, and so full of despair and violence that many of the adult professionals involved in the inquest had to seek counselling. Yet in court, much of this material was defended by two tech companies as being suitable for a 14-year-old. Something has gone terribly wrong. The question is: is this Bill sufficient to fix it?
At the heart of our debates should not be content but the power of algorithms that shape our experiences online. Those algorithms could be designed for any number of purposes, including offering a less toxic digital environment, but they are instead fixed on ranking, nudging, promoting and amplifying anything to keep our attention, whatever the societal cost. It does not need to be like that. Nothing about the digital world is a given; it is 100% engineered and almost all privately owned; it can be designed for any outcome. Now is the time to end the era of tech exceptionality and to mandate a level of product safety so that the sector, just like any other sector, does not put its users at foreseeable risk of harm. As Meta’s corporate advertising adorning bus stops across the capital says:
“The metaverse may be virtual, but the impact will be real.”
I very much welcome the Bill, but there are still matters to discuss. The Government have chosen to take out many of the protections for adults, which raises questions about the value and practicality of what remains. In Committee, it will be important to understand how enforcement of a raft of new offences will be resourced and to question the oversight and efficacy of the remaining adult provisions. Relying primarily on companies to be author, judge and jury of their own terms of service may well be a race to the bottom.
I regret that Parliament has been denied the proper opportunity to determine what kind of online world we want for adults, which, I believe, we will regret as technology enters its next phase of intelligence and automation. However, my particular concern is the fate of children, whose well-being is collateral damage to a profitable business model. Changes to the Bill will mean that child safety duties are no longer an add-on to a generally safer world; they are now the first and only line of defence. I have given the Secretary of State sight of my amendments, and I inform the House that they are not probing amendments; they are necessary to fill the gaps and loopholes in the Bill as it now stands. In short, we need to ensure that child safety duties apply to all services likely to be accessed by children. We must ensure the quality control of all age-assurance systems. Age checking must not focus on a particular harm, but on the child; it needs to be secure, privacy-preserving and proportionate, and it must work. The children’s risk assessment and the list of harms must cover each of the four Cs: content harm, conduct harm, contact harm and commercial harm, such as the recommendation loops of violence and self-hatred that push thousands of children into states of misery. Those harms must be in the Bill.
Coroners and bereaved parents must have access to data relevant to the death of a child to end the current inhumane arrangement whereby bereaved families facing the devastating loss of their child are forced to battle, unsuccessfully, with tech behemoths for years. I hope that the Minister will reiterate commitments made in the other place to close that loophole.
Children’s rights must be in the Bill. An unintended consequence of removing protections for adults is that children will now cost companies vastly more developer time, more content moderation and more legal costs than adults. The digital world is the organising technology of our society, and children need to be online for their education and information to participate in civic society—they must not be kicked out.
I thank all those who have indicated their support, and the Secretary of State, the Minister and officials for the considerable time they have given me. However, I ask the Minister to listen very carefully to the mood of the House this evening; the matters I have raised are desperately urgent and long-promised, and must now be delivered unequivocally.
While millions of children suffer from the negative effects of the online world, some pay with their lives. I am a proud supporter of a group of bereaved parents for online safety, and I put on the record that we remember Molly, Frankie, Olly, Breck, Sophie and all the others who have lost their lives. I hope that the whole House will join me in not resting until we have a Bill fit for their memory.
Online Safety Bill Debate
Baroness Kidron (Crossbench - Life peer)
(1 year, 8 months ago)
Lords Chamber

My Lords, I draw attention to my interests in the register, which I declared in full at Second Reading. It is an absolute pleasure to follow the noble Lord, Lord Stevenson, and, indeed, to have my name on this amendment, along with those of fellow members of the pre-legislative committee. It has been so long that it almost qualifies as a reunion tour.
This is a fortuitous amendment on which to start our deliberations, as it sets out the very purpose of the Bill—a North Star. I want to make three observations, each of which underlines its importance. First, as the pre-legislative committee took evidence, it was frequently remarked by both critics and supporters that it was a complicated Bill. We have had many technical briefings from DSIT and Ofcom, and they too refer to the Bill as “complicated”. As we took advice from colleagues in the other place, expert NGOs, the tech sector, academics and, in my own case, the 5Rights young advisory group, the word “complicated” repeatedly reared its head. This is a complex and ground-breaking area of policy, but there were other, simpler structures and approaches that have been discarded.
Over five years of ever-changing leadership and political pressures, the Bill has ballooned with caveats and a series of very specific, and in some cases peculiar, clauses—so much so that today we start with a Bill that even those of us who are paying very close attention are often told we do not understand. That should make the House very nervous.
It is a complicated Bill with intersecting and dependent clauses—grey areas from which loopholes emerge—and it is probably a big win for the deepest pockets. The more complicated the Bill is, the more it becomes a bonanza for the legal profession. As the noble Lord, Lord Stevenson, suggests, the Minister is likely to argue that the contents of the amendment are already in the Bill, but the fact that the word “complicated” is firmly stuck to its reputation and structure is the very reason to set out its purpose at the outset, simply and unequivocally.
Secondly, the OSB is a framework Bill, with vast amounts of secondary legislation and a great deal of work to be implemented by the regulator. At a later date we will discuss whether the balance between the Executive, the regulator and Parliament is exactly as it should be, but as the Bill stands it envisages a very limited future role for Parliament. If I might borrow an analogy from my previous profession, Parliament’s role is little more than that of a background extra.
I have some experience of this. In my determination to follow all stages of the age-appropriate design code, I found myself earlier this week in the Public Gallery of the other place to hear DSIT Minister Paul Scully, at Second Reading of the Data Protection and Digital Information (No. 2) Bill, pledge to uphold the AADC and its provisions. I mention this in part to embed it on the record—that is true—but primarily to make this point: over six years, there have been two Information Commissioners and double figures of Secretaries of State and Ministers. There have been many moments at which the interpretation, status and purpose of the code has been put at risk, at least once to a degree that might have undermined it altogether. At these moments, each time the issue was resolved by establishing the intention of Parliament beyond doubt. Amendment 1 moves Parliament from background extra to star of the show. It puts the intention of Parliament front and centre for the days, weeks, months and years ahead in which the work will still be ongoing—and all of us will have moved on.
The Bill has been through a long and fractured process in which the pre-legislative committee had a unique role. Many attacks on the Bill have been made by people who have not read it. Child safety was incorrectly cast as the enemy of adult freedom. While some wanted to apply the existing and known concepts and terms of public interest, protecting the vulnerable, product safety and the established rights and freedoms of UK citizens, intense lobbying has seen them replaced by untested concepts and untried language over which the tech sector has once again emerged as judge and jury. This has further divided opinion.
In spite of all the controversy, when published, the recommendations of the committee report received almost universal support from all sides of the debate. So I ask the Minister not only to accept the committee’s view that the Bill needs a statement of purpose, the shadow of which will provide shelter for the Bill long into the future, but to undertake to look again at the committee report in full. In its pages lies a landing strip of agreement for many of the things that still divide us.
This is a sector that is 100% engineered and almost all privately owned, and within it lie solutions to some of the greatest problems of our age. It does not have to be as miserable, divisive and exploitative as this era of exceptionalism has allowed it to be. As the Minister is well aware, I have quite a lot to say about proposed new subsection (1)(b),
“to provide a higher level of protection for children than for adults”,
but today I ask the Minister to tell us which of these paragraphs (a) to (g) are not the purpose of the Bill and, if they are not, what is.
My Lords, I am pleased that we are starting our Committee debate on this amendment. It is a pleasure to follow the noble Lord, Lord Stevenson, and the noble Baroness, Lady Kidron.
In this Bill, as has already been said, we are building a new and complex system and we can learn some lessons from designing information systems more generally. There are three classic mistakes that you can make. First, you can build systems to fit particular tools. Secondly, you can overcommit beyond what you can actually achieve. Thirdly, there is feature creep, through which you keep adding things on as you develop a new system. A key defence against these mistakes is to invest up front in producing a really good statement of requirements, which I see in Amendment 1.
On the first of these mistakes, there is a genuine risk that, as we go through the debate, we get bogged down in the details of specific measures that the regulator might or might not include in its rules and guidance, and that we lose sight of our goals. Developing a computer system around a particular tool—for example, building everything with Excel macros or with Salesforce—invariably ends in disaster. If we can agree on the goals in Amendment 1 and on what we are trying to achieve, that will provide a sound framework for our later debates as we try to consider the right regulatory technologies that will deliver those goals.
The second cardinal error is overcommitting and underdelivering. Again, it is very tempting when building a new system to promise the customer that it will be all-singing, all-dancing and can be delivered in the blink of an eye. Of course, the reality is that in many cases, things prove to be more complex than anticipated, and features sometimes have to be removed while timescales for delivering what is left are extended. A wise developer will instead aim to undercommit and overdeliver, promising to produce a core set of realistic functions and hoping that, if things go well, they will be able to add in some extra features that will delight the customer as an unexpected bonus.
This lesson is also highly relevant to the Bill, as there is a risk of giving the impression to the public that more can be done quicker than may in fact be possible. Again, Amendment 1 helps us to stay grounded in a realistic set of goals once we put those core systems in place. The fundamental and revolutionary change here is that we will be insisting that platforms carry out risk assessments and share them with a regulator, who will then look to them to implement actions to mitigate those risks. That is fundamental. We must not lose sight of that core function and get distracted by some of the bells and whistles that are interesting, but which may take the regulator’s attention away from its core work.
We also need to consider what we mean by “safe” in the context of the Bill and the internet. An analogy that I have used in this context, which may be helpful, is to consider how we regulate travel by car and aeroplane. Our goal for air travel is zero accidents, and we regulate everything down to the nth degree: from the steps we need to take as passengers, such as passing through security and presenting identity documents, to detailed and exacting safety rules for the planes and pilots. With car travel, we have a much higher degree of freedom, being able to jump in our private vehicles and go where we want, when we want, pretty much without restrictions. Our goal for car travel is to make it incrementally safer over time; we can look back and see how regulation has evolved to make vehicles, roads and drivers safer year on year, and it continues to do so. Crucially, we do not expect car travel to be 100% safe, and we accept that there is a cost to this freedom to travel that, sadly, affects thousands of people each year, including my own family and, I am sure, many others in the House. There are lots of things we could do to make car travel even safer that we do not put into regulation, because we accept that the cost of restricting freedom to travel is too high.
Without over-labouring this analogy, I ask that we keep it in mind as we move through Committee—whether we are asking Ofcom to implement a car-like regime whereby it is expected to make continual improvements year on year as the state of online safety evolves, or we are advocating an aeroplane-like regime whereby any instance of harm will be seen as a failure by the regulator. The language in Amendment 1 points more towards a regime of incremental improvements, which I believe is the right one. It is in the public interest: people want to be safer online, but they also want the freedom to use a wide range of internet services without excessive government restriction, and they accept some risk in doing so.
I hope that the Minister will respond positively to the intent of Amendment 1 and that we can explore in this debate whether there is broad consensus on what we hope the Bill will achieve and how we expect Ofcom to go about its work. If there is not, then we should flush that out now to avoid later creating confused or contradictory rules based on different understandings of the Bill’s purpose. I will keep arguing throughout our proceedings for us to remain focused on giving the right goals to Ofcom and allowing it considerable discretion over the specific tools it needs, and for us to be realistic in our aims so that we do not overcommit and underdeliver.
Finally, the question of feature creep is very much up to us. There will be a temptation to add things into the Bill as it goes through. Some of those things are essential; I know that the noble Baroness, Lady Kidron, has some measures that I would also support. This is the right time to do that, but there will be other things that would be “nice to have”, and the risk of putting them in might detract from those core mechanisms. I hope we are able to maintain our discipline as we go through these proceedings to ensure we deliver the right objectives, which are incredibly well set out in Amendment 1, which I support.
I strongly support my noble friend in his amendment. I clarify that, in doing so, I am occupying a guest slot on the Front Bench: I do so as a member of his team but also as a member of the former Joint Committee. As my noble friend set out, this reflects where we got to in our thinking as a Joint Committee all that time ago. My noble friend said “at last”, and I echo that and what others said. I am grateful for the many briefings and conversations that we have had in the run-up to Committee, but it is good to finally be able to get on with it and start to clear some of these things out of my head, if nothing else.
In the end, as everyone has said, this is a highly complex Bill. Like the noble Baroness, Lady Stowell, in preparation for this I had another go at trying to read the blooming thing, and it is pretty much unreadable—it is very challenging. That is right at the heart of why I think this amendment is so important. Like the noble Baroness, Lady Kidron, I worry that this will be a bonanza for the legal profession, because it is almost impenetrable when you work your way through the wiring of the Bill. I am sure that, in trying to amend it, some of us will have made errors. We have been helped by the Public Bill Office, but we will have missed things and got things the wrong way around.
It is important to have something purposive, as the Joint Committee wanted, and to have clarity of intent for Ofcom, including that this is so much more about systems than about content. Unlike the noble Baroness, Lady Stowell—clearly, we all respect her work chairing the communications committee and the insights she brings to the House—I think that a very simple statement, restricting it just to proposed new paragraph (g), is not enough. It would almost be the same as the description at the beginning of the Bill, before Clause 1. We need to go beyond that to get the most from having a clear statement of how we want Ofcom to do its job and the Secretary of State to support Ofcom.
I like what the noble Lord, Lord Allan, said about the risk of overcommitting and underdelivering. When the right reverend Prelate the Bishop of Oxford talked about being the safest place in the world to go online, which is the claim that has been made about the Bill from the beginning, I was reminded again of the difficulty of overcommitting and underdelivering. The Bill is not perfect, and I do not believe that it will be when this Committee and this House have finished their work; we will need to keep coming back and legislating and regulating in this area as we pursue the goal of being the safest place in the world to go online—but it will not be any time soon.
I say to the noble Baroness, Lady Fox, who I respect, that I understand what she is saying about some of her concerns about a risk-free child safety regime and the unintended consequences that may come in this legislation. But at its heart, what motivates us, and makes us believe that getting the Bill right is one of the most important things we will do in all our time in this Parliament, is the unintended consequences of the algorithms that these tech companies have created, pushing content at children that they do not want to see. I see the noble Baroness, Lady Kidron, wanting to comment.
I just want to say to the noble Baroness, Lady Fox, that we are not looking to mollycoddle children or put them in cotton wool; we are asking for a system where they are not systematically exploited by major companies.
I very much agree. The core of what I want to say in supporting this amendment is that in Committee we will do what we are here to do. There are a lot of amendments to what is a very long and complicated Bill: we will test the Minister and his team on what the Government are trying to achieve and whether they have things exactly right in order to give Ofcom the best possible chance to make it work. But when push comes to shove at the end of the process, at its heart we need to build trust in Ofcom and give it the flexibility to be able to respond to the changing online world and the changing threats to children and adults in that online world. To do that, we need to ensure that we have the right amount of transparency.
I was particularly pleased to see proposed new paragraph (g) in the amendment, on transparency, as referenced by the noble Baroness, Lady Stowell. It is important that we have independence for Ofcom; we will come to that later in Committee. It is important that Parliament has a better role in terms of accountability so that we can hold Ofcom to account, having given it trust and flexibility. I see this amendment as fundamental to that, because it sets the framework for the flexibility that we then might want to be able to give Ofcom over time. I argue that this is about transparency of purpose, and it is a fundamental addition to the Bill to make it the success that we want.
Online Safety Bill Debate
Baroness Kidron (Crossbench - Life peer)
(1 year, 7 months ago)
Lords Chamber

My Lords, I refer the Committee to my interests as put in the register and declared in full at Second Reading. I will speak to Amendment 2 in my name and those of the right reverend Prelate the Bishop of Oxford and the noble Baroness, Lady Harding, to Amendments 3 and 5 in my name, and briefly to Amendments 19, 22, 298 and 299 in the name of the noble Baroness, Lady Harding.
The digital world does not have boundaries in the way that the Bill does. It is an ecosystem of services and products that are interdependent. A user journey is made up of incremental signals, nudges and enticements that mean that, when we use our devices, very often we do not end up where we intended to start. The current scope covers user-to-user, search and commercial porn services, but a blog or website that valorises self-harm and depression or suggests starving yourself to death is still exempt because it has limited functionality. So too are games without a user-to-user function, in spite of the known harm associated with game addiction highlighted recently by Professor Henrietta Bowden-Jones, national expert adviser on gambling harms, and the World Health Organization in 2019 when it designated gaming disorder as a behavioural addiction.
There is also an open question about immersive technologies, whose protocols are still very much in flux. I am concerned that the Government are willing to assert that these environments will meet the bar of user-to-user when those that are still building immersive environments make quite clear that that is not a given. Indeed, later in Committee I will be able to demonstrate that already the very worst harms are happening in environments that are not clearly covered by the Bill.
Another unintended consequence of the current drafting is that the task of working out whether you are on a regulated or unregulated service is left entirely to children. That is not what we had been promised. In December the Secretary of State wrote in a public letter to parents,
“I want to reassure every person reading this letter that the onus for keeping young people safe online will sit squarely on the tech companies’ shoulders”.
It is likely that the Minister will suggest that the limited-functionality services will be caught by the gatekeepers. But, as in the case of immersive technology, it is dangerous to suggest that, just because search and user-to-user are the primary access points in 2023, that will remain the case. We must be more forward thinking and ensure that services likely to be accessed by children that promote harm are in scope by default.
Amendments 3 and 5 are consequential, so I will not debate them now. I have listened to the Government and come back with a reasonable and implementable amendment that applies only to services that are likely to be accessed by children and that enable harm. I now ask the Government to listen and do likewise.
Amendments 92 and 193 cover the child user condition. The phrase “likely to be accessed”, introduced in this House into what became the Data Protection Act 2018, is one of the most unlikely successful British exports. Both the phrase and its definition, set out by the ICO, have been embedded in regulations in countries the world over—yet the Bill replaces this established language while significantly watering down the definition.
The Bill requires
“a significant number of children”
to use the service, or for the service to be
“likely to attract a significant number of users who are children”.
“Significant” in the Bill is defined relative to the overall UK user base, which means that extremely large platforms could deem a few thousand child users not significant compared with the several million-strong user base. Since only services that cross this threshold need comply with the child safety duties, thousands of children will not benefit from the safety duties that the Minister told us last week were at the heart of the Bill.
Amendment 92 would put the ICO’s existing and much-copied definition into the Bill. It says a service is
“likely to be accessed by children”
if
“the service is designed or intended for use by children … children form a substantive and identifiable user group … the possibility of a child accessing the service is more probable than not, taking into consideration … the nature and content of the service and whether that has particular appeal for children … the way in which the service is accessed and any measures in place to prevent children gaining access … market research, current evidence on user behaviour, the user base of similar or existing services”
that are likely to be accessed.
Having two phrases and definitions is bad for business and even worse for regulators. The ICO has first-mover advantage and a more robust test. It is my contention that parents, media and perhaps even our own colleagues would be very shocked to know that the definition in the Bill has the potential for many thousands, and possibly tens of thousands, of children to be left without the protections that the Bill brings forward. Perhaps the Minister could explain why the Government have not chosen regulatory alignment, which is good practice.
Finally, I will speak briefly in support of Amendments 19, 22, 298 and 299. I am certain that the noble Baroness, Lady Harding, will spell out how the app stores of Google and Apple are simply a subset of “search”, in that they are gatekeepers to accessing more than 5 million apps worldwide and the first page of each is indeed a search function. Their inclusion should be obvious, but I will add a specific issue about which I have spoken directly with both companies and about which the 5Rights Foundation, of which I am chair, has written to the ICO.
When we looked at the age ratings of apps across Google Play Store and Apple, four things emerged. First, apps are routinely rated much lower than their terms and conditions: for example, Amazon Shopping says 18 but has an age rating of 4 on Apple. This pattern goes across both platforms, covering social sites, gaming, shopping, et cetera.
Secondly, the same apps and services did not have the same age rating across both services, which, between them, are gatekeepers for more than 95% of the app market. In one extreme case, an app rated 4 on one of them was rated 16 on the other, with other significant anomalies being extremely frequent.
Thirdly, almost none of the apps considered their data protection duties in coming to a decision on their age rating, which is a problem, since privacy and safety are inextricably linked.
Finally, in the case of Apple, using a device registered to a 15-year-old, we were able to download age-restricted apps including a dozen or more 18-plus dating sites. In fairness, I give a shoutout to Google, which, because of the age-appropriate design code, chose more than a year ago not to show 18-plus content to children in its Play Store. So this is indeed a political and business choice and not a question of technology. Millions of services are accessed via the App Store. Given the Government’s position—that gatekeepers have specific responsibilities in relation to harmful content and activity—surely the amendments in the name of the noble Baroness, Lady Harding, are necessary.
My preference was for a less complicated Bill based on principles and judged on outcomes. I understand that that ship has sailed, but it is not acceptable for the Government now to use the length and complexity of the Bill as a reason not to accept amendments that would fill loopholes where harm has been proven. It is time to deliver on the promises made to parents and children, and to put the onus for keeping young people safe online squarely on tech companies’ shoulders. I beg to move.
My Lords, I rise to speak to Amendments 19, 22, 298 and 299 in my name and those of the noble Baroness, Lady Stowell, and the noble Lords, Lord Knight and Lord Clement-Jones. I will also briefly add at the end of my speech my support for the amendments in the name of my friend, the noble Baroness, Lady Kidron. It has been a huge privilege to be her support act all the way from the beginnings of the age-appropriate design code; it feels comfortable to speak after her.
I want briefly to set out what my amendments would do. Their purpose is to bring app stores into the child protection elements of the Bill. Amendment 19 would require app stores to prepare
“risk assessments equal to user-to-user services due to their role in distributing online content through apps to children and as a primary facilitator of user-to-user”
services reaching children. Amendment 22 would mandate app stores
“to use proportionate and proactive measures, such as age assurance, to prevent children”
coming into contact with
“primary priority content that is harmful to children”.
Amendments 298 and 299 would simply define “app” and “app stores”.
Let us be clear what app stores do. They enable customers to buy apps and user-to-user services. They enable customers to download free apps. They offer up curated content in the app store itself and decide what apps someone would like to see. They enable customers to search for apps for user-to-user content. They provide age ratings; as the noble Baroness, Lady Kidron, said, they may be different age ratings in different app stores for the same app. They sometimes block the download of apps based on the age rating and their assessment of someone’s age, but not always, and it is different for different app stores.
Why should they be included in this Bill—if it is not obvious from what I have already said? First, two companies are profiting from selling user-to-user products to children. Two app stores account for some 98%-plus of all downloads of user-to-user services, with no requirements to assess the risk of selling those products to children or to mitigate those risks. We do not allow that in the physical world so we should not allow it in the digital world.
Secondly, parents and teenagers tell us that this measure would help. A number of different studies have been done; I will reference just two. One was by FOSI, the Family Online Safety Institute, which conducted an international research project in which parents consistently said that having age assurance at the app store level would make things simpler and more effective for them; ironically, the FOSI research was conducted with Google.
The noble Baroness makes a very good point; they are making efforts. There is a role for app stores to play but I hope she would accept that it is qualitatively different from that played by a search engine or a user-to-user service. If we were to decide, in both instances, that we want app stores to have a greater role in online safety and a framework that allows us to look at blogs and other forms of content, we should go ahead and do that. All I am arguing is that we have a Bill that is carefully constructed around two particular concepts, a user-to-user service and a search engine, and I am not sure it will stretch that far.
I want to reassure the noble Lord: I have his blog in front of me and he was quite right—there were not a lot of children on that site. It is a very good blog, which I read frequently.
I want to make two points. First, age-rating and age-gating are two different things, and I think the noble Lord has conflated them. There is a lot of age-rating going on, and much of it is false information. We need good information, and we have not managed to get it by asking nicely. Secondly, I slightly dispute his idea that we have a very structured Bill regarding user-to-user and so on. We have a very structured Bill from a harms perspective that describes the harms that must be prevented—and then we got to commercial porn, and we can also get to these other things.
I agree with the noble Lord’s point about freedom of speech, but we are talking about a fixed set of harms that will, I hope, be in the Bill by the end. We can then say that if a service is likely to be accessed by children—that is the test—and known harm is there, that is what we are looking at. We are certainly not looking at the noble Lord’s blog.
I appreciate the intervention by the noble Baroness; I hope through this grit we may conjure up a pearl of some sort. The original concept of the Bill, as championed by the noble Baroness, would have been a generalised set of duties of care which could have stretched much more broadly. It has evolved in a particular direction and become ever more specific and tailored to those three services: user-to-user, search, and pornography services. Having arrived at that point, it is difficult to then open it back up and stretch it to reach other forms of service.
My intention in intervening in this debate is to raise some of those concerns because I think they are legitimate. I may be at the more sceptical end of the political world, but I am at the more regulation-friendly end of the tech community. This is said in a spirit of trying to create a Bill that will actually work. I have done the work, and I know how hard Ofcom’s job will be. That sums up what I am trying to say: my concern is that we should not give Ofcom an impossible job. We have defined something quite tight—many people still object to it, think it is too loose and do not agree with it—but I think we have something reasonably workable. I am concerned that, however tempting it is, by re-opening Pandora’s box we may end up creating something less workable.
That does not mean we should forget about app stores and non-user-to-user content, but we need to think of a way of dealing with those which does not necessarily just roll over the mechanism we have created in the Online Safety Bill to other forms of application.
My Lords, I share noble Lords’ determination to deliver the strongest protections for children and to develop a robust and future-proofed regulatory regime. However, it will not be possible to solve every problem on the internet through this Bill, nor through any piece of legislation, flagship or otherwise. The Bill has been designed to confer duties on the services that pose the greatest risk of harm—user-to-user services and search services—and where there are proportionate measures that companies can take to protect their users.
As the noble Baroness, Lady Kidron, and others anticipated, I must say that these services act as a gateway for users to discover and access other online content through search results and links shared on social media. Conferring duties on these services will therefore significantly reduce the risk of users going on to access illegal or harmful content on non-regulated services, while keeping the scope of the Bill manageable and enforceable.
As noble Lords anticipated, there is also a practical consideration for Ofcom in all this. I know that many noble Lords are extremely keen to see this Bill implemented as swiftly as possible; so am I. However, as the noble Lord, Lord Allan, rightly pointed out, making major changes to the Bill’s scope at this stage would have significant implications for Ofcom’s implementation timelines. I say this at the outset because I want to make sure that noble Lords are aware of those implications as we look at these issues.
I turn first to Amendments 2, 3, 5, 92 and 193, tabled by the noble Baroness, Lady Kidron. These aim to expand the number of services covered by the Bill to incorporate a broader range of services accessed by children and a broader range of harms. I will cover the broader range of harms more fully in a separate debate when we come to Amendment 93, but I am very grateful to the noble Baroness for her constructive and detailed discussions on these issues over the past few weeks and months.
These amendments would bring new services into scope of the duties beyond user-to-user and search services. This could include services which enable or promote commercial harms, including consumer businesses such as online retailers. As I have just mentioned in relation to the previous amendments, bringing many more services into scope would delay the implementation of Ofcom’s priorities and risk detracting from its work overseeing existing regulated services where the greatest risk of harm exists—we are talking here about the services run by about 2.5 million businesses in the UK alone. I hope noble Lords will appreciate from the recent communications from Ofcom how challenging the implementation timelines already are, without adding further complication.
Amendment 92 seeks to change the child-user condition in the children’s access assessment to the test in the age-appropriate design code. The test in the Bill is already aligned with the test in that code, which determines whether a service is likely to be accessed by children, in order to ensure consistency for providers. The current child-user condition determines that a service is likely to be accessed by children where it has a significant number or proportion of child users, or where it is of a kind likely to attract a significant number or proportion of child users. This will already bring into scope services of the kind set out in this amendment, such as those which are designed or intended for use by children, or where children form a—
I am sorry to interrupt. Will the Minister take the opportunity to say what “significant” means, because that is not aligned with the ICO code, which has different criteria?
If I can finish my point, this will bring into scope services of the kind set out in the amendments, such as those designed or intended for use by children, or where children form a substantial and identifiable user group. The current condition also considers the nature and content of the service and whether it has a particular appeal for children. Ofcom will be required to consult the Information Commissioner’s Office on its guidance to providers on fulfilling this test, which will further support alignment between the Bill and the age-appropriate design code.
On the meaning of “significant”, a significant number of children means a significant number in itself or a significant proportion of the total number of UK-based users on the service. In the Bill, “significant” has its ordinary meaning, and there are many precedents for it in legislation. Ofcom will be required to produce and publish guidance for providers on how to make the children’s access assessment. Crucially, the test in the Bill provides more legal certainty and clarity for providers than the test outlined in the code. “Substantive” and “identifiable”, as suggested in this amendment, do not have such a clear legal meaning, so this amendment would give rise to the risk that the condition is more open to challenge from providers and more difficult to enforce. On the other hand, as I said, “significant” has an established precedent in legislation, making it easier for Ofcom, providers and the courts to interpret.
The noble Lord, Lord Knight, talked about the importance of future-proofing the Bill and emerging technologies. As he knows, the Bill has been designed to be technology neutral and future-proofed, to ensure that it keeps pace with emerging technologies. It will apply to companies which enable users to share content online or to interact with each other, as well as to search services. Search services using AI-powered features will be in scope of the search duties. The Bill is also clear that content generated by AI bots is in scope where it interacts with user-generated content, such as bots on Twitter. The metaverse is also in scope of the Bill. Any service which enables users to interact as the metaverse does will have to conduct a child access test and comply with the child safety duties if it is likely to be accessed by children.
I thank the Minister for an excellent debate; I will make two points. First, I think the Minister was perhaps answering on my original amendment, which I have narrowed considerably to services
“likely to be accessed by children”
and with proven harm on the basis of the harms described by the Bill. It is an “and”, not an “or”, allowing Ofcom to go after places that have proven to be harmful.
Secondly, I am not sure the Government can have it both ways—that it is the same as the age-appropriate design code but different in these ways—because it is exactly in the ways that it is different that I am suggesting the Government might improve. We will come back to both those things.
Finally, what are we asking for here? We are asking for a risk assessment. If there is no risk, there is no harm, no mitigation and nothing to do. This is a major principle of the conversations we will have going forward over a number of days. I also believe in proportionality. It is basic product safety: you have a look, you have standards, and if there is nothing to do, let us not make people do silly things. I think we will return to these issues, because they are clearly deeply felt and very practical, and my own feeling is that we cannot risk thousands of children not benefiting from all the work that Ofcom is going to do. With that, I beg leave to withdraw.
My Lords, I rise to speak in support of Amendment 9, tabled by the noble Lord, Lord Moylan, and in particular the proposed new paragraph 10A to Schedule 1. I hope I will find myself more in tune with the mood of the Committee on this amendment than on previous ones. I would be interested to know whether any noble Lords believe that Ofcom should be spending its limited resources supervising a site like Wikipedia under the new regime, as it seems to me patently obvious that that is not what we intend; it is not the purpose of the legislation.
The noble Lord, Lord Moylan, is right to remind us that one of the joys of the internet is that you buy an internet connection, plug it in and there is a vast array of free-to-use services which are a community benefit, produced by the community for the community, with no harm within them. What we do not want to do is interfere with or somehow disrupt that ecosystem. The noble Baroness, Lady Fox, is right to remind us that there is a genuine risk of people withdrawing from the UK market. We should not sidestep that. People who try to be law-abiding will look at these requirements and ask themselves, “Can I meet them?” If the Wikimedia Foundation that runs Wikipedia does not think it can offer its service in a lawful way, it will have to withdraw from the UK market. That would be to the detriment of children in the United Kingdom, and certainly not to their benefit.
There are principle-based and practical reasons why we do not want Ofcom to be operating in this space. The principle-based one is that it makes me uncomfortable that a Government would effectively tell their regulator how to manage neutral information sites such as Wikipedia. There are Governments around the world who seek to do that; we do not want to be one of those.
The amendment attempts to define this public interest, neutral, informational service. It happens to be user-to-user but it is not like Facebook, Instagram or anything similar. I would feel much more comfortable making it clear in law that we are not asking Ofcom to interfere with those kinds of services. The practical reason is the limited time Ofcom will have available. We do not want it to be spending time on things that are not important.
Definitions are another example of how, with the internet, it can often be extremely hard to draw bright lines. Functionalities bleed into each other. That is not necessarily a problem, until you try to write something into law; then, you find that your definition unintentionally captures a service that you did not intend to capture, or unintentionally misses out a service that you did intend to be in scope. I am sure the Minister will reject the amendment because that is what Ministers do; but I hope that, if he is not willing to accept it, he will at least look at whether there is scope within the Bill to make it clear that Wikipedia is intended to be outside it.
Paragraph 4 of Schedule 1 refers to “limited functionality services”. That is a rich vein to mine. It is clear that the intention is to exclude mainstream media, for example. It refers to “provider content”. In this context, Encyclopaedia Britannica is not in scope but Wikipedia is, the difference being that Wikipedia is constructed by users, while Encyclopaedia Britannica is regarded as being constructed by a provider. The Daily Mail is outside scope; indeed, all mainstream media are outside scope. Anyone who declares themselves to be media—we will debate this later on—is likely to be outside scope.
Such provider exemption should be offered to other, similar services, even if they happen to be constructed from the good will of users as opposed to a single professional author. I hope the Minister will be able to indicate that the political intent is not that we should ask Ofcom to spend time and energy regulating Wikipedia-like services. If so, can he point to where in the legislation we might get that helpful interpretation, in order to ensure that Ofcom is focused on what we want it to be focused on and not on much lower priority issues?
I will speak to a couple of the amendments in this group. First, small is not safe, and you cannot necessarily see these platforms in isolation. For example, there is an incel group that has only 4,000 active users, but it posts a great deal on YouTube and has 24.2 million users in that context. So we have to be clear that small and safe are not the same thing.
However, I am sympathetic to the risk-based approach. I should probably have declared an interest as someone who has given money to Wikipedia on several occasions to keep it going. I ask the Minister for some clarity on the systems and processes of the Bill, and whether the risk profile of Wikipedia—which does not entice you in and then follow you for the next six months once you have looked at something—is far lower than something very small that gets hold of you and keeps on going. I say that particularly in relation to children, but I feel it for myself also.
My Lords, I will speak to the amendments in the name of the noble Lord, Lord Moylan, on moderation, which I think are more important than he has given himself credit for—they go more broadly than just Wikipedia.
There is a lot of emphasis on platform moderation, but the reality is that most moderation of online content is done by users, either individually or in groups, acting as groups in the space where they operate. The typical example, which many Members of this House have experienced, is when you post something and somebody asks, “Did you mean to post that?”, and you say, “Oh gosh, no”, and then delete it. A Member in the other place has recently experienced a rather high-profile example of that through the medium of the newspaper. On a much smaller scale, it is absolutely typical that people take down content every day, either because they regret it or, quite often, because their friends, families or communities tell them that it was unwise. That is the most effective form of moderation, because it is the way that people learn to change their behaviour online, as opposed to the experience of a platform removing content, which is often experienced as the big bad hand of the platform. The person does not learn to change their behaviour, so, in some cases, it can reinforce bad behaviour.
Community moderation, not just on Wikipedia but across the internet, is an enormous public good, and the last thing that we want to do in this legislation is to discourage people from doing it. In online spaces, that is often a volunteer activity: people give up their time to try to keep a space safe and within the guidelines they have set for that space. The noble Lord, Lord Moylan, has touched on a really important area: in the Bill, we must be absolutely clear to those volunteers that we will not impose all kinds of new legal obligations and liabilities on them. These are responsible people, so, if they are advised that they will incur all kinds of legal risk when trying to comply with the Online Safety Bill, they will stop doing the moderation—and then we will all suffer.
On age-gating, we will move to a series of amendments where we will discuss age assurance, but I will say at the outset, as a teaser to those longer debates, that I have sympathy with the points made by the noble Lord, Lord Moylan. He mentioned pubs—we often talk about real-world analogies. In most of the public spaces we enter in the real world, nobody does any ID checking or age checking; we take it on trust, unless and until you carry out an action, such as buying alcohol, which requires an age check.
It is legitimate to raise this question, because where we fall in this debate will depend on how we see public spaces. I see a general-purpose social network as equivalent to walking into a pub or a town square, so I do not expect to have my age and ID checked at the point at which I enter that public space. I might accept that my ID is checked at a certain point where I carry out various actions. Others will disagree and will say that the space should be checked as soon as you go into it—that is the boundary of the debate we will have across a few groups. As a liberal, I am certainly on the side that says that it is incumbent on the person wanting to impose the extra checks to justify them. We should not just assume that extra checks are cost-free and beneficial; they have a cost for us all, and it should be imposed only where there is a reasonable justification.
Far be it from me to suggest that all the amendments tabled by the noble Lord, Lord Moylan, are in the wrong place, but I think that Amendment 26 might have been better debated with the other amendments on age assurance.
On community moderation, I underscore the point that Ofcom must have a risk profile as part of its operations. When we get to that subject, let us understand what Ofcom intends to do with it—maybe we should instruct Ofcom a little about what we would like it to do with it for community moderation. I have a lot of sympathy—but do not think it is a get-out clause—with seeing some spaces as less risky, or, at least, for determining what risky looks like in online spaces, which is a different question. This issue belongs in the risk profile: it is not about taking things out; we have to build it into the Bill we have.
On age assurance and AV, I do not think that today is the day to discuss it in full. I disagree with the point that, because we are checking kids, we have to check ourselves—that is not where the technology is. Without descending into technical arguments, as the noble Lord, Lord Moylan, asked us not to, we will bring some of those issues forward.
The noble Lords, Lord Bethell and Lord Stevenson, and the right reverend Prelate the Bishop of Oxford have a package of amendments which are very widely supported across the Committee. They have put forward a schedule of age assurance that says what the rules of the road are. We must stop pretending that age assurance is something that is being invented now in this Bill. If you log into a website with your Facebook login, it shares your age—and that is used by 42% of people online. However, if you use an Apple login, it does not share your age, so I recommend using Apple—but, interestingly, it is harder to find that option on websites, because websites want to know your age.
So, first, we must not treat age assurance as if it has just been invented. Secondly, we need to start to have rules of the road, and ask what is acceptable, what is proportionate, and when we will have zero tolerance. Watching faces around the Committee, I say that I will accept zero tolerance for pornography and some other major subjects, but, for the most part, age assurance is something that we need to have regulated. Currently, it is being done to us rather than in any way that is transparent or agreed, and that is very problematic.
My Lords, I hesitated to speak to the previous group of amendments, but I want to speak in support of the issue of risk that my noble friend Lady Kidron raised again in this group of amendments. I do not believe that noble Lords in the Committee want to cut down the amount of information and the ability to obtain information online. Rather, we came to the Bill wanting to avoid some of the really terrible harms promoted by some websites which hook into people’s vulnerability to becoming addicted to extremely harmful behaviours, which are harmful not only to themselves but to other people and, in particular, to children, who have no voice at all. I also have a concern about vulnerable people over the age of 18, and that may be something we will come to later in our discussions on the Bill.
My Lords, I am grateful to all noble Lords who have contributed to this slightly disjointed debate. I fully accept that there will be further opportunities to discuss age verification and related matters, so I shall say no more about that. I am grateful, in particular, to the noble Lord, Lord Allan of Hallam, for supplying the deficiency in my opening remarks about the importance of Amendments 10 and 11, and for explaining just how important that is too. I also thank the noble Lord, Lord Stevenson. It was good of him to say, in the open approach he took on the question of age, that there are issues still to be addressed. I do not think anybody feels that we have yet got this right and I think we are going to have to be very open in that discussion, when we get to it. That is also true about what the noble Lord, Lord Allan of Hallam, said: we have not yet got clarity as to where the age boundary is—I like his expression—for the public space. Where is the point at which, if checks are needed, those checks are to be applied? These are all matters to discuss and I hope noble Lords will forgive me if I do not address each individual contribution separately.
I would like to say something, I hope not unfairly or out of scope, about what was said by the noble Baronesses, Lady Finlay of Llandaff and Lady Kidron, when they used, for the first time this afternoon, the phrase “zero tolerance”, and, at the same time, talked about a risk-based approach. I have, from my own local government experience, a lot of experience of risk-based approaches taken in relation to things—very different, of course, from the internet—such as food safety, where local authorities grade restaurants and food shops and take enforcement action and supervisory action according to their assessment of the risk that those premises present. That is partly to do with their assessment of the management and partly to do with their experience of things that have gone wrong in the past. If you have been found with mouse droppings and you have had to clean up the shop, then you will be examined a great deal more frequently until the enforcement officers are happy; whereas if you are always very well run, you will get an inspection visit maybe only once a year. That is what a risk-based assessment consists of. The important thing to say is that it does not produce zero tolerance or zero outcomes.
I just want to make the point that I was talking about zero tolerance at the end of a ladder of tolerance, just to be clear. Letting a seven-year-old child into an 18-plus dating app or pornographic website is where the zero tolerance is—everything else is a ladder up to that.
I beg the noble Baroness’s pardon; I took that for granted. There are certain things—access to pornography, material encouraging self-harm and things of that sort—where one has to have zero tolerance, but not everything. I am sorry I took that for granted, so I fully accept that I should have made that more explicit in my remarks. Not everything is to be zero-toleranced, so to speak, but certain things are. However, that does not mean that they will not happen. One has to accept that there will be leakage around all this, just as some of the best-run restaurants that have been managed superbly for years will turn out, on occasion, to be the source of food poisoning. One has to accept that this is never going to be as tight as some of the advocates wanted, but with that, I hope I will be given leave to withdraw—
My Lords, it is risky to stand between people and their dinner, but I rise very briefly to welcome these amendments. We should celebrate the good stuff that happens in Committee as well as the challenging stuff. The risk assessments are, I think, the single most positive part of this legislation. Online platforms already do a lot of work trying to understand what risks are taking place on their platforms, which never sees the light of day except when it is leaked by a whistleblower and we then have a very imperfect debate around it.
The fact that platforms will have to do a formal risk assessment and share it with a third-party regulator is huge progress; it will create a very positive dynamic. The fact that the public will be able to see those risk assessments and make their own judgments about which services to use—according to how well they have done them—is, again, a massive public benefit. We should welcome the fact that risk assessments are there and the improvements that this group of amendments makes to them. I hope that was short enough.
I also welcome these amendments, but I have two very brief questions for the Minister. First, in Amendment 27A, it seems that the child risk assessment is limited only to category 1 services and will be published only in the terms of service. As he probably knows, 98% of people do not read terms of service, so I wondered where else we might find this, or whether there is a better way of dealing with it.
My second question is to do with Amendments 64A and 88A. It seems to me—forgive me if I am wrong—that the Bill previously stipulated that all regulated search and user services had to make and keep a written record of any measure taken in compliance with a relevant duty, but now it seems to have rowed back to only category 1 and 2A services. I may be wrong on that, but I would like to check it for the record.
My Lords, the noble Baroness, Lady Kidron, put her finger exactly on the two questions that I wanted to ask: namely, why only category 1 and category 2A, and is there some rowing back involved here? Of course, none of this prejudices the fact that, when we come later in Committee to talk about widening the ambit of risk assessments to material other than that which is specified in the Bill, this kind of transparency would be extremely useful. But the rationale for why it is only category 1 and category 2A in particular would be very useful to hear.
Online Safety Bill Debate
(1 year, 7 months ago)
Lords Chamber
The Twitter scenario, and other scenarios of mixed sites, are some of the most challenging that we have to deal with. But I would say, straightforwardly, “Look, 13% is a big chunk, but the primary purpose of Twitter is not the delivery of pornography”. I use Twitter on a daily basis and I have never seen pornography on it. I understand that it is there and that people can go for it, and that is an issue, but I think people out there would say that for most people, most of the time, the primary purpose of Twitter is not pornography.
What we want to do—in answer to the noble Lord’s second point—is create an incentive for people to be recategorised in the right direction. There is an assumption here that it is all going to be about gaming the system. I actually think that there is an opportunity here for genuine changes. There will be a conversation with Twitter. It will be interesting, given Twitter’s current management—apparently it is run by a dog, so there will be a conversation with the dog that runs Twitter. In that conversation, the regulator, Ofcom, on our behalf, will be saying, “You could change your terms of service and get rid of pornography”. Twitter will say yes or no. If it says no, Ofcom will say, “Well, here are all the things we expect you to do in order to wall off that part of the site”.
That is a really healthy and helpful conversation to have with Twitter. I expect it is listening now and already thinking about how it will respond. But it would expect that kind of treatment and conversation, and I think the public would expect that conversation to be different and better than just saying, “Twitter, you’re Pornhub. We’re just going to treat you like Pornhub”.
That is the distinction. As I say, we have an opportunity to get people to be more robust about either limiting or removing pornography, and I fear that the amendments we have in front of us would actually undermine rather than enhance that effort.
At the centre of this is the question of whether we are trying to block the entire service or block at the level of porn content. It is the purpose of a set of amendments in the names of the noble Lord, Lord Bethell, myself and a number of other noble Lords to do exactly the latter. But I have to say to the noble Baroness that I am very much in sympathy with, first, putting porn behind an age gate; secondly, having a commencement clause; and, thirdly and very importantly—this has not quite come up in the conversation—saying that harms must be on the face of the Bill and that porn is not the only harm. I say, as a major supporter of the Bereaved Families for Online Safety, that “Porn is the only harm children face” would be a horrendous message to come from this House. But there is nothing in the noble Baroness’s amendments, apart from where the action happens, that I disagree with.
I also felt that the noble Baroness made an incredibly important point when she went into detail on Amendment 125A. I will have to read her speech in order to follow it, because it was so detailed, but the main point she made is salient and relates to an earlier conversation: the reason we have Part 5 is that the Government have insisted on this ridiculous thing about user-to-user and search, instead of doing it where harm is. The idea that you have Part 5, which is to stop the loophole of sites that do not have user-to-user, only to find that they can add user-to-user functionality and be another type of site, is quite ludicrous. I say to the Committee and the Minister, who I am sure does not want me to say it, “If you accept Amendment 2, you’d be out of that problem”—because, if a site was likely to be accessed by children and it had harm and we could see the harm, it would be in scope. That is the very common-sense approach. We are where we are, but let us be sensible about making sure the system cannot be gamed, because that would be ludicrous and would undermine everybody’s efforts—those of the Government and of all the campaigners here.
I just want to say one more thing because I see that the noble Lord, Lord Moylan, is back in his place. I want to put on the record that age assurance and identity are two very separate things. I hope that, when we come to debate the package of harms—unfortunately, we are not debating them all together; we are debating harms first, then AV—we get to the bottom of that issue because I am very much in the corner of the noble Lord and the noble Baroness, Lady Fox, on this. Identity and age assurance must not be considered the same thing by the House, and definitely not by the legislation.
Online Safety Bill Debate
(1 year, 7 months ago)
Lords Chamber
My Lords, I also support Amendment 157, which stands in the name of the noble Lord, Lord Pickles, and others, including my own. As the noble Baroness, Lady Deech, indicated, it is specific in the nature of what it concentrates on. The greatest concern that arises through the amendment is with reference to category 2A. It is not necessarily incompatible with what the noble Lord, Lord Moylan, proposes; I do not intend to make any direct further comment on his amendments. While the amendment is specific, it has a resonance with some of the other issues raised on the Bill.
I am sure that everyone within this Committee wants a Bill that is as fit for purpose as possible. The Bill was given widespread support at Second Reading, so there is a determination across the Chamber to achieve that. Where we can make improvements to the Bill, we should do so and, as much as possible, try to future-proof it. The wider resonance is the concern that, if the Bill is to be successful, we need as much consistency and clarity within it as possible, particularly for users. Where we have inconsistent regulation, that runs contrary to the intended purposes of the Bill and creates inadvertent opportunities for loopholes. As such, and as has been indicated, the concern is that in the Bill as it stands, major search engines are effectively treated in some of the regulations on a different basis from user-to-user services. For example, some of the provisions around risk assessment, the third shield and the empowerment tools are different.
As also indicated, we are not talking about some of the minor search engines. We are talking about some of the largest companies in the world, be it Google, Microsoft through Bing, Amazon through its devices or Apple through its Siri voice tool, so it is reasonable that they are brought into line with what is there for user-to-user services. The amendment is therefore appropriate, and the rationale for it is that there is a real-world danger. Mention has been made—we do not want to dwell too long on some of the examples, but I will use just one—of the realms of anti-Semitism, where I have a particular interest. For example, a while ago one search engine offered a prompt suggesting that Jews are evil. It was found that when that prompt was there, searches of that nature increased by 10%, and when it was removed, they were reduced. This is quite fixable, and it applies across a wide range of areas.
One of the ways in which technology has changed, I think for us all, is the danger that it can be abused by people who seek to radicalise others and make them extreme, particularly young children. Gone are the days when some of these extremists or terrorists were lonely individuals in an attic, with no real contact with the outside world, or hanging around occasionally in the high street handing out poorly produced A4 papers with their hateful ideology. There is a global interconnection here and, in particular, search engines and user-to-user services can be used to try to draw young people into their nefarious activities.
I mentioned the example of extremism and radicalisation when it comes to anti-Semitism. I have seen it in my own part of the world, where there is at times an attempt by those who still see violence as the way forward in Northern Ireland to draw new generations of young people into extremist ideology and terrorist acts. There is an attempt to lure in young people and, sadly, search engines have a role within that, which is why we need to see that level of protection. Now, the argument from search engines is that they should have some level of exemption: how can they be held responsible for everything that appears through their searches, or indeed through the web? But in terms of content, the same argument could be made for user-to-user services. It is right, as the proposer of this amendment has indicated, that there are things, such as algorithmic indexing and search prompts, over which they do have a level of control.
The use of algorithms has moved on considerably since my schooldays, as it surely has for everyone in this Committee, and I suspect that none of us thought that they would be used in such a fashion. We need a level of protection through an amendment such as this and, as its proposers, we are not doctrinaire about the precise form this should take. We look, for example, at the provisions within Clause 11—we seek to hear what the Government have to say on that—which could potentially be used to regulate search engines. Ensuring that that power is given, and will be used by Ofcom, would go a long way to addressing many of the concerns.
I think all of us in this Committee are keen to work together to find the right solutions, but we feel that there is a need to make some level of change to the regulations that are required for search engines. None of us in this Committee believes that we will ultimately have a piece of legislation that reflects perfection, but there is a solemn duty on us all to produce legislation that is as fit for purpose and future-proofed as possible, while providing children in particular with the maximum protection in what is at times an ever-changing and sometimes very frightening world.
My Lords, I agree in part with the noble Lord, Lord Moylan. I was the person who said that small was not safe, and I still feel that. I certainly do not think that anything in the Bill will make the online world 100% safe, and I think that very few noble Lords do, so it is important to say that. When we talk about creating a high bar or having zero tolerance, we are talking about ensuring that there is a ladder within the Bill so that the most extreme cases have the greatest force of law trying to attack them. I agree with the noble Lord on that.
I also absolutely agree with the noble Lord about implementation: if it is too complex and difficult, it will be unused and exploited in certain ways, and it will have a bad reputation. The only part of his amendment that I do not agree with is that we should look at size. Through the process of Committee, if we can look at risk rather than size, we will get somewhere. I share his impatience—or his inquiry—about what categories 2A and 2B mean. If category 2A means the most risky and category 2B means those that are less risky, I am with him all the way. We need to look into the definition of what they mean.
Finally, I mentioned several times on Tuesday that we need to look carefully at Ofcom’s risk profiles. Is this the answer to dealing with where risk gets determined, rather than size?
My Lords, I rise to speak along similar lines to the noble Baroness, Lady Kidron. I will address my noble friend Lord Moylan’s comments. I share his concern that we must not make the perfect the enemy of the good but, like the noble Baroness, I do not think that size is the key issue here, because of how tech businesses grow. Tech businesses are rather like building a skyscraper: if you get the foundations wrong, it is almost impossible to change how safe the building is as it goes up and up. As I said earlier this week, small tech businesses can become big very quickly, and, if you design your small tech business with the risks to children in mind at the very beginning, there is a much greater chance that your skyscraper will not wobble as it gets taller. On the other hand, if your small business begins by not taking children into account at all, it is almost impossible to address the problem once it is huge. I fear that this is the problem we face with today’s social media companies.
The noble Baroness, Lady Kidron, hit the nail on the head, as she so often does, in saying that we need to think about risk, rather than size, as the means of differentiating the proportionate response. In Clause 23, which my noble friend seeks to amend, the important phrase is “use proportionate measures” in subsection (2). Provided that we start with a risk assessment and companies are then under the obligation to make proportionate adjustments, that is how you build safe technology companies—it is just like how you build safe buildings.
My Lords, I support Amendment 190 in the name of the noble Lord, Lord Clement-Jones, and Amendment 285 in the name of the noble Lord, Lord Stevenson. That is not to say that I do not have a great deal of sympathy for the incredibly detailed and expert speech we have just heard, but I want to say just a couple of things.
First, I think we need to have a new conversation about privacy in general. The privacy that is imagined by one community is between the state and the individual, and the privacy that we do not have is between individuals and the commercial companies. We live in a 3D world and the argument remains 2D. We cannot do that today, but I agree with the noble Lord that many in the enforcement community do have one hand on human rights, and many in the tech world do care about human rights. However, I do not believe that the tech sector has fully fessed up to its role and the contribution it could make around privacy. I hope that, as part of the debate on the Bill, and the debate that we will have subsequently on the data Bill No. 2, we come to untangle some of the things that they defend—in my view, unnecessarily and unfairly.
I point out that one of the benefits of end-to-end encryption is that it precisely stops companies doing things such as targeted advertising based on the content of people’s communications. Again, I think there is a very strong and correct trend to push companies in that direction.
I thank the noble Lord for the intervention. For those noble Lords who are not following the numbers, Amendment 285, which I support, would prevent general monitoring. I am worried about equivalence and other issues in relation to general monitoring and, quite apart from a principled position against it, I think it is helpful to be explicit.
Ofcom needs to be very careful, and that is what Amendment 190 sets out. It asks whether the alternatives have been thought about, whether the conditions have been thought about, and whether the potential impact has been thought about. That series of questions is essential. I am probably closer to the community that wants to see more powers and more interventions, but I would like that to be in a very monitored and regulated form.
I thank the noble Lord for his contribution. Some of these amendments must be supported because it is worrying for us as a country to have—what did the noble Lord call it?—ambiguity about whether something is possible. I do not think that is a useful ambiguity.
My Lords, my name is attached to Amendment 203 in this group, along with those of the noble Lords, Lord Clement-Jones, Lord Strathcarron and Lord Moylan. I shall speak in general terms about the nature of the group, because it is most usefully addressed through the fundamental issues that arise. I sincerely thank the noble Lord, Lord Allan, for his careful and comprehensive introduction to the group, which gave us a strong foundation. I have crossed out large amounts of what I had written down and will try not to repeat, but rather pick up some points and angles that I think need to be raised.
As was alluded to by the noble Baroness, Lady Kidron, this debate and the range of these amendments shows that the Bill is currently extremely deficient and unclear in this area. It falls to this Committee to get some clarity and cut-through to see where we could end up and change where we are now.
I start by referring to a briefing, which I am sure many noble Lords have received, from a wide range of organisations, including Liberty, Big Brother Watch, the Open Rights Group, Article 19, the Electronic Frontier Foundation, Reset and Fair Vote. It is quite a range of organisations but very much in the human rights space, particularly the digital human rights space. The introduction of the briefing includes a sentence that gets to the heart of why many of us have received so many emails about this element of the Bill:
“None of us want to feel as though someone is looking over our shoulder when we are communicating”.
I want to take advantage of the noble Baroness having raised that point to say that perhaps I was not clear enough in my speech. While I absolutely agree about not everything, everybody, all the time, for my specific concerns around child sexual abuse, abuse of women and so on, we have to find new ways of creating targeted approaches, so that it does not have to be everything, everybody, all the time.
No one in the Committee, or anyone standing behind us who speaks up for children, thinks that this is going to be a silver bullet. It is unacceptable to suggest that we take that position. Much child abuse takes place offline and is then put online, but the exponential way in which it is consumed, created and spread is entirely new because of the services we are talking about. Later in Committee I will explain some of the new ways in which these technologies are creating child abuse—new forms, new technologies, new abuse.
I am sorry to interrupt the noble Baroness. I have made my feelings clear that I am not an end-to-end encryption “breaker”. There are amendments covering this; I believe some of them will come up later in the name of the noble Lord, Lord Russell, on safety by design and so on. I also agree with the noble Baroness that we need more resources in this area for the police, teachers, social workers and so on. However, I do not want child sexual abuse to be a football in this conversation.
I agree with the noble Baroness, which is precisely why I am suggesting that we need to consider whether privacy should be sacrificed totally in relation to the argument around encryption. It is difficult, and I feel awkward saying it. When I mentioned a silver bullet I was not talking about the noble Baroness or any other noble Lords present, but I have heard people say that we need this Bill because it will deal with child abuse. In this group of amendments, I am raising the fact that when I have talked about encryption with people outside of the House they have said that we need to do something to tackle the fact that these messages are being sent around. It is not just child abuse; it is also terrorism. There is a range of difficult situations.
Things can go wrong with this, and that is what I was trying to raise. For example, we have a situation where some companies are considering using, or are being asked to use, machine learning to detect nudity. Just last year, a father lost his Google account and was reported to the police for sending a naked photo of his child to the doctor for medical reasons. I am raising these as examples of the problems that we have to consider.
Child abuse is so abhorrent that we will do anything to protect children, but let me say this to the Committee, as it is where the point on privacy lies: children are largely abused in their homes, but as far as I understand it we are not as yet arguing that the state should put CCTV cameras in every home for 24/7 surveillance to stop child abuse. That does not mean that we are glib or that we do not understand the importance of child abuse; it means that we understand the privacy of your home. There are specialist services that can intervene when they think there is a problem. I am worried about the possibility of putting a CCTV camera in everyone’s phone, which is the danger of going down this route.
My final point is that these services, such as WhatsApp, will potentially leave the UK. It is important to note that. I agree with the noble Lord, Lord Allan: this is not like threatening to storm off. It is not done in any kind of pique in that way. In putting enormous pressure on these platforms to scan communications, we must remember that they are global platforms. They have a system that works for billions of people all around the world. A relatively small market such as the UK is not something for which they would compromise their billions of users around the world. As I have explained, they would not put up with it if the Chinese state said, “We have to see people’s messages”. They would just say, “We are encrypted services”. They would walk out of China and we would all say, “Well done”. There is a real, strong possibility of these services leaving the UK so we must be very careful.
I am hesitant to give too tight a definition, because we want to remain technology neutral and make sure that we are keeping an open mind to developing changes. I will think about that and write to the noble Lord. The best endeavours will inevitably change over time as new technological solutions present themselves. I point to the resourcefulness of the sector in identifying those, but I will see whether there is anything more I can add.
While the Minister is reflecting, I note that the words “best endeavours” are always a bit of a worry. The noble Lord, Lord Allan, made the good point that once it is on your phone, you are in trouble and you must report it. But the frustration of many people outside this Chamber, if it has been on a phone and cannot be dealt with, is what comes next: how to trace the journey of that piece of material without breaking encryption. I speak to the tech companies very often—indeed, I used to speak to the noble Lord, Lord Allan, when he was in post at what was then Facebook—and that is the question we would like answered in this Committee, because the response that “It is nothing to do with us” is where our sympathy stops.
Online Safety Bill Debate
(1 year, 7 months ago)
Lords Chamber
My Lords, this group of amendments concerns terms of service. All the amendments either contain the phrase “terms of service” or imply that we wish to see more use of that phrase in the Bill, and they seek to tidy up some of the other bits around it that have crept into the Bill.
Why are we doing that? Rather late in the day, terms of service have suddenly become a key fulcrum: much of how people use social media and other services on the internet, and how they view the material coming to them, will now be governed by those terms. With the loss of the adult “legal but harmful” provisions, we also lost a considerable amount of what would have been primary legislation, which no doubt would have been backed up by codes of practice. What we are left with, and what we need to look at very closely, is the triple shield at the heart of the new obligations on companies and, in particular, their terms of service. That is set out primarily in Clauses 64, 65, 66 and 67, and it is the subject to which my amendments largely refer.
Users of these services would be more confident that the Government have got their focus on terms of service right if those terms did what they say on the tin, as the expression goes. If terms of service were written and implemented so that material which should be taken down was indeed taken down, they would become a reliable means of judging whether or not a service is the one people want, and the free market would be seen to be working to empower people to make their own decisions about what level of risk they can assume by using a service. That is a major change from the way the Bill was originally envisaged. Because this was done late, we are left with one or two of the matters to which I have referred already, which means that the amendments focus on changing what is currently in the Bill.
It is also true that the changes were not consulted upon; I do not recall there being any document from government about whether this was a good way forward. The changes were certainly not considered by the Joint Committee, of which several of those present were members—we did not discuss it in the Joint Committee and made no recommendation on it. The level of scrutiny we have enjoyed on the Bill has been absent in this area. The right reverend Prelate the Bishop of Oxford will speak shortly to amendments about terms of service, and we will be able to come back to it. I think it would have been appropriate had the earlier amendment in the name of the noble Lord, Lord Pickles, been in this group because the issue was the terms of service, even though it had many other elements that were important and that we did discuss.
The main focus of my speech is that the Government have not managed to link this new idea of terms of service and the responsibilities that will flow from that to the rest of the Bill. It does not seem to fit into the overall architecture. For example, it is not a design feature, and does not seem to work through in that way. This is a largely self-contained series of clauses. We are trying to ask some of the world’s largest companies, on behalf of the people who use them, to do things on an almost contractual basis. Terms of service are not a contract that you sign up to, but you certainly click something—or occasionally click it, if you remember to—by which you consent to the company operating in a particular set of ways. In a sense, that is a contract, but is it really a contract? At the heart of that contract between companies and users is whether the terms of service are well captured in the way the Bill is organised. I think there are gaps.
The Bill does have something that we welcome and want to hold on to, which is that the process under which the risks are assessed and decisions taken about how companies operate and how Ofcom relates to those decisions is about the design and operation of the service—both the design and the operation, something that the noble Baroness, Lady Kidron, is very keen to emphasise at all times. It all starts and ends with design, and the operation is a consequence of design choices. Other noble Baronesses have mentioned in the debate that small companies get it right and so, when they grow, can be confident that what they are doing is something that is worth doing. Design, and operating that design to make a service, is really important. Are terms of service part of that or are they different, and does it matter? It seems to me that they are downstream from the design: something can be designed and then have terms of service that were not really part of the original process. What is happening here?
My Amendments 16, 21, 66DA, 75 and 197 would ensure that the terms of service are included within the list of matters that constitute the “design and operation” of the service at each point that it occurs. I have had to go right through the Bill to add it in certain areas—in a rather irritating way, I am sure, for the Bill team—because sometimes we find that what I think should be a term of service is actually described as something else, such as “a publicly available statement”, whatever that is. It would be an advantage if we went through it again, defined terms of service and made sure that that was what we were talking about.
Amendments 70 to 72, 79 to 81 and 174 seek to help the Government and their officials with tidying up the drafting, which probably has not been scrutinised enough to pick up these issues. It may not matter, at the end of the day, but what is in the Bill is going to be law and we may as well try to get it right as best we can. I am sure the Minister will say we really do not need to worry about this because it is all about risks and outcomes, and that if a company does not protect children, or has illegal content, or the user-empowerment duties—the toggling—do not work, Ofcom will find a way of driving the company to sort it out. What does that mean in practice? Does it mean that Ofcom has a role in defining what terms of service are? That is not in the Bill and may not reach the Bill, but it will be a bit of a problem if we do not resolve what we mean by it, even if not by changing the legislation.
If the Minister were to disagree with my approach, it would be quite nice to have that said at the Dispatch Box so that we can look at it. The key question is: are terms of service an integral part of the design and operation of a service and, if so, can we extend the term to make sure that all aspects of the services people consume are covered by adequate and effective terms of service? There will probably be division in the way we approach this because, clearly, whether they are called terms of service or have another name, the actual enforcement of the illegal content and children’s duties will be effected by Ofcom, irrespective of the wording of the Bill—I do not want to question that. However, there is obviously an overlap into questions about adults and others who are affected by the terms of service. If you cannot identify what the terms of service say in relation to something you might not wish to receive, because the terms of service are imprecise, how on earth are you going to operate the services, the toggles and so on, around it? I accept that there will be pressure within the market to get these terms of service right and that there will be a lot of dialogue with Ofcom. But it would be good if the position of terms of service were clarified in the Bill before it becomes law, and if Ofcom’s powers in relation to them were clarified too: does it or does it not have the chance to review terms of service if they turn out to be ineffective in practice? If so, how will this work out in practice in terms of what people will be able to do about it, either through redress or by taking the issue to court? I beg to move.
I support these amendments, which were set out wonderfully by the noble Lord, Lord Stevenson. I want to raise a point made on Tuesday when the noble Baroness, Lady Merron, said that only 3% of people read terms of service and I said that 98% of people do not read them, so one of us is wrong, but I think the direction of travel is clear. She also used a very interesting phrase about prominence, and I want to use this opportunity to ask the Minister whether there is some lever whereby Ofcom can insist on prominence for certain sorts of material—a hierarchy of information, if you like—because these are really important pieces of information, buried in the wrong place so that even 2% or 3% of people may not find them.
My Lords, I am very pleased that the noble Lord, Lord Stevenson, has given us the opportunity to talk about terms of service, and I will make three points again, in a shorter intervention than on the previous group.
First, terms of service are critical: their impact, in terms of the amount of intervention that occurs on content, will generally be much greater than that of the law. Terms of service create, in effect, a body of private law for a community, and they are nearly always a superset of the public law—indeed, it is very common for the first item of a terms of service to say, “You must not do anything illegal”. This raises the interesting question of “illegal where?”; what it generally means is that you must not do anything illegal in the jurisdiction in which the service provider is established. The terms of service will say, “Do not do anything illegal”, and then give a whole list of other things, beyond illegality, that you cannot do on the platform. I think this is right, because the two have different characteristics.
My Lords, before speaking to my Amendment 137, I want to put a marker down to say that I strongly support Amendment 135 in the name of my noble friend Lord Moylan. I will not repeat anything that he said but I agree with absolutely every word.
Amendment 137 is in my name and that of my noble and learned friend Lord Garnier and the noble Lord, Lord Moore of Etchingham. This amendment is one of five which I have tabled with the purpose of meeting a core purpose of the Bill. In the words of my noble friend the Minister in response to Amendment 1, it is
“to protect users of all ages from being exposed to illegal content”—[Official Report, 19/4/23; col. 724.]
—in short, to ensure that what is illegal offline is illegal online.
If accepted, this small group of amendments would, I strongly believe, make a really important difference to millions of people’s lives—people who are not necessarily listed in Clause 12. I therefore ask the Committee to allow me to briefly demonstrate the need for these amendments through the prism of millions of people and their families working and living in rural areas. They are often quite isolated and working alone in remote communities, and are increasingly at risk of or are already suffering awful online abuse and harassment. This abuse often goes way beyond suffering; it destroys businesses and a way of life.
I find it extraordinary that the Bill seems to be absent of anything to do with livelihoods. It is all about focusing on feelings, which of course are important—and the most important focus is children—but people’s businesses and livelihoods are being destroyed through abuse online.
Research carried out by the Countryside Alliance has revealed a deeply disturbing trend online that appears disproportionately to affect people who live in rural areas and are involved in rural pursuits. Beyond direct abuse, a far more insidious tactic that activists have adopted involves targeting businesses involved in activities of which they disapprove, such as livestock farming or hosting shoots. They post fake reviews on platforms including Tripadvisor and Google Maps, and their aim is to damage the victims and their reputations by, to put it colloquially, trashing their businesses and thereby putting off potential customers. This is what some call trolling.
Let me be clear that I absolutely defend, to my core, the right to freedom of expression and speech, and indeed the right to offend. Just upsetting someone is way below the bar for the Bill, or any legislation. I am deeply concerned about the hate crime—or non-crime—issue we debated yesterday; in fact, I put off reading the debate because I so disagree with this nonsense from the College of Policing.
Writing a negative review directly based on a negative experience is entirely acceptable in my book, albeit unpleasant for the business targeted. My amendments seek to address something far more heinous and wrong, which, to date, can only be addressed as libel and, therefore, through the civil courts. Colleagues in both your Lordships’ House and in another place shared with me tremendously upsetting examples from their constituents and in their neighbourhoods of how anonymous activists are ruining the lives of hard-working people who love this country and are going the extra mile to defend our culture, historic ways of life and freedoms.
Fortunately, through the Bill, the Government are taking an important step by introducing a criminal offence of false communications. With the leave of the Committee, I will briefly cite and explain the other amendments in order to make sense of Amendment 137. One of the challenges of the offence of false communications is the need to recognise that so much of the harm which makes the Bill necessary is the consequence of allowing anonymity. It is so easy to destroy and debilitate others by remaining anonymous and using false communications. Why be anonymous, if you have any spine at all to stand up for what you believe? It is not possible offline—when writing a letter to a newspaper, for example—so why is it acceptable online? The usual tech business excuse of protecting individuals in rogue states is no longer acceptable, given the level of harm that anonymity causes here at home.
Therefore, my Amendment 106 seeks to address the appalling effect of harm, of whatever nature, arising from false or threatening communications committed by unverified or anonymous users—this is what we refer to as trolling. Amendments 266 and 267, in my name and those of my noble and learned friend Lord Garnier and my noble friend Lord Leicester, would widen the scope of this new and welcome offence of false communications to include financial harm, and harm to the subject of the false message arising from its communication to third parties.
The Bill will have failed unless we act beyond feelings and harm to the person and include loss of livelihood. As I said, I am amazed that it is not front and centre of the Bill after safety for our children. Amendment 268, also supported by my noble and learned friend, would bring within the scope of the communications offences the instigation of such offences by others—for example, Twitter storms, which can involve inciting others to make threats without doing so directly. Currently, we are unsure whether encouraging others to spread false information—for example, by posting fake reviews of businesses for ideologically motivated reasons—would become an offence under the Bill. We believe that it should, and my Amendment 268 would address this issue.
I turn briefly to the specifics of my Amendment 137. Schedule 7 lists a set of “priority offences” that social media platforms must act to prevent, and they must remove messages giving rise to certain offences. However, the list does not include the new communications offences created elsewhere in Part 10. We believe that this is a glaring anomaly. If there is a reason why the new communications offences are not listed, it is important that we understand why. I hope that my noble friend the Minister can explain.
The practical effect of Amendment 137 would be to include the communications offences introduced in the Bill and communications giving rise to them within the definition of “relevant offence” and “priority illegal content” for the purposes of Clause 53(4) and (7) and otherwise.
I ask the Committee to have a level of imagination here because I have been asked to read the speech of the noble Viscount, Lord Colville—
I do not know who advised the noble Baroness—and forgive me for getting up and getting all former Leader on her—but this is a practice that we seem to have adopted in the last couple of years and that I find very odd. It is perfectly proper for the noble Baroness to deploy the noble Viscount’s arguments, but to read his speech is completely in contravention of our guidance.
I beg the pardon of the Committee. I asked about it and was misinformed; I will do as the noble Baroness says.
The noble Viscount, Lord Colville, is unable to be with us. He put his name to Amendments 273, 275, 277 and 280. His concern is that the Bill sets the threshold for illegality too low and that in spite of the direction provided by Clause 170, the standards for determining illegality are too vague.
I will make a couple of points on that thought. Clause 170(6) directs that a provider must have
“reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied”,
but that does not mean that the platform has to be certain that the content is illegal before it takes it down. This is concerning when you take it in combination with what or who will make judgments on illegality.
If a human moderator makes the decision, it will depend on the resources and time available to them as to how much information they gather in order to make that judgment. Unlike in a court case, when a wide range of information and context can be gathered, when it comes to decisions about content online, these resources are very rarely available to human moderators, who have a vast amount of content to get through.
If an automated system makes the judgment, it is very well established that algorithms are not good at context—the Communications and Digital Committee took evidence on this repeatedly when I was on it. AI simply uses the information available in the content itself to make a decision, which can lead to significant missteps. Clause 170(3) provides the requirement for the decision-makers to judge whether there is a defence for the content. In the context of algorithms, it is very unclear how they will come to such a judgment from the content itself.
I understand that these are probing amendments, but I think the concern is that the vagueness of the definition will lead to too much content being taken down. This concern was supported by Parliament’s Joint Committee on Human Rights, which wrote to the former Culture Secretary, Nadine Dorries, on that matter. I apologise again.
My Lords, I support the amendments in this group that probe how removing illegal material is understood and will be used under the Bill. The noble Lord, Lord Moylan, explained a lot of my concerns, as indeed did the noble Viscount, Lord Colville, via his avatar. We have heard a range of very interesting contributions that need to be taken seriously by the Government. I have put my name to a number of amendments.
The identification of illegal material might be clear and obvious in some cases—even many cases. It sounds so black and white: “Don’t publish illegal material”. But defining communications of this nature can be highly complex, so much so that it is traditionally reserved for law enforcement bodies and the judicial system. We have already heard from the noble Lord, Lord Moylan, that, despite Home Secretaries, this House, regulations and all sorts of laws having indicated that non-crime hate incidents, for example, should not be pursued by the police, they continue to pursue them as though they are criminal acts. That is exactly the kind of issue we have.
My Lords, it is genuinely difficult to summarise such a wide-ranging debate, which was of a very high standard. Only one genuinely bright idea has emerged from the whole thing: as we go through Committee, each group of amendments should be introduced by the noble Lord, Lord Allan of Hallam, because it is only after I have heard his contribution on each occasion that I have begun to understand the full complexity of what I have been saying. I suspect I am not alone in that and that we could all benefit from hearing the noble Lord before getting to our feet. That is not meant to sound the slightest bit arch; it is absolutely genuine.
The debate expressed a very wide range of concerns. Concerns about gang grooming and recruiting were expressed on behalf of the right reverend Prelate the Bishop of Derby, and my noble friend Lady Buscombe expressed concerns about the trolling of country businesses. However, I think it is fair to say that most speakers focused on the following issues. The first was the definition of illegality, which was so well explicated by the noble Lord, Lord Allan of Hallam. The second was the judgment bar that providers have to pass to establish whether something should be taken down. The third was the legislative mandating of private foreign companies to censor free speech rights that are so hard-won here in this country. These are the things that mainly concern us.
I was delighted that I found myself agreeing so much with what the noble Baroness, Lady Kidron, said, even though she was speaking in another voice or on behalf of another person. If her own sentiments coincide with the sentiments of the noble Viscount—
I am sorry to intrude, but I must say now on the record that I was speaking on my own behalf. The complication of measuring and those particular things are terribly important to establish, so I am once again happy to agree with the noble Lord.
I am delighted to hear the noble Baroness say that; it shows that the pool of common ground we share is widening every time we get to our feet. However, the pool is not widening particularly—at least in respect of myself; other noble Lords may have been greatly reassured—as regards my noble friend the Minister, who, I am afraid, has not in any sense addressed the issues about free speech that I and many other noble Lords raised. On some issues, we in the Committee are finding a consensus that is drifting away from the Minister. We probably need to put our heads together more closely on some of these issues as Committee proceeds.
My noble friend also did not say anything that satisfied me in respect of the practical operation of these obligations for smaller sites. He speaks smoothly and persuasively of risk-based proactive approaches without acknowledging that, for a large number of sites, this legislation will mean a complete re-engineering of their business model. Take, for example, Wikipedia operating in a minority language: Welsh Wikipedia is the largest Welsh-language website in the world. If its model is to monitor what is put out by the community and correct it as it goes along, rather than having a model designed in advance to prevent things being put there in the first place, it is very likely to close down. If that is one of the consequences of this Bill, the Government will soon hear about it.
Finally, although I remain concerned about public order offences, I have to say to the Minister that if he is so concerned about the dissemination of alarm among the population under the provisions of the Public Order Act, what does he think that His Majesty’s Government were doing on Sunday at 3 pm? I beg leave to withdraw the amendment.
My Lords, this amendment and Amendments 74, 93 and 123 are part of a larger group that have been submitted as a package loosely referred to as the AV and harms package. They have been the subject of much private debate with the Government, for which we are grateful, and among parliamentarians, and have featured prominently in the media. The amendments are in my name and those of the noble Lord, Lord Bethell, the right reverend Prelate the Bishop of Oxford and the noble Lord, Lord Stevenson, but enjoy the support of a vast array of Members of both Houses. I thank all those who have voiced their support.
The full package of amendments defines and sets out the rules of the road for age assurance, including the timing of its introduction, and the definition of terms such as age verification and age assurance. They introduce the concept of measuring the efficacy of systems with one eye on the future so that we as parliamentarians can indicate where and when we feel that proportionality is appropriate and where it is simply not—for example, in relation to pornography. In parallel, we have developed a schedule of harms, which garners rather fewer column inches but is equally important in establishing Parliament’s intention. It is that schedule of harms that is up for debate today.
Before I lay out the amendment, I thank the 26 children’s charities which have so firmly got behind this package and acknowledge, in particular, Barnardo’s, CEASE and 5Rights, of which I am chair, which have worked tirelessly to ensure that the full expertise of children’s charities has been embedded in these amendments. I also pay tribute to the noble Baroness, Lady Benjamin, who in this area of policy has shown us all the way.
The key amendment in this group is Amendment 93, which would place a schedule of harms to children in the Bill. There are several reasons for doing so, the primary one being that by putting them in the Bill we are stating the intention of Parliament, which gives clarity to companies and underlines the authority of Ofcom to act on these matters. Amendments 20, 74 and 123 ensure that the schedule is mirrored in risk assessments and tasks Ofcom with updating its guidance every six months to capture new and emerging harms, and as such are self-evident.
The proposed harms schedule is centred around the four Cs, a widely used and understood taxonomy of harm used in legislation and regulation around the globe. Importantly, rather than articulate individual harms that may change over time, it sets its sight on categories of harm: content, contact, conduct and contract, which is sometimes referred to as commercial harm. It also accounts for cumulative harms, where two or more risk factors create a harm that is greater than any single harm or is uniquely created by the combination. The Government’s argument against the four Cs is that they are not future-proof, which I find curious since the very structure of the four Cs is to introduce broad categories of harm to which harms can be added, particularly emerging harms. By contrast, the Government are adding an ever-growing list of individual harms.
I wish to make three points in favour of our package of amendments relating first to language, secondly to the nature of the digital world, and finally to clarity of purpose. It is a great weakness of the Bill that it consistently introduces new concepts and language—for example, the terms “primary priority content”, “priority content” and “non-designated content”. These are not terms used in other similar Bills across the globe, they are not evident in current UK law and they do not correlate with established regimes, such as equalities legislation or children’s rights under the convention, more of which in group 7.
The question of language is non-trivial. It is the central concern of those who fight CSAE around the world, who frequently find that enforcement against perpetrators, or takedown, is blocked by legal systems that define child sexual abuse material differently—not differently in some theoretical sense, but because the same image can be categorised differently in two countries and then be a barrier to enforcement across jurisdictions. Leaders from WeProtect, the enforcement community and representatives I recently met from Africa, South America and Asia have all made this point. It undermines the concept of UK leadership in child protection that we are wilfully and deliberately rejecting accepted language—language embedded in treaties, international agreements and multilateral organisations—to start again with our own, very likely with the same confused outcome.
Secondly, I am concerned that while both the Bill and the digital world are predicated on system design, the harms are all articulated as content with insufficient emphasis on systems harms, such as careless recommendations, spreading engagement and the sector-wide focus on maximising engagement, which are the very things that create the toxic and dangerous environment for children. I know, because we have discussed it, that the Minister will say that this is all in the risk assessment, but the risk assessment asks regulated companies to assess how a number of features contribute to harm, mostly expressed as content harm.
What goes through my mind is the spectre of Meta’s legal team, which I watched for several days during Molly Russell’s inquest; they stood in a court of law and insisted that hundreds, in fact thousands, of images of cut bodies and depressive messages did not constitute harm. Rather, they regarded them as cries for help or below the bar of harm as they interpreted it. Similarly, there was material that featured videos of people jumping off buildings—some of them sped-up versions of movie clips edited to suggest that jumping was freedom—and I can imagine a similar argument that says that kind of material cannot be considered harmful, because in another context it is completely legitimate. Yet this material was sent to Molly at scale.
My Lords, I really appreciated the contribution from the noble Baroness, Lady Ritchie of Downpatrick, because she asked a lot of questions about this group of amendments. Although I might be motivated by different reasons, I found it difficult to fully understand the impact of the amendments, so I too want to ask a set of questions.
Harm is defined in the Bill as “physical or psychological harm”, and there is no further explanation. I can understand the frustration with that and the attempts therefore to use what are described as the
“widely understood and used 4 Cs of online risk to children”.
They are not widely understood by me, and I have ploughed my way through it. I might well have misunderstood lots in it, but I want to look at and perhaps challenge some of the contents.
I was glad that Amendment 20 recognises the level of risk of harm to different age groups. That concerns me all the time when we talk about children and young people and then end up treating four year-olds, 14 year-olds and 18 year-olds the same. I am glad that that is there, and I hope that we will look at it again in future.
I want to concentrate on Amendment 93 and reflect and comment more generally on the problem of a definition, or a lack of definition, of harm in the Bill. For the last several years that we have been considering bringing this Bill to this House and to Parliament, I have been worried about the definition of psychological harm. That is largely because this category has become ever more expansive and quite subjective in our therapeutic age. It is a matter of some discussion and quite detailed work by psychologists and professionals, who worry that there is an ever-expanding concept of what is considered harmful and of what psychological harm really means.
As an illustration, I was invited recently to speak to a group of sixth-formers and was discussing things such as trigger warnings and so on. They said, “Well, you know, you’ve got to understand what it’s like”—they were 16 year-olds. “When we encounter certain material, it makes us have PTSD”. I was thinking, “No, it doesn’t really, does it?” Post-traumatic stress disorder is something that you might well gain if you have been in the middle of a war zone. The whole concept of triggering came from psychological and medical insights from the First World War, which you can understand. If you hear a car backfiring, you think it is somebody shooting at you. But the idea here is that we should have trigger warnings on great works of literature and that if we do not it will lead to PTSD.
I am not being glib, because an expanded, elastic and pathologised view of harm is being used quite cavalierly and casually in relation to young people and protecting them, often by the young people themselves. It is routinely used to close down speech as part of the cancel culture wars, which, as noble Lords know, I am interested in. Is there not a danger that this concept of harm is not as obvious as we think, and that the psychological harm issue makes it even more complicated?
The other thing is that Amendment 93 says:
“The harms in this Schedule are a non-exhaustive list of categories and other categories may be relevant”.
As with the discussion on whose judgment decides the threshold for removing illegal material, I think that judging what is harmful is even more tricky for the young in relation to psychological harm. I was reminded of that when the noble Baroness, Lady Kidron, complained that what she considered to be obviously and self-evidently harmful, Meta did not. I wondered whether that is just the case with Meta, or whether views will differ when it comes to—
The report found—I will not give a direct quotation—that social media contributed to the death of Molly Russell, so it was the court’s judgment, not mine, that Meta’s position was indefensible.
I completely understand that; I was making the point that there will be disagreements in judgments. In that instance, it was resolved by a court, but we are talking about a situation where I am not sure how the judgment is made.
In these amendments, there are lists of particular harms—a variety are named, including self-harm—and I wanted to provide some counterexamples of what I consider to be harms. I have been inundated by algorithmic adverts for “Naked Education” on Channel 4, maybe because of the algorithms I am on. I think that the programme is irresponsible; I say that having watched it, rather than just having read a headline. Channel 4 is presenting this programme, which features naked adults and children, as educational, on the basis that it introduces children to the naked body. I think it is harmful for children and that it should not be on the television, but it is advertised on social media—I have seen quite a lot of it.
The greatest example of self-harm we encounter at present is when gender dysphoric teenagers—as well as some younger than teenagers; they are predominantly young women—are affirmed by adults, as a kind of social contagion, into taking body-changing and body-damaging hormones and performing self-mutilation, whether by breast binding or double mastectomies, which is advertised and praised by adults. That is incredibly harmful for young people, and it is reflected online a lot, because much of this is discussed, advertised or promoted online.
This is related to the earlier contributions, because I am asking: should those be added to the list of obvious harms? Although not many noble Lords are in the House now, if there were many more here, they would object to what I am saying by stating, “That is not harmful at all. What is harmful is what you’re saying, Baroness Fox, because you’re causing psychological harm to all those young people by being transphobic”. I am raising these matters because we think we all agree that there is a consensus on what is harmful material online for young people, but it is not that straightforward.
The amendment states that the Bill should target any platform that posts
“links to, or … encourages child users to seek”
out “dangerous or illegal activity”. I understand “illegal activity”, but on “dangerous” activities, I assume that we do not mean extreme sports, mountain climbing and so on, which are dangerous—that comes to mind probably because I have spent too much time with young people who spend their whole time looking at those things. I worry about the unintended consequences of things being banned or misinterpreted in that way.
I appreciate that this is the case we all have in the back of our minds. I am asking whether, when Meta says it is content agnostic, the Bill is the appropriate place for us to list the topics that we consider harmful. If we are to do that, I was giving examples of contentious, harmful topics. I might have got this wrong—
I will answer the noble Baroness more completely when I wind up, but I just want to say that she is missing the point of the schedule a little. Like her, I am concerned about the way we concentrate on content harms, but she is bringing it back to content harms. If she looks at it carefully, a lot of the provisions are about contact and conduct: it is about how the system is pushing children to do certain things and pushing them to certain places. It is about how things come together, and I think she is missing the point by keeping going back to individual pieces of content. I do not want to take the place of the Minister, but this is a systems and processes Bill; it is not going to deal with individual pieces of content in that way. It asks, “Are you creating these toxic environments for children? Are you delivering this at scale?” and that is the way we must look at this amendment.
I will finish here, because we have to get on, but I did not introduce content; it is in the four Cs. One of the four Cs is “content” and I am reacting to amendments tabled by the noble Baroness. I do not think I am harping on about content; I was responding to amendments in which content was one of the key elements.
My Lords, I speak in support of these amendments with hope in my heart. I thank the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, for leading the charge with such vigour, passion and determination: I am with them all the way.
The Government have said that the purpose of the Bill is to protect children, and it rests on our shoulders to make sure it delivers on this mission. Last week, on the first day in Committee, the Minister said:
“Through their duties of care, all platforms will be required proactively to identify and manage risk factors associated with their services in order to ensure that users do not encounter illegal content and that children are protected from harmful content. To achieve this, they will need to design their services to reduce the risk of harmful content or activity occurring and take swift action if it does”.—[Official Report, 19/4/23; cols. 274-75.]
This is excellent and I thank the Government for saying it. But the full range of harms and risk to children will not be mitigated by services if they do not know what they are expected to risk-assess for and if they must wait for secondary legislation for this guidance.
The comprehensive range of harms children face every day is not reflected in the Bill. This includes sexual content that does not meet the threshold of pornography. This was highlighted recently in an investigation into TikTok by the Telegraph, which found that a 13 year-old boy was recommended a video about the top 10 porn-making countries, and that a 13 year-old girl was shown a livestream of a pornography actor in her underwear answering questions from viewers. This content is being marketed to children without a user even seeking out pornographic content, but this would still be allowed under the Bill.
Furthermore, high-risk challenges, such as the Benadryl and blackout challenges, which encourage dangerous behaviour on TikTok, are not dealt with in the Bill. Some features, such as the ability of children to share their location, are not dealt with either. I declare an interest as vice-president of Barnardo’s, which has highlighted how these features can be exploited by organised criminal gangs that sexually exploit children to keep tabs on them and trap them in a cycle of exploitation.
It cannot be right that the user-empowerment duties in the Bill include a list of harmful content that services must enable adults to toggle off, yet the Government refuse to produce this list for children. Instead, we have to wait for secondary legislation to outline harms to children, causing further delay to the enforcement of services’ safety duties. Perhaps the Minister can explain why this is.
The four Cs framework of harm, as set out in these amendments, is a robust framework that will ensure service risk assessments consider the full range of harms children face. I will repeat it once again: childhood lasts a lifetime, so we cannot fail children any longer. Protections are needed now, not in years to come. We have waited far too long for this. Protections need to be fast-tracked and must be included in the Bill. That is why I fully support these amendments.
I was about to list the four Cs briefly in order, which will bring me on to commercial or contract risk. Perhaps I may do that and return to those points.
I know that there have been concerns about whether the specific risks highlighted in the new schedule will be addressed by the Bill. In terms of the four Cs category of content risks, there are specific duties for providers to protect children from illegal content, such as content that intentionally assists suicide, as well as content that is harmful to children, such as pornography. Regarding conduct risks, the child safety duties cover harmful conduct or activity such as online bullying or abuse and, under the illegal content safety duties, offences relating to harassment, stalking and inciting violence.
With regard to commercial or contract risks, providers specifically have to assess the risks to children from the design and operation of their service, including their business model and governance under the illegal content and child safety duties. In relation to contact risks, as part of the child safety risk assessment, providers will need specifically to assess contact risks of functionalities that enable adults to search for and contact other users, including children, in a way that was set out by my noble friend Lord Bethell. This will protect children from harms such as harassment and abuse, and, under the illegal content safety duties, all forms of child sexual exploitation and abuse, including grooming.
I agree that content, although unfathomable to the outside world, is defined as the Minister says. However, does that mean that when we see that
“primary priority content harmful to children”
will be put in regulations by the Secretary of State under Clause 54(2)—ditto Clause 54(3) and (4)—we will see those contact risks, conduct risks and commercial risks listed as primary priority, priority and non-designated harms?
I have tried to outline the Bill’s definition of content, which I think will give some reassurance that other concerns that noble Lords have raised are covered. I will turn in a moment to address priority and primary priority content, if the noble Baroness will allow me to do that; she can then perhaps intervene again if I have not done so to her satisfaction. I want to set that out and try to keep track of all the questions which have been posed as I do so.
For now, I know there have been concerns from some noble Lords that if functionalities are not labelled as harms in the legislation they would not be addressed by providers, and I reassure your Lordships’ House that this is not the case. There is an important distinction between content and other risk factors such as an algorithm, which, without content, cannot cause harm to a child. That is why functionalities are not covered by the categories of primary priority and priority content which is harmful to children. The Bill sets out a comprehensive risk assessment process which will cover content or activity that poses a risk of harm to children and other factors, such as functionality, which may increase the risk of harm. As such, the existing children’s risk assessment criteria already cover many of the changes proposed in this amendment. For example, the duties already require service providers to assess the risk of harm to children from their business model and governance. They also require providers to consider how a comprehensive range of functionalities affect risk, how the service is used and how the use of algorithms could increase the risks to children.
Turning to the examples of harmful content set out in the proposed new schedule, I am happy to reassure the noble Baroness and other noble Lords that the Government’s proposed list of primary priority and priority content covers a significant amount of this content. In her opening speech she asked about cumulative harm—that is, content sent many times or content which is harmful due to the manner of its dissemination. We will look at that in detail on the next group as well, but I will respond to the points she made earlier now. The definition of harm in the Bill under Clause 205 makes it clear that physical or psychological harm may arise from the fact or manner of dissemination of the content, not just the nature of the content—content which is not harmful per se, but which, if sent to a child many times, for example by an algorithm, would meet the Bill’s threshold for content that is harmful to children. Companies will have to consider this as a fundamental part of their risk assessment, including, for example, how the dissemination of content via algorithmic recommendations may increase the risk of harm, and they will need to put in place proportionate and age-appropriate measures to manage and mitigate the risks they identify. I followed the exchanges between the noble Baronesses, Lady Kidron and Lady Fox, and I make it clear that the approach set out by the Bill will mean that companies cannot avoid tackling the kind of awful content which Molly Russell saw and the harmful algorithms which pushed that content relentlessly at her.
This point on cumulative harm was picked up by my noble friend Lord Bethell. The Bill will address cumulative risk where it is the result of a combination of high-risk functionality, such as live streaming, or rewards in service by way of payment or non-financial reward. This will initially be identified through Ofcom’s sector risk assessments, and Ofcom’s risk profiles and risk assessment guidance will reflect where a combination of risk in functionalities such as these can drive up the risk of harm to children. Service providers will have to take Ofcom’s risk profiles into account in their own risk assessments for content which is illegal or harmful to children. The actions that companies will be required to take under their risk assessment duties in the Bill, and the safety measures they will be required to put in place to manage their services’ risks, will consider this bigger-picture risk profile.
The amendments of the noble Baroness, Lady Kidron, would remove references to primary priority and priority harmful content to children from the child risk assessment duties, which we fear would undermine the effectiveness of the child safety duties as currently drafted. That includes the duty for user-to-user providers to prevent children encountering primary priority harms, such as pornography and content that promotes self-harm or suicide, as well as the duty to put in place age-appropriate measures to protect children from other harmful content and activity. As a result, we fear these amendments could remove the requirement for an age-appropriate approach to protecting children online and make the requirement to prevent children accessing primary priority content less clear.
The noble Baroness, Lady Kidron, asked in her opening remarks about emerging harms, which she was right to do. As noble Lords know, the Bill has been designed to respond as rapidly as possible to new and emerging harms. First, the primary priority and priority lists of content can be updated by the Secretary of State. Secondly, it is important to remember the function of non-designated content that is harmful to children in the Bill—that is, content that meets the threshold of content harmful to children but is not on the lists designated by the Government. Companies are required to understand and identify this kind of content and, crucially, report it to Ofcom. Thirdly, this will inform the actions of Ofcom itself in its review and report duties under Clause 56, where it is required to review the incidence of harmful content and the severity of harm experienced by children as a result of it. This is not limited to content that the Government have listed as being harmful, as it is intended to capture new and emerging harms. Ofcom will be required to report back to the Government with recommendations on changes to the primary priority and priority content lists.
I turn to the points that the noble Lord, Lord Knight of Weymouth, helpfully raised earlier about things that are in the amendments but not explicitly mentioned in the Bill. As he knows, the Bill has been designed to be tech-neutral, so that it is future-proof. That is why there is no explicit reference to the metaverse or virtual or augmented reality. However, the Bill will apply to service providers that enable users to share content online or interact with each other, as well as search services. That includes a broad range of services such as websites, applications, social media sites, video games and virtual reality spaces such as the metaverse; those are all captured. Any service that allows users to interact, as the metaverse does, will need to conduct a children’s access assessment and comply with the child safety duties if it is likely to be accessed by children.
Amendment 123 from the noble Baroness, Lady Kidron, seeks to amend Clause 48 to require Ofcom to create guidance for Part 3 service providers on this new schedule. For the reasons I have just set out, we do not think it would be workable to require Ofcom to produce guidance on this proposed schedule. For example, the duty requires Ofcom to provide guidance on the content, whereas the proposed schedule includes examples of risky functionality, such as the frequency and volume of recommendations.
I stress again that we are sympathetic to the aim of all these amendments. As I have set out, though, our analysis leads us to believe that the four Cs framework is simply not compatible with the existing architecture of the Bill. Fundamental concepts such as risk, harm and content would need to be reconsidered in the light of it, and that would inevitably have a knock-on effect for a large number of clauses and timing. The Bill has benefited from considerable scrutiny—pre-legislative and in many discussions over many years. The noble Baroness, Lady Kidron, has been a key part of that and of improving the Bill. The task is simply unfeasible at this stage in the progress of the Bill through Parliament and risks delaying it, as well as significantly slowing down Ofcom’s implementation of the child safety duties. We do not think that this slowing down is a risk worth taking, because we believe the Bill already achieves what is sought by these amendments.
Even so, I say to the Committee that we have listened to the noble Baroness, Lady Kidron, and others and have worked to identify changes which would further address these concerns. My noble friend Lady Harding posed a clear question: if not this, what would the Government do instead? I am pleased to say that, as a result of the discussions we have had, the Government have decided to make a significant change to the Bill. We will now place the categories of primary priority and priority content which is harmful to children on the face of the Bill, rather than leaving them to be designated in secondary legislation, so Parliament will have its say on them.
We hope that this change will reassure your Lordships that protecting children from the most harmful content is indeed the priority for the Bill. That change will be made on Report. We will continue to work closely with the noble Baroness, Lady Kidron, my noble friends and others, but I am not able to accept the amendments in the group before us today. With that, I hope that she will be willing to withdraw.
I thank all the speakers. There were some magnificent speeches and I do not really want to pick out any particular ones, but I cannot help but say that the right reverend Prelate described the world without the four Cs. For me, that is what everybody in the Box and on the Front Bench should go and listen to.
I am grateful and pleased that the Minister has said that the Government are moving in this direction. I am very grateful for that, but there are a couple of things that I have to come back on. First, I have swiftly read Clause 205’s definition of harm and I do not think it says that you do not have to reach a barrier of harm; dissemination is quite enough. There is always the problem of what the end result of the harm is. The thing that the Government are not listening to is the relationship between the risk assessment and the harm. It is about making sure that we are clear that it is the functionality that can cause harm. I think we will come back to this at another point, but that is what I beg them to listen to. Secondly, I am not entirely sure that it is correct to say that the four Cs mean that you cannot have primary priority, priority and so on. That could be within the schedule of content, so those two things are not actually mutually exclusive. I would be very happy to have a think about that.
What was not addressed in the Minister’s answer was the point made by the noble Lord, Lord Allan of Hallam, in supporting the proposal that we should have in the schedule: “This is what you’ve got to do; this is what you’ve got to look at; this is what we’re expecting of you; and this is what Parliament has delivered”. That is immensely important, and I was so grateful to the noble Lord, Lord Stevenson, for putting his marker down on this set of amendments. I am absolutely committed to working alongside him and to finding ways around this, but we need to find a way of stating it.
Ironically, that is my answer to both the noble Baronesses, Lady Ritchie and Lady Fox: we should have our arguments here and now, in this Chamber. I do not wish to leave it to the Secretary of State, whom I have great regard for, as it happens, but who knows: I have seen a lot of Secretaries of State. I do not even want to leave it to the Minister, because I have seen a lot of Ministers too—ditto Ofcom, and definitely not the tech sector. So here is the place, and we are the people, to work out the edges of this thing.
Not for the first time, my friend, the noble Baroness, Lady Harding, read out what would have been my answer to the noble Baroness, Lady Ritchie. I have gone round and round, and it is like a Marx Brothers movie: in the end, harm is defined by subsection (4)(c), but that says that harm will be defined by the Secretary of State. It goes around like that through the Bill.
Online Safety Bill Debate
(1 year, 7 months ago)
Lords Chamber

It is a great pleasure to follow my noble friend Lord Russell and to thank him for his good wishes. I assure the Committee that there is nowhere I would rather spend my birthday, in spite of some competitive offers. I remind noble Lords of my interests in the register, particularly as the chair of 5Rights Foundation.
As my noble friend has set out, these amendments fall in three places: the risk assessments, the safety duties and the codes of practice. However, together they work on the overarching theme of safety by design. I will restrict my detailed remarks to a number of amendments in the first two categories. This is perhaps a good moment to recall the initial work of Carnegie, which provided the conceptual approach of the Bill several years ago in arguing for a duty of care. The Bill has gone many rounds since then, but I think the principle remains that a regulated service should consider its impact on users before it causes them harm. Safety by design, to which all the amendments in this group refer, is an embodiment of a duty of care. In thinking about these amendments as a group, I remind the Committee that both the proportionality provisions and the fact that this is a systems and processes Bill mean that no company can, should or will be penalised for a single piece of content, a single piece of design or, indeed, low-level infringements.
Amendments 24, 31, 77 and 84 would delete “content” from the Government’s description of what is harmful to children, meaning that the duty is to consider harm in the round rather than just harmful content. The definition of “content” is drawn broadly in Clause 207 as
“anything communicated by means of an internet service”,
but the examples in the Bill, including
“written material … music and data of any description”,
once again fail to include design features that are so often the key drivers of harm to children.
On day three of Committee, the Minister said:
“The Bill will address cumulative risk where it is the result of a combination of high-risk functionality, such as live streaming, or rewards in service … This will initially be identified through Ofcom’s sector risk assessments, and Ofcom’s risk profiles and risk assessment guidance will reflect where a combination of risk in functionalities such as these can drive up the risk of harm to children. Service providers will have to take Ofcom’s risk profiles into account in their own risk assessments for content which is illegal or harmful to children”.—[Official Report, 27/4/23; col. 1385.]
However, in looking at the child safety duties, Clause 11(5) says:
“The duties … in subsections (2) and (3) apply across all areas of a service, including the way it is designed, operated and used”,
but subsection (14) says:
“The duties set out in subsections (3) and (6)”—
which are the duties to operate proportionate systems and processes to prevent and protect children from encountering harmful content and to include them in terms of service—
“are to be taken to extend only to content that is harmful to children where the risk of harm is presented by the nature of the content (rather than the fact of its dissemination)”.
I hesitate to say whether that is contradictory. I am not actually sure, but it is confusing. I am concerned that while we are reassured that “content” means content and activity and that the risk assessment considers functionality, “harm” is then repeatedly expressed only in the form of content.
Over the weekend, I had an email exchange with the renowned psychoanalyst and author, Norman Doidge, whose work on the plasticity of the brain profoundly changed how we think about addiction and compulsion. In the exchange, he said that
“children’s exposures to super doses, of supernormal images and scenes, leaves an imprint that can hijack development”.
Then, he said that
“the direction seems to be that AI would be working out the irresistible image or scenario, and target people with these images, as they target advertising”.
His argument is that it is not just the image but the dissemination and tailoring of that image that maximises the impact. The volume and frequency of those images create habits in children that take a lifetime to change—if they change at all. Amendments 32 and 85 would remove this language to ensure that content that is harmful by virtue of its dissemination is accounted for.
I turn now to Amendments 28 and 82, which cut the reference to the
“size and capacity of the provider of the service”
in deeming what measures are proportionate. We have already discussed that small is not safe. Platforms such as Yubo, Clapper and Discord have all been found to harm children and, as both the noble Baroness, Lady Harding, and the noble Lord, Lord Clement-Jones, told us, small can become big very quickly. It is far easier to build to a set of rules than it is to retrofit them after the event. Again, I point out that Ofcom already has duties of proportionality; adding size and capacity is unnecessary and may tip the scale towards creating loopholes for smaller services.
Amendment 138 seeks to reverse the exemption in Clause 54 of financial harms. More than half of the 100 top-grossing mobile phone apps contain loot boxes, which are well established as unfair and unhealthy, priming young children to gamble and leading to immediate hardship for parents landed with extraordinary bills.
By rights, Amendments 291 and 292 could fit in the future-proof set of amendments. The way that the Bill in Clause 204 separates out functionalities in terms of search and user-to-user is in direct opposition to the direction of travel in the tech sector. TikTok does shopping, Instagram does video, Amazon does search; autocomplete is an issue across the full gamut of services, and so on and so forth. This amendment simply combines the list of functionalities that must be risk-assessed and makes them apply on any regulated service. I cannot see a single argument against this amendment: it cannot be the Government’s intention that a child can be protected, on search services such as Google, from predictive search or autocomplete, but not on TikTok.
Finally, Amendment 295 will embed the understanding that most harm is cumulative. If the Bereaved Parents for Online Safety were in the Chamber, or any child caught up in self-harm, depression sites, gambling, gaming, bullying, fear of exposure, or the inexorable feeling of losing their childhood to an endless scroll, they would say at the top of their voices that it is not any individual piece of content, or any one moment or incident, but the way in which they are nudged, pushed, enticed and goaded into a toxic, harmful or dangerous place. Adding the simple words
“the volume of the content and the frequency with which the content is accessed”
to the interpretation of what can constitute harm in Clause 205 is one of the most important things that we can do in this Chamber. This Bill comes too late for a whole generation of parents and children but, if these safety by design amendments can protect the next generation of children, I will certainly be very glad.
My Lords, it is an honour, once again, to follow the noble Baroness, Lady Kidron, and the noble Lord, Lord Russell, in this Committee. I am going to speak in detail to the amendments that seek to change the way the codes of practice are implemented. Before I do, however, I will very briefly add my voice to the general comments that the noble Baroness, Lady Kidron, and the noble Lord, Lord Russell, have just taken us through. Every parent in the country knows that both the benefit and the harm that online platforms can bring our children are not just about the content. It is about the functionality: the way these platforms work; the way they suck us in. They do give us joy, but they also drive addiction. It is hugely important that this Bill reflects the functionality that online platforms bring, and not just content in the normal sense of the word “content”.
I will now speak in a bit more detail about the following amendments: Amendments 65, 65ZA, 65AA, 89, 90, 90B, 96A, 106A, 106B, 107A, 114A—I will finish soon, I promise—112, 122ZA, 122ZB and 122ZC.
My Lords, I want, apart from anything else, to speak in defence of philosophical ruminations. The only way we can scrutinise the amendments in Committee is to do a bit of philosophical rumination. We are trying to work out what the amendments might mean in terms of changing the Bill.
I read these amendments, noted their use of “eliminate”—we have to “eliminate” all risks—and wondered what that would mean. I do not want to feel that I cannot ask these kinds of difficult questions for fear that I will offend a particular group or that it would be insensitive to a particular group of parents. It is difficult, but we are required as legislators to try to understand what each of us is trying to change, or how we are going to try to change the law.
I say to those who have put “eliminate” prominently in a number of these amendments that it is impossible to eliminate all risks to children—is it not?—if they are to have access to the online world, unless you ban them from the platforms completely. Is “eliminate” really helpful here?
Previously in Committee, I talked a lot about the potential dangers, psychologically and with respect to development, of overcoddling young people, of cotton wool kids, and so on. I noted an article over the weekend by the science journalist Tom Chivers, which included arguments from the Oxford Internet Institute and various psychologists that the evidence on whether social media is harmful, particularly for teenagers, is ambiguous.
I am very convinced by the examples brought forward by the noble Baroness, Lady Kidron—and I too wish her a happy birthday. We all know about the targeting of young people and so forth, but I am also aware of the positives. I always try to balance these things out and make sure that we do not deny young people access to the positives. In fact, I found myself cheering at the next group of amendments, which is unusual. First, they depend on whether you are four or 14—in other words, you have to be age-specific—and, secondly, they recognise that we do not want to pass anything in the Bill that actually denies children access to either their own privacy or the capacity to know more.
I also wanted to explore a little the idea of expanding the debate away from content to systems, because this is something that I think I am not quite understanding. My problem is that moving away from the discussion on whether content is removed or accessible, and focusing on systems, does not mean that content is not in scope. My worry is that the systems will have an impact on what content is available.
Let me give some examples of things that can become difficult if we think that we do not want young people to encounter violence and nudity—which makes it seem as though we know what we are talking about when we talk about “harmful”. We will all recall that, in 2018, Facebook removed content from the Anne Frank Centre posted by civil rights organisations because it included photographs of the Holocaust featuring undressed children among the victims. Facebook apologised afterwards. None the less, my worry is about these kinds of things happening. Another example, in 2016, was the removal of the Pulitzer Prize-winning photograph “The Terror of War”, featuring fleeing Vietnamese napalm victims in the 1970s, because the system thought it was something dodgy, given that the photo was of a naked child fleeing.
I need to understand how system changes will not deprive young people of important educational information such as that. That is what I am trying to distinguish. The point made by the noble Lord, Lord Moylan, about “harmful” not being defined—I have endlessly gone on about this, and will talk more about it later—is difficult because we think that we know what we mean by “harmful” content.
Finally, on the amendments requiring compliance with Ofcom codes of practice, that would give an extraordinary amount of power to the regulator and the Secretary of State. Since I have been in this place, people have rightly drawn my attention to the dangers of delegating power to the Executive or away from any kind of oversight—there has been fantastic debate and discussion about that. It seems to me that these amendments advocate delegated powers being given to the Secretary of State and Ofcom, an unelected body—the Secretary of State could amend for reasons of public policy in order to protect children—and this is to be put through the negative procedure. In any other instance, I would have expected outcry from the usual suspects, but, because it involves children, we are not supposed to object. I worry that we need to have more scrutiny of such amendments and not less, because in the name of protecting children unintended consequences can occur.
I want to answer the point that amendments cannot be seen in isolation. Noble Lords will remember that we had a long and good debate about what constituted harms to children. There was a big argument and the Minister made some warm noises in relation to putting harms to children in the Bill. There is some alignment between many people in the Chamber whereby we and Parliament would like to determine what harm is, and I very much share the noble Baroness’s concern about pointing out what that is.
On the issue of the system versus the content, I am not sure that this is the exact moment, but the idea of unintended consequences keeps getting thrown up when we talk about trying to point the finger at what creates harm. There are unintended consequences now, except that neither Ofcom, the Secretary of State nor Parliament has any say in what those unintended consequences are—only the tech sector does. As someone who has been bungee jumping, I am deeply grateful that there are very strict rules under which that is allowed to happen.
My Lords, I support the amendments in this group that, with regard to safety by design, will address functionality and harms—whatever exactly we mean by that—as well as child safety duties and codes of practice. The noble Lord, Lord Russell, and the noble Baronesses, Lady Harding and Lady Kidron, have laid things out very clearly, and I wish the noble Baroness, Lady Kidron, a happy birthday.
I also support Amendment 261 in the name of my right reverend friend the Bishop of Oxford and supported by the noble Lord, Lord Clement-Jones, and the noble Viscount, Lord Colville. This amendment would allow the Secretary of State to consider safety by design, and not just content, when reviewing the regime.
As we have heard, a number of the amendments would amend the safety duties to children to consider all harms, not just harmful content, and we have begun to have a very interesting debate on that. We know that service features create and amplify harms to children. These harms are not limited to spreading harmful content; features in and of themselves may cause harm—for example, beautifying filters, which can create unrealistic body ideals and pressure on children to look a certain way. In all of this, I want us to listen much more to the voices of children and young people—they understand this issue.
Last week, as part of my ongoing campaign on body image, including how social media can promote body image anxiety, I met a group of young people from two Gloucestershire secondary schools. They were very good at saying what the positives are, but noble Lords will also be very familiar with many of the negative issues that were on their minds, which I will not repeat here. While they were very much alive to harmful content and the messages it gives them, they were keen to talk about the need to address algorithms and filters that they say feed them strong messages and skew the content they see, which might not look harmful but, because of design, accentuates their exposure to issues and themes about which they are already anxious. Suffice to say that underpinning most of what they said to me was a sense of powerlessness and anxiety when navigating the online world that is part of their daily lives.
The current definition of content does not include design features. Building in a safety by design principle from the outset would reduce harms in a systematic way, and the amendments in this group would address that need.
I am grateful to the noble Lord. In many ways, I am reminded of the article I read in the New York Times this weekend and the interview with Geoffrey Hinton, the now former chief scientist at Google. He said that as companies improve their AI systems, they become increasingly dangerous. He said of AI technology:
“Look at how it was five years ago and how it is now. Take the difference and propagate it forwards. That’s scary”.
Yes, the huge success of the iPhone, of mobile phones and all of us, as parents, handing our more redundant iPhones on to our children, has meant that children have huge access. We have heard the stats in Committee around the numbers who are still in primary school and on social media, despite the terms and conditions of those platforms. That is precisely why we are here, trying to get things designed to be safe as far as is possible from the off, but recognising that it is dynamic and that we therefore need a regulator to keep an eye on the dynamic nature of these algorithms as they evolve, ensuring that they are safe by design as they are being engineered.
My noble friend Lord Stevenson has tabled Amendment 27, which looks at targeted advertising, especially that which requires data collection and profiling of children. In that, he has been grateful to Global Action Plan for its advice. While advertising is broadly out of scope of the Bill, apart from in respect of fraud, it is significant for the Minister to reflect on the user experience for children. Whether it is paid or organic content, it is pertinent in terms of their safety as children and something we should all be mindful of. I say to the noble Lord, Lord Vaizey, that as I understand it, the age-appropriate design code does a fair amount in respect of the data privacy of children, but this is much more about preventing children encountering the advertising in the first place, aside from the data protections that apply in the age-appropriate design code. But the authority is about to correct me.
Just to add to what the noble Lord has said, it is worth noting that we had a debate, on Amendment 92, about aligning the age-appropriate design code’s “likely to be accessed” test with the Bill—the very important issue that the noble Lord, Lord Vaizey, raised about alignment of these two regimes. I think we can say that these are kissing cousins, in that they take a by-design approach. The noble Lord is completely right that the scope of the Bill is much broader than data protection only, but they take the same approach.
I am grateful, as ever, to the noble Baroness, and I hope that has assisted the noble Lord, Lord Vaizey.
Finally—just about—I will speak to Amendment 32A, tabled in my name, about VPNs. I was grateful to the noble Baroness for her comments. In many ways, I wanted to give the Minister the opportunity to put something on the record. I understand—and he can confirm whether my understanding is correct—that the duty on platforms to be safe applies regardless of whether a VPN has been used to access the systems and the content. The platforms—the publishers of content that are user-to-user businesses—will have to detect whether a VPN is being used, one would suppose, in order to ensure that children are being protected and that a user is genuinely a child. Is that a correct interpretation of how the Bill works? If so, is it technically realistic for those platforms to detect whether someone is landing on their site via a VPN or otherwise? To my mind, the anecdote that the noble Baroness, Lady Harding, related, about what the App Store algorithm on Apple had done in pushing VPNs to people looking for porn, reinforces the need for app stores to come into scope, so that we can get some of that age filtering at that distribution point, rather than just relying on the platforms.
Substantially, this group is about platforms anticipating harms, not reviewing them and then fixing them despite their business model. If we can get the platforms themselves designing for children’s safety and then working out how to make the business models work, rather than the other way around, we will have a much better place for children.
I will come on to talk a bit about dissemination as well. If the noble Lord will allow me, he can intervene later on if I have not done that to his satisfaction.
I was about to talk about the child safety duties in Clause 11(5), which also specifies that they apply to the way that a service is designed, how it operates and how it is used, as well as to the content facilitated by it. The definition of content makes it clear that providers are responsible for mitigating harm in relation to all communications and activity on their service. Removing the reference to content would make service providers responsible for all risk of harm to children arising from the general operation of their service. That could, for instance, bring into scope external advertising campaigns, carried out by the service to promote its website, which could cause harm. This and other elements of a service’s operations are already regulated by other legislation.
I apologise for interrupting. Is that the case, and could that not be dealt with by defining harm in the way that it is intended, rather than as harm from any source whatever? It feels like a big leap that, if you take out “content”, instead of it meaning the scope of the service in its functionality and content and all the things that we have talked about for the last hour and a half, the suggestion is that it is unworkable because harm suddenly means everything. I am not sure that that is the case. Even if it is, one could find a definition of harm that would make it not the case.
Taking it out in the way that the amendment suggests throws up that risk. I am sure that it is not the intention of the noble Lord or the noble Baroness in putting it, but that is a risk of the drafting, which requires some further thought.
Clause 11(2), which is the focus of Amendments 32, 85 and 295, already means that platforms have to take robust action against content which is harmful because of the manner of its dissemination. However, it would not be feasible for providers to fulfil their duties in relation to content which is harmful only by the manner of its dissemination. This covers content which may not meet the definition of content which is harmful to children in isolation but may be harmful when targeted at children in a particular way. One example could be content discussing a mental health condition such as depression, where recommendations are made repeatedly or in an amplified manner through the use of algorithms. The nature of that content per se may not be inherently harmful to every child who encounters it, but, when aggregated, it may become harmful to a child who is sent it many times over. That, of course, must be addressed, and is covered by the Bill.
My Lords, I support the amendments in the name of the noble Lord, Lord Russell, to require regulated services to have regard to the UN Convention on the Rights of the Child. As we continue to attempt to strengthen the Bill by ensuring that the UK will be the safest place for children to be online, there is a danger that platforms may take the easy way out in complying with the new legislation and just block children entirely from their sites. Services must not shut children out of digital spaces altogether to avoid compliance with the child safety duties, rather than designing services with their safety in mind. Children have rights and, as the UN convention makes clear, they must be treated according to their evolving capacities and in their best interests in consideration of their well-being.
Being online is now an essential right, not an option, to access education, entertainment and friendship, but we must try to ensure that it is a safe space. As the 5Rights Foundation points out, the Bill risks infringing children’s rights online, including their rights to information and participation in the digital world, by mandating that services prevent children from encountering harmful content, rather than ensuring services are made age appropriate for children and safe by design, as we discussed earlier. As risk assessments for adults have been stripped from the Bill, this has had the unintended consequence of making a child user even more costly to serve than an adult user, as services will have substantial safety duties to comply with to protect children. 5Rights Foundation warns that this will lead services to determine that it is not worth designing services with children’s safety in mind and that it could be more cost-effective to lock them out entirely.
Ofcom must have a duty to have regard for the UNCRC in its risk assessments. Amendment 196 would ensure that children’s rights are reflected in Ofcom’s assessment of risks, so that Ofcom must have regard for children’s rights in balancing their rights to be safe against their rights to access age-appropriate digital spaces. This would ensure compliance with general comment No. 25, passed in 2021, as the noble Lord, Lord Russell, mentioned, to protect children’s rights to freedom of expression and privacy. I urge the Ministers to accept these amendments to ensure that the UK will be not only the safest place for children to be online but the best place too, by respecting and protecting their rights.
My Lords, I support all the amendments in this group, and will make two very brief points. Before I do, I believe that those who are arguing for safety by design and to put harms in the Bill are not trying to restrict the freedom of children to access the internet but to give the tech sector slightly less freedom to access children and exploit them.
My first point is a point of principle, and here I must declare an interest. It was my very great privilege to chair the international group that drafted general comment No. 25 on children’s rights in relation to the digital environment. We did so on behalf of the Committee on the Rights of the Child and, as my noble friend Lord Russell said, it was adopted formally in 2021. To that end, a great deal of work has gone into balancing the sorts of issues that have been raised in this debate. I think it would interest noble Lords to know that the process took three years, with 150 submissions, many by nation states. Over 700 children in 28 countries were consulted in workshops of at least three hours. They had a good shout and, unlike many of the other general comments, this one is littered with their actual comments. I recommend it to the Committee as a very concise and forceful gesture of what it might be to exercise children’s rights in a balancing way across all the issues that we are discussing. I cannot remember who, but somebody said that the online world is not optional for children: it is where they grow up; it is where they spend their time; it is their education; it is their friendships; it is their entertainment; it is their information. Therefore, if it is not optional, then as a signatory to the UNCRC we have a duty to respect their rights in that environment.
My second point is rather more practical. During the passage of the age-appropriate design code, of which we have heard much, the argument was made that children were covered by the amendment itself, which said they must be kept in mind and so on. I anticipate that argument being made here—that we are aligning with children’s rights—apart from the fact that rights are indivisible and must be honoured in their entirety. In that case, the Government happily accepted that it should be explicit, and it was put in the Data Protection Act. It was one of the most important things that happened in relation to the age-appropriate design code. We might hope that, when this Bill is an Act, it will all be over—our job will be done and we can move on. However, after the Data Protection Act, the most enormous influx of lobbying happened, saying, “Please take the age down from 18 to 13”. The Government, and in that case the ICO, shrugged their shoulders and said, “We can’t; it’s on the face of the Bill”, because Article 1 of the UNCRC says that a child is anyone under the age of 18.
The evolving capacities of children are central to the UNCRC, so the concerns of the noble Baroness, Lady Fox, which I very much share, that a four year-old and a 14 year-old are not the same, are embodied in that document and in the general comment, and therefore it is useful.
These amendments are asking for that same commitment here—to children and to their rights, and to their rights to protection, which is at the heart of so much of what we are debating, and their well-being. We need their participation; we need a digital world with children in it. Although I agreed very much with the noble Baroness, Lady Bennett, and her fierce defence of children’s rights, there are 1 billion children online. If two-thirds of them have not seen anything upsetting in the last year, that rather means that one-third of 1 billion children have—and that is too many.
My Lords, I did not intend to speak in this debate but I have been inspired by it.
I was here for the encryption debate last week, which I did not speak in. One of the contributions was around unintended consequences of the legislation, and I am concerned about unintended consequences here.
I absolutely agree with the comments of the noble Baroness, Lady Bennett, around the need for children to engage on the internet. Due to a confidence and supply agreement with the then Government back in 2017, I ensured that children and adults alike in Northern Ireland have the best access to the internet in the United Kingdom, and I am very proud of that. Digital literacy is covered in a later amendment, Amendment 91, which I will be strongly supporting. It is something that everybody needs to be involved in, not least our young people—and here I declare an interest as the mother of a 16 year-old.
I have two concerns. The first was raised by my friend the noble Lord, Lord Weir, around private companies being legally accountable for upholding an international human rights treaty. I am much more comfortable with Amendments 187 and 196, which refer to Ofcom. I think that is where the duty should be. I have an issue not with the convention but with private companies being held responsible for it; Ofcom should be the body responsible.
Secondly, I listened very carefully to what the noble Baroness, Lady Kidron, said about general comment No. 25. If what I say is incorrect, I hope she will say so. Is general comment No. 25 a binding document on the Government? I understood that it was not.
We need to see the UNCRC included in the Bill. The convention is never opened up again, and how it makes itself relevant to the modern world is through the general comments; that is how the Committee on the Rights of the Child would interpret it.
So it is an interpretive document. The unintended consequences piece was around general comment No. 25 specifically having reference to children being able to seek out content. That is certainly something that I would be concerned about. I am sure that we will discuss it further in the next group of amendments, which are on pornography. If young people were able to seek out harmful content, that would concern me greatly.
I support Amendments 187 and 196, but I have some concerns about the unintended consequences of Amendment 25.
Online Safety Bill Debate
Baroness Kidron (Crossbench - Life peer)
(1 year, 7 months ago)
Lords Chamber
My Lords, I support the noble Baroness, Lady Ritchie, in her search to make it clear that we do not need to take a proportionate approach to pornography. I would be delighted if the Minister could indicate in his reply that the Government will accept the age-assurance amendments in group 22 that are coming shortly, which make it clear that porn on any regulated service, under Part 3 or Part 5, should be behind an age gate.
In making the case for that, I want to say very briefly that, after the second day of Committee, I received a call from a working barrister who represented 90 young men accused of serious sexual assault. Each was a student and many were in their first year. A large proportion of the incidents had taken place during freshers’ week. She rang to make sure that we understood that, while what each and every one of them had done was indefensible, these men were also victims. As children brought up on porn, they believed that their sexual violence was normal—indeed, they told her that they thought that was what young women enjoyed and wanted. On this issue there is no proportionality.
My Lords, I also support Amendments 29, 83 and 103 from the noble Baroness, Lady Ritchie. As currently drafted, the Bill makes frequent reference to Ofcom taking into account
“the size and capacity of … a service”
when it determines the extent of the measures a site should apply to protect children. We have discussed size on previous days; I am conscious that the point has been made in part, but I hope the Committee will forgive me if I repeat it clearly. When it comes to pornography and other harms to children, size does matter. As I have said many times recently, porn is porn, no matter the size of the website or publisher involved. It does not matter whether it is run by a huge company such as MindGeek or out of a shed in London or Romania by a small gang of people. The harm of the content to children is still exactly the same.
Our particular concern is that, if the regulations from Ofcom are applied only to the bigger companies, that will leave a lot of space for smaller organisations which do not bend to the regulations to gain a competitive advantage over the larger players and occupy that space. That is the concern of the bigger players. They are very open to age verification; what concerns them is that they will face an unlevel playing field. It is a classic concern of bigger players facing regulation in the market: that bad actors will gain competitive advantage. We should be very cognisant of that when thinking about how the regulations on age verification for porn will be applied. Therefore, the measures should be applied in proportion to the risk of harm to children posed by a porn site, not in proportion to the site’s financial capacity or the impact on its revenues of basic protections for children.
In this, we are applying basic, real-world principles to the internet. We are denying its commonly held exceptionalism, which I think we are all a bit tired of. We are applying the same principles that you might apply in the real world, for instance, to a kindergarten, play centre, village church hall, local pub, corner shop or any other kind of business that brings itself in front of children. In other words, if a company cannot afford to implement or does not seem capable of implementing measures that protect children, it should not be permitted by law to have a face in front of the general public. That is the principle that we apply in the real world, and that is the principle we should be applying on the internet.
Allowing a dimension of proportionality to apply to pornography cases creates an enormous loophole in the legislation, which at best will delay enforcement for particular sites when it is litigated and at worst will disable regulatory action completely. That is why I support the amendments in the name of the noble Baroness, Lady Ritchie.
I would not want to disagree with the noble Baroness for a moment.
Does the noble Lord think it is also important to have some idea of measurement? Age assurance in certain circumstances is far more accurate than age verification.
Yes; the noble Baroness is right. She has pointed out in other discussions I have been party to that, for example, gaming technology that looks at the movement of the player can quite accurately work out from their musculoskeletal behaviour, I assume, the age of the gamer. So there are alternative methods. Our challenge is to ensure that if they are to be used, we will get the equivalent of age verification or better. I now hand over to the Minister.
My Lords, I support something between the amendments of the noble Lords, Lord Stevenson and Lord Bethell, and the Government. I welcome all three and put on record my thanks to the Government for making a move on this issue.
There are three members of the pre-legislative committee still in the Chamber at this late hour, and I am sure I am not the only one of those three who remembers the excruciating detail in which Suzanne Webb MP, during evidence given with Meta’s head of child safety, established that there was nowhere to report harm, but nowhere—not up a bit, not sideways, not to the C-suite. It was stunning. I have used that clip from the committee’s proceedings several times in schools to show what we do in the House of Lords, because it was fascinating. That fact was also made abundantly clear by Frances Haugen. When we asked her why she took the risk of copying things and walking them out, she said, “There was nowhere to go and no one to talk to”.
Turning to the amendments, like the noble Baroness, Lady Harding, I am concerned about whether we have properly dealt with C-suite reporting and accountability, but I am a hugely enthusiastic supporter of that accountability being in the system. I will be interested to hear the Minister speak to the Government’s amendment, but also to some of the other issues raised by the noble Lord, Lord Knight.
I will comment very briefly on the supply chain and Amendment 219. In doing so, I go back again to Amendment 2, debated last week, which sought to add services not covered by the current scope but which clearly promoted and enabled access to harm and which were also likely to be accessed by children. I have a long quote from the Minister but, because of the hour, I will not read it out. In effect, and to paraphrase, he said, “Don’t worry, they will be caught by the other guys—the search and user-to-user platforms”. If the structure of the Bill means that it is mandatory for the user-to-user and search platforms to catch the people in the supply chain, surely it would be a great idea to put that in the Bill absolutely explicitly.
Finally, while I share some of the concerns raised by the noble Baroness, Lady Fox, I repeat my constant reprise of “risk not size”. The size of the fine is related to the turnover of the company, so it is actually proportionate.
My Lords, this has been a really interesting debate. I started out thinking that we were developing quite a lot of clarity. The Government have moved quite a long way since we first started debating senior manager liability, but there is still a bit of fog that needs dispelling—the noble Baronesses, Lady Kidron and Lady Harding, have demonstrated that we are not there yet.
I started off by saying yes to this group, before I got to grips with the government amendments. I broadly thought that Amendment 33, tabled by the noble Lord, Lord Stevenson, and Amendment 182, tabled by the noble Lord, Lord Bethell, were heading in the right direction. However, I was stopped short by Trustpilot’s briefing, which talked about a stepped approach regarding breaches and so on—that is a very strong point. It says that it is important to recognise that not all breaches should carry the same weight. In fact, it goes even further than that: certain things should not even be an offence unless you have been persistent or negligent. We have to be quite mindful of how we formulate criminal offences.
I very much liked what the noble Lord, Lord Bethell, had to say about the tech view of its own liability. We have all seen articles about tech exceptionalism, and, for some reason, that seems to have taken quite a hold—so we have to dispel that as well. That is why I very much liked what the noble Lord, Lord Curry, said. It seemed to me that that was very much part of a stepped approach, while also being transparent to the object of the exercise and the company involved. That fits very well with the architecture of the Bill.
The noble Baroness, Lady Harding, put her finger on it: the Bill is not absolutely clear. In the Government’s response to the Joint Committee’s report, we were promised that, within three to six months, we would get that senior manager liability. On reading the Bill, I am certainly still a bit foggy about it, and it is quite reassuring that the noble Baroness, Lady Harding, is foggy about it too. Is that senior manager liability definitely there? Will it be there?
The Joint Committee made two other recommendations which I thought made a lot of sense: the obligation to report on risk assessment to the main board of a company, and the appointment of a safety controller, which the noble Lord, Lord Knight, mentioned. Such a controller would make things very clear—as with the GDPR, you would have a senior manager on whom you can fix the duty.
Like the noble Baroness, Lady Harding, I would very much like to hear from the Minister on the question of personal liability, as well as about Ofcom. It is important that any criminal prosecution is mediated by Ofcom; that is cardinal. You cannot just create criminal offences where you can have a prosecution without the intervention of Ofcom. That is extraordinarily important.
I have just a couple of final points. The noble Baroness, Lady Fox, comes back quite often to this point about regulation being the enemy of innovation. It very much depends on what kind of innovation we are talking about. Technology is not necessarily neutral; it depends on how the humans who deploy it operate it. In circumstances such as this, where we are talking about children and about smaller platforms that can do harm, I have no qualms about having regulation or indeed criminal liability. That is a really important factor: we are talking about a really important area.
I very strongly support Amendment 219. It deals with a really important aspect which is completely missing from the Bill. I have a splendid briefing here, which I am not going to read out, but it is all about Mastodon being one example of a new style of federated platform, in which the app or hub for a network may be category 1 owing to the size of its user base, but individual subdomains or networks sitting below it could fall under category 2 status. I am very happy to give a copy of the briefing to the Minister; it is a really well-written brief, and it demonstrates entirely some of the issues we are talking about here.
I reassure the noble Lord, Lord Knight, that I think the amendment is very well drafted. It is really quite cunning in the way that it is done.
Online Safety Bill Debate
Baroness Kidron (Crossbench - Life peer)
(1 year, 7 months ago)
Lords Chamber
I support Amendment 44. I am pleased that, as part of the new triple shield, the Government have introduced Clause 12 on “User empowerment duties”, which allow users to protect themselves, not just from abusive posts from other users but from whole areas of content. In the Communications and Digital Committee’s inquiry, we had plenty of evidence from organisations representing minorities and people with special characteristics who are unable adequately to protect themselves from the hate they receive online. I am glad that subsections (10) to (12) recognise specific content and users with special characteristics who are targets of abuse and need to be able to protect themselves, but subsection (3) requests that these features should be
“designed to effectively … reduce the likelihood of the user encountering content”
they want to avoid. I am concerned that “effectively” will be interpreted subjectively by platforms in scope and that each will interpret it differently.
At the moment, it will not be possible for Ofcom to assess how thoroughly the platforms have been providing these empowerment tools of protection for users. If the features are to work, there must be an overview of how effective they are being and how well they are working. When the former Secretary of State, Michelle Donelan, was asked about this, she said that there was nothing in this clause to pin an assessment on. It seems to me that the lists in Clause 12 create plenty of criteria on which to hang an assessment.
The new duties in Clause 12 provide for control tools for users against very specific content that is abusive or incites hatred on the basis of race, ethnicity, religion, disability, sex, gender reassignment or sexual orientation. However, this list is not exhaustive. There will inevitably be areas of content for which users have not been given blocking tools, including pornography, violent material and other material that is subject to control in the offline world.
Not only will the present list for such tools need to be assessed for its thoroughness in allowing users to protect themselves from specific harms, but surely the types of harm from which they need to protect themselves will change over time. Ofcom will need regularly to assess where these harms are and make sure that service providers regularly update their content-blocking tools. Without such an assessment, it will be hard for Ofcom and civil society to understand what the upcoming concerns are with the tools.
The amendment would provide a transparency obligation, which would demand that service providers inform users of the risks present on the platform. Surely this is crucial when users are deciding what to protect themselves from.
The assessment should also look for unintended restrictions on freedom of expression created by the new tools. If the tools are overprotective, they could surely create a bubble and limit users’ access to information that they might find useful. For example, a user might want to block material about eating disorders, but the algorithm might interpret that to mean limiting their access to content on healthy lifestyles or nutrition. We are also told that the algorithms do not understand irony and humour. When the filters are used to stop content that is abusive or incites hatred on the basis of users’ particular characteristics, they might also remove artistic, humorous or satirical content.
Repeatedly, we are told that the internet creates echo chambers, where users read only like-minded opinions. These bubbles can create an atmosphere where freedom of expression is severely limited and democracy suffers. A freedom of expression element to the assessment would also, in these circumstances, be critical. We are told that the tech platforms often do not know what their algorithms do and, not surprisingly, that those algorithms often evolve beyond their original intentions. The tools demanded by Clause 12 therefore need to be assessed carefully, both to ensure that they keep up to date with the trends of abuse on the internet and to catch any unintended consequences they might create in curbing freedom of expression.
Throughout the Bill, there is a balancing act between freedom of expression and protection from abuse. The user empowerment tools are potentially very powerful, and neither the service providers, the regulators nor the Government know what their effects will be. It is incumbent on the Government to introduce an assessment to check regularly how the user empowerment duties are working; otherwise, how can they be updated, and how can Ofcom discover what content is being unintentionally controlled? I urge the Minister, in the name of common sense, to ensure that these powerful tools unleashed by the Bill will not be misused or become outdated in a fast-changing digital world.
My Lords, I thank the noble Lord, Lord Moylan, for his words—I thought I was experiencing time travel there—and am sympathetic to many of the issues that he has raised, although I think that some of the other amendments in the group tackle those issues in a slightly different way.
I support Amendments 44 and 158 in the name of the right reverend Prelate the Bishop of Oxford. Requiring a post-rollout assessment to ensure that the triple shield acts as we are told it will seems to be a classic part of any regulatory regime that is fit for purpose: it needs to assess whether the system is indeed working. The triple shield is an entirely new concept, and none of the burgeoning regulatory systems around the world is taking this approach, so I hope that both the Government and Ofcom welcome this very targeted and important addition to the Bill.
I will also say a few words about Amendments 154 and 218. It seems to me that, in moving away from legal but harmful—which, as a member of the pre-legislative committee, I supported, under certain conditions that have not been met, but supported none the less—not enough time and thought have been given to the implications of that move. I do not understand, and would be grateful if the Minister could help me understand, how Ofcom is to determine by any means, not only by means of a risk assessment, whether a company has met its own terms and conditions.
I want to make a point that the noble Baroness, Lady Healy, made the other day—but I want to make it again. Taking legal but harmful out and having no assessment of whether a company has met its general safety duties leaves the child safety duties as an island. They used to be something that was added on to a general system of safety; now they are the first and only port of call. Again, because of the way that legal but harmful fell out of the Bill, I am not sure whether we have totally understood how the child risk assessments sit without a generally cleaned up or risk-assessed digital environment.
Finally, I will speak in support of Amendment 160, which would have Ofcom say what “adequate and appropriate” terms are. To a large degree, that is my approach to the problem that the noble Lord, Lord Moylan, spoke about: let Parliament and the regulator determine what we want to see—as was said on the data protection system, that is how it is—and let us have minimum standards that we can rightly expect, based on UK law, as the noble Lord suggested.
I am not against the triple shield per se, but it radically replaced an entire regime of assessment, enforcement and review. I think that some of the provisions in this group really beg the Government’s attention, in order to make sure that there are no gaping holes in the regime.
My Lords, I will speak to Amendments 44 and 158 in the name of the right reverend Prelate the Bishop of Oxford. I also note my support for the amendments in the name of the noble Lord, Lord Stevenson of Balmacara, to ensure the minimum standard for a platform’s terms of service. My noble friend Lord Moylan has just given an excellent speech on the reasons why these amendments should be considered.
I am aware that the next group of amendments relates to the so-called user empowerment tools, so it seems slightly bizarre to be speaking to Amendment 44, which seeks to ensure that these user empowerment tools actually work as the Government hope they will, and Amendment 158, which seeks to risk assess whether providers’ terms of service duties do what they say and report this to Ofcom. Now that the Government have watered down the clauses that deal with protection for adults, like other noble Lords, I am not necessarily against the Government’s replacement—the triple shield—but I believe that it needs a little tightening up to ensure that it works properly. These amendments seem a reasonable way of doing just that. They would ensure greater protection for adults without impinging on others’ freedom of expression.
The triple shield relies heavily on companies’ enforcement of their terms of service and on other vaguely worded duties—as the noble Viscount mentioned, the duty that user empowerment tools need to be “easily accessible” and “effective”, whatever that means. Unlike with other duties in the Bill, such as those on illegal content and the children’s duties, there is no mechanism to assess whether these new measures are working; whether the way companies are carrying out these duties is in accordance with the criteria set out; and whether they are indeed infringing freedom of expression. Risk assessments are vital to doing just that, because they are vital to understanding the environment in which services operate. They can reduce bureaucracy by allowing companies to rule out risks which are not relevant to them, and they can increase user safety by revealing new risks, thereby enabling the future-proofing of the regime. Can the Minister give us an answer today as to why risk assessment duties on these two strands of the triple shield—terms of service and user empowerment tools—were removed? If freedom of speech played a part in this, perhaps he could elaborate on why he thinks undertaking a risk assessment is in any way a threat.
Without these amendments, the Bill cannot be said to be a complete risk management regime. Companies will, in effect, be marking their own homework when designing their terms of service and putting their finger in the air when it comes to user empowerment tools. There will be no requirement for them to explain either to Ofcom or indeed to service users the true nature of the harms that occur on their service, nor the rationale behind any decisions they might make in these two fundamental parts of their service.
Since the Government are relying so heavily on their triple shield to ensure protection for adults, to me, not reviewing two of the three strands that make up the triple shield seems like fashioning a three-legged stool with completely uneven legs: a stool that will not stand up to the slightest pressure when used. Therefore, I urge the Minister to look again and consider reinstating these protections in the Bill.
My Lords, I contribute to this debate on the basis of my interests as laid out in the register: as chief executive of Cerebral Palsy Scotland; my work with the Scottish Government on people with neurological conditions; and as a trustee of the Neurological Alliance of Scotland. It is an honour to follow the right reverend Prelate, whose point about the inequality people experience in the online world is well made. I want to be clear that when I talk about ensuring online protection for people with disabilities, I do not assume that all adults with disabilities are unable to protect themselves. As the right reverend Prelate and the noble Lord, Lord Griffiths of Burry Port, pointed out, survey after survey demonstrates how offline vulnerabilities translate into the online world, and Ofcom’s own evidence suggests that people with physical disabilities, learning disabilities, autism, mental health issues and others can be classed as being especially vulnerable online.
The Government recognise that vulnerable groups are at greater risk online, because in its previous incarnations, this Bill included greater protection for such groups. We spoke in a previous debate about the removal of the “legal but harmful” provisions and the imposition of the triple shield. The question remains from that debate: does the triple shield provide sufficient protection for these vulnerable groups?
As I have said previously this afternoon, user empowerment tools are the third leg of the triple shield, but they put all the onus on users and no responsibility on the platforms to prevent individuals’ exposure to harm. Amendments 36, 37 and 38A, in the name of the noble Lord, Lord Clement-Jones, seek simply to make the default setting for the proposed user empowerment tools “on”. I do not pretend to understand how, technically, this will happen, but it clearly can be done, because the Bill already requires platforms to make this the default position for protecting children. The default position in those amendments protects all vulnerable people, and that is why I support them—unlike, I fear, Amendment 34 from my noble friend Lady Morgan, which lists specific categories of vulnerable adults. I would prefer that all vulnerable people be protected from being exposed to harm in the first place.
Nobody’s freedom of expression is affected in any way by this default setting, but the overall impact on vulnerable individuals in the online environment would, I assure your Lordships, be significant. Nobody’s ability to explore the internet or to go into those strange rooms at the back of bookshops that the noble Baroness, Lady Fox, was talking about would be curtailed. The Government have already stated that individuals will have the capacity to seek out these tools and turn them on and off, and that they must be easily accessible. So individuals with capacity will be able to find the settings and set them to explore whatever legal content they choose.
However, is it not our duty to remember those who do not have capacity? What about adults with learning difficulties and people at a point of crisis—the noble Baroness, Lady Parminter, movingly spoke about people with eating disorders—who might not be able to turn to those tools due to their affected mental state, or who may not realise that what they are seeing is intended to manipulate? Protecting those users from encountering such content in the first place surely tips the balance in favour of turning the tools on by default.
I am very sad that the noble Baroness, Lady Campbell of Surbiton, cannot be here, because her contribution to this debate would be powerful. But, from her enormous experience of work with disabled people, this is her top priority for the Bill.
In preparing to speak to these amendments, I looked back to the inquiry in the other place into online abuse and the experience of disabled people that was prompted by Katie Price’s petition after the shocking abuse directed at her disabled son Harvey. In April 2019 the Government responded to that inquiry by saying that they were
“aware of the disproportionate abuse experienced by disabled people online and the damage such abuse can have on people’s lives, career and health”—
and the Government pledged to act.
The internet is a really important place for disabled people, and I urge the Government to keep it a safe place for all of us and to accept these amendments, which would ensure that the default settings are set to on.
My Lords, I rise to support the amendments in the name of the noble Baroness, Lady Morgan. I do so somewhat reluctantly, not because I disagree with anything that she said but because I would not necessarily start from here. I want to briefly say three very quick things about that and then move on to Amendments 42 and 45, which are also in this group.
We already have default settings, and we are pretending that this is a zero-sum game. The default settings at the moment are profiling us, filtering us and rewarding us; and, as the right reverend Prelate said in his immensely powerful speech, we are not starting at zero. So I do share the concerns of the noble Baroness, Lady Fox, about who gets to choose—some of us on this side of the debate are saying, “Can we define who gets to choose? Can Parliament choose? Can Ofcom choose? Can we not leave this in the hands of tech companies?” So on that I fully agree. But we do have default settings already, and this is a question of looking at some of the features as well as the content. It is a weakness of the Government’s argument that it keeps coming back to the content rather than the features, which are the main driver of what we see.
The second thing I want to say—this is where I am anxious about the triple shield—is: does not knowing you are being abused mean that you are not abused? I say that as someone who has received considerable personal abuse. I have my filter on and I am not on social media, but my children, my colleagues and some of the people I work with around the world do see what is said about me—it is a reputational thing and, for some of them, a hurtful thing, and that is why I am reluctant in my support. However, I do agree with all the speakers who have said that our duty is to start with those people who are most vulnerable.
I want to mention the words of one of the 5Rights advisers—a 17 year-old girl—who, when invited to identify changes and redesign the internet, said, “Couldn’t we do all the kind things first and gradually get to the horrible ones?” I think that this could be a model for us in this Chamber. So, I do support the noble Baroness.
I want to move briefly to Amendment 42, which would see an arbitrary list of protected characteristics replaced by the Equality Act 2010. This has a lot to do with a previous discussion we had about human rights, and I want to say urgently to the Minister that the offer of the Online Safety Bill is not to downgrade human rights, children’s rights and UK law, but rather to bring forward a smart and comprehensive regime to hold companies accountable for human rights, children’s rights and UK law. We do not want to have a little list of some of our children’s rights or of some of our legislation; we would like our legislation and our rights embedded in the Bill.
I have to speak for Amendment 45. I express my gratitude to the noble Lord, Lord Stevenson, for tabling it. It would require Ofcom, six months after the event, to ask whether children need these user empowerment tools. It is hugely important. I remind the Committee that children have not only rights but an evolving capacity to be out there in the world. As I said earlier, the children’s safety duties have a cliff-edge feel to them. As children go out into the world on the cusp of adulthood, maybe they would like to have some of these user empowerment tools.
My Lords, the noble Baroness, Lady Kidron, said words to the effect that perhaps we should begin by having particular regard for certain vulnerabilities, but we are dealing with primary legislation and this really concerns me. Lists such as in Clause 12 are really dangerous. It is not a great way to write law. We could be with this law for a long time.
I took the Communications Act 2003 through for Her Majesty’s Opposition, and we were doing our absolute best to future-proof the legislation. There was no mention of the internet in that piece of legislation. With great respect to the noble Lord, Lord McNally, with whom I sparred in those days, it was not that Act that introduced Ofcom but a separate Act. The internet was not even mentioned until the late Earl of Northesk introduced an amendment using the word “internet” in relation to the investigatory powers Act.
The reality is that we already had Facebook, and tremendous damage being done through it to people such as my daughter. Noble Lords will remember that in the early days it was Oxford, Cambridge, Yale and Harvard; that is how it all began. It was an amazing thing, and we could not foresee what would happen but there was a real attempt to future-proof. If you start having lists such as in Clause 12, you cannot just add on or change. Cultural mores change. This list, which looks great in 2023, might look really odd in about 2027. Different groups will have emerged and say, “Well, what about me, what about me?”.
I entirely agree with the noble Baroness, Lady Fox. Who will be the decider of what is right, what is rude or what is abusive? I have real concerns with this. The Government have had several years to get this right. I say that with great respect to my noble friend the Minister, but we will have to think about these issues a little further. The design of the technology around all this is what we should be imposing on the tech companies. I was on the Communications and Digital Committee in 2020, when that was a key plank of our report following the inquiry that we carried out—prior to the Joint Committee—into this issue of “legal but harmful”, et cetera. I am glad that was dropped because—I know that I should not say this—when I asked a civil servant what was meant by “harmful”, he said, “Well, it might upset people”.
It is a very subjective thing. This is difficult for the Government. We must do all we can to support the Government in trying to find the right solutions, but I am sorry to say that I am a lawyer—a barrister—and I worry. We are trying to make things right but, remember, once it is there in an Act, it is there. People will use that as a tool. In 2002, at New Scotland Yard, I was introduced to an incredible website about 65 ways to become a good paedophile. Where does that fit in Clause 12? I have not quite worked that out. Is it sex? What is it? We have to be really careful. I would prefer having no list and making it more general, relying on the system to allow us to opt in.
I support my noble friend Lady Morgan’s amendment on this, which would make it easier for people to say, “Well, that’s fine”, but would not exclude people. What happens if you do not fit within Clause 12? Do you then just have to suck it up? That is not a very House of Lords expression, but I am sure that noble Lords will relate to it.
We have to go with care. I will say a little more on the next group of amendments, on anonymity. It is really hard, but what the Government are proposing is not quite there yet.
That seemed to be provoked by me saying that we must look after the vulnerable, but I am suggesting that we use UK law and the rights that are already established. Is that not better than having a small list of individual items?
I agree. The small list of individual items is the danger.
Does the Minister therefore think that the Government condone the current system, where we are inundated algorithmically with material that we do not want? Are the Government condoning that behaviour, in the way that he is saying they would condone a safety measure?
We will come to talk about algorithms and their risks later on. There is an important balance to strike here that we have debated, rightly, in this group. I remind noble Lords that there are a range of measures that providers can put in place—
But as I think the noble Baroness understands from that reference, this is a definition already in statute, one with which Parliament and the courts are already engaged.
The Bill’s overarching freedom of expression duties also apply to Clause 12. Subsections (4) to (7) of Clause 18 stipulate that category 1 service providers are required to assess the impact on free expression from their safety policies, including the user empowerment features. This is in addition to the duties in Clause 18(2), which requires all user-to-user services to have particular regard to the importance of protecting freedom of expression when complying with their duties. The noble Baroness’s Amendment 283ZA would require category 1 providers to make judgments on user empowerment content to a similar standard required for illegal content. That would be disproportionate. Clause 170 already specifies how providers must make judgments about whether content is of a particular kind, and therefore in scope of the user empowerment duties. This includes making their judgment based on “all relevant information”. As such, the Bill already ensures that the user empowerment content features will be applied in a proportionate way that will not undermine free speech or hinder legitimate debate online.
Amendment 45, tabled by the noble Lord, Lord Stevenson of Balmacara, would require the Secretary of State to lay a Statement before Parliament outlining whether any of the user empowerment duties should be applied to children. I recognise the significant interest that noble Lords have in applying the Clause 12 duties to children. The Bill already places comprehensive requirements on Part 3 services which children are likely to access. This includes undertaking regular risk assessments of such services, protecting children from harmful content and activity, and putting in place age-appropriate protections. If there is a risk that children will encounter harm, such as self-harm content or through unknown or unverified users contacting them, service providers will need to put in place age-appropriate safety measures. Applying the user empowerment duties to child users runs counter to the Bill’s child safety objectives and may weaken the protections for children—for instance, by giving children an option to see content which is harmful to them or to engage with unknown, unverified users. While we recognise the concerns in this area, for the reasons I have set out, the Government do not agree with the need for this amendment.
I will resist the challenge of the noble Lord, Lord Knight, to talk about bots because I look forward to returning to that in discussing the amendments on future-proofing. With that, I invite noble Lords—
I noted the points made about the way information is pushed and, in particular, the speech of the right reverend Prelate. Nothing in the Government’s response has really dealt with that concern. Can the Minister say a few words about not the content but the way in which users are enveloped? On the idea that companies always act because they have a commercial imperative not to expose users to harmful material, actually, they have a commercial imperative to spread material and engage users. It is well recorded that a lot of that is in fact harmful material. Can the Minister speak a little more about the features rather than the content?
We will discuss this when it comes to the definition of content in the Bill, which covers features. I was struck by the speech by the right reverend Prelate about the difference between what people encounter online, and the analogy used by the noble Baroness, Lady Fox, about a bookshop. Social media is of a different scale and has different features which make that analogy not a clean or easy one. We will debate in other groups the accumulated threat of features such as algorithms, if the noble Baroness, Lady Kidron, will allow me to go into greater detail then, but I certainly take the points made by both the right reverend Prelate and the noble Baroness, Lady Fox, in their contributions.
Online Safety Bill Debate
Baroness Kidron (Crossbench - Life peer)
(1 year, 7 months ago)
Lords Chamber
Okay; I thank my noble friend for his response. However, I would just say that we never would have broken like that before 7.30 pm. I will leave it at that, but I will have a word with the usual channels.
My Lords, I rise to speak to Amendments 141 and 303 in the name of the noble Lord, Lord Stevenson. Before I do, I mention in passing how delighted I was to see Amendment 40, which carries the names of the Minister and the noble Lord, Lord Stevenson—may there be many more like that.
I am concerned that without Amendments 141 and 303, the concept of “verified” is not really something that the law can take seriously. I want to ask the Minister two rather technical questions. First, how confident can the Government and Ofcom be that with the current wording, Ofcom could form an assessment of whether Twitter’s current “verified by blue” system satisfies the duty in terms of robustness? If it does not, does Ofcom have the power to send it back to the drawing board? I am sure noble Lords understand why I raise this: we have recently seen “verified by blue” ticks successfully bought by accounts impersonating Martin Lewis, US Senators and Putin propagandists. My concern is that in the absence of a definition of verification in the Bill such as the one proposed in Amendments 141 and 303, where in the current wording does Ofcom have the authority to say that “verified by blue” does not satisfy the user verification duty?
I entirely understand what the noble Baroness is saying, and I know that she feels particularly strongly about these issues given her experiences. The whole Bill is about trying to weigh up different aspects—we are on day 5 now, and this has been very much the tenor of what we are trying to talk about in terms of balance.
I want to reassure the noble Baroness that we did discuss anonymity in relation to the issues that she has put forward. A company should not be able to use anonymity as an excuse not to deal with the situation, and that is slightly different from simply saying, “We throw our hands up on those issues”.
There was a difference between the fact that companies are using anonymity to say, “We don’t know who it is, and therefore we can’t deal with it”, and the idea that they should take action against people who are abusing the system and the terms of service. It is subtle, but it is very meaningful in relation to what the noble Baroness is suggesting.
That is a very fair description. We have tried to emphasise throughout the discussion on the Bill that it is about not just content but how the system and algorithms work in terms of amplification. In page 35 of our report, we try to address some of those issues—it is not central to the point about anonymity, but we certainly talked about the way that messages are driven by the algorithm. Obviously, how that operates in practice and how the Bill as drafted operates is what we are kicking the tyres on at the moment, and the noble Baroness is absolutely right to do that.
The Government’s response was reasonably satisfactory, but this is exactly why this group explores the definition of verification and so on, and tries to set standards for verification, because we believe that there is a gap in all this. I understand that this is not central to the noble Baroness’s case, but—believe me—the discussion of anonymity was one of the most difficult issues that we discussed in the Joint Committee, and you have to fall somewhere in that discussion.
Requiring platforms to allow users to see other users’ verification status is a crucial further pillar to user empowerment, and it provides users with a key piece of information about other users. Being able to see whether an account is verified would empower victims of online abuse or threats—I think this partly answers the noble Baroness’s question—to make more informed judgments about the source of the problem, and therefore take more effective steps to protect themselves. Making verification status visible to all users puts more choice in their hands as to how they manage the higher risks associated with non-verified and anonymous accounts, and offers them a lighter-touch alternative to filtering out all non-verified users entirely.
We on these Benches support the amendments that have been put forward. Amendment 141 aims to ensure that a user verification duty delivers in the way that the public and Government hope it will—by giving Ofcom a clear remit to require that the verification systems that platforms are required to develop in response to the duty are sufficiently rigorous and accessible to all users.
I was taken by what the noble Baroness, Lady Bull, said, particularly the case for Ofcom’s duties as regards those with disabilities. We need Ofcom to be tasked with setting out the principles and minimum standards, because otherwise platforms will try to claim, as verification, systems that do not genuinely verify a user’s identity, are unaffordable to ordinary users or use their data inappropriately.
Likewise, we support Amendment 303, which would introduce a definition of “user identity verification” into the Bill to ensure that we are all on the same page. In Committee in the House of Commons, Ministers suggested that “user identity verification” is an everyday term and so does not need a definition. That was not a convincing answer, and it is why this amendment—which no doubt the noble Baroness, Lady Merron, will speak to in more detail—is bang on point and particularly apt.
I heard what the noble Baroness, Lady Buscombe, had to say, but in many ways the amendment in the previous group in the name of the noble Lord, Lord Knight, met some of the noble Baroness’s concerns. As regards the amendment in the name of the noble Lord, Lord Moylan, we are all Wikipedia fans, so we all want to make sure that there is no barrier to Wikipedia operating successfully. I wonder whether perhaps the noble Lord is making quite a lot out of the Wikipedia experience, but I am sure the Minister will enlighten us all and will have a spot-on response for him.
I am sorry to interrupt the noble Lord. Is the answer to my question that the blue tick and the current Meta system will not be considered as verification under the terms of the Bill? Is that the implication of what he said?
Yes. The blue tick is certainly not identity verification. I will write to confirm on Meta, but they are separate and, as the example of blue ticks and Twitter shows, a changing feast. That is why I am talking in general terms about the approach, so as not to rely too much on examples that are changing even in the course of this Committee.
Government Amendment 43A stands in my name. This clarifies that “non-verified user” refers to users whether they are based in the UK or elsewhere. This ensures that, if a UK user decides he or she no longer wishes to interact with non-verified users, this will apply regardless of where they are based.
Finally, Amendment 106 in the name of my noble friend Lady Buscombe would make an addition to the online safety objectives for regulated user-to-user services. It would amend them to make it clear that one of the Bill’s objectives is to protect people from communications offences committed by anonymous users.
The Bill already imposes duties on services to tackle illegal content. Those duties apply across all areas of a service, including the way it is designed and operated. Platforms will be required to take measures—for instance, changing the design of functionalities, algorithms, and other features such as anonymity—to tackle illegal content.
Ofcom is also required to ensure that user-to-user services are designed and operated to protect people from harm, including with regard to functionalities and other features relating to the operation of their service. This will likely include the use of anonymous accounts to commit offences in the scope of the Bill. My noble friend’s amendment is therefore not needed. I hope she will be satisfied not to press it, along with the other noble Lords who have amendments in this group.
My Lords, this is my first opportunity to speak in Committee on this important Bill, but I have followed it very closely, and the spirit in which constructive debate has been conducted has been genuinely exemplary. In many ways, it mirrors the manner in which the Joint Committee, on which I had the privilege to serve with other noble Lords, was conducted, and its report rightly has influenced our proceedings in so many ways. I declare an interest as deputy chairman of Telegraph Media Group, which is a member of the News Media Association, and a director of the Regulatory Funding Company, and note my other interests as set out in the register.
I will avoid the temptation to ruminate philosophically, as the noble Baroness, Lady Fox, entertained us by doing. I will speak to Amendment 48, in the name of the noble Lord, Lord Stevenson of Balmacara, and the other amendments which impact on the definition of “recognised news publisher”. As the noble Lord said, his amendments are pretty robust in what they seek to achieve, but I am very pleased that he has tabled them, because it is important that we have a debate about how the Bill impacts on freedom of expression—I use that phrase advisedly—and press and media freedom. The noble Lord’s aims are laudable, but his amendments do not quite deliver what he intends.
I will explain why it is important that Clauses 13 and 14 stand part of the Bill, and without amendments of the sort proposed. The Joint Committee considered this issue in some detail and supported the inclusion of the news publisher content exemption. These clauses are crucial to the whole architecture of the Bill because they protect news publishers from being dragged into an onerous regime of statutory content control. The press—these clauses cover the broadcasters too—have not been subject to any form of statutory regulation since the end of the 17th century. That is what we understand by press freedom: that the state and its institutions do not have a role in controlling or censoring comment. Clauses 13 and 14 protect that position and ensure that the media, which is of course subject to rigorous independent standard codes as well as to criminal and civil law, does not become part of a system of state regulation by the back door because of its websites and digital products.
That is what is at the heart of these clauses. However, it is not a carte blanche exemption without caveats. As the Joint Committee looked at, and as we have heard, to qualify for it, publishers must meet stringent criteria, as set out in Clause 50, which include being subject to standards codes, having legal responsibility for material published, having effective policies to handle complaints, and so on. It is exactly the same tough definition as was set out in the National Security Bill, which noble Lords across the House supported when it was on Report here.
Without such clear definitions, alongside requirements not to take down or restrict access to trusted news sources without notification, opaque algorithms conjured up in Silicon Valley would end up restricting the access of UK citizens to news, with scant meaningful scope for reinstating it given the short shelf life of news. Ultimately, that would have a profound impact on the public’s right to access news, something which the noble Baroness rightly highlighted. That is why the Joint Committee recommended, at paragraph 304 of its report, that the Bill was
“strengthened to include a requirement that news publisher content should not be moderated, restricted or removed unless it is content the publication of which clearly constitutes a criminal offence, or which has been found to be unlawful by order of a court within the appropriate jurisdiction”.
The Government listened to that concern that the platforms would put themselves in the position of censor on issues of democratic importance, and quite rightly amended the draft Bill to deal with that point. Without it, instead of trusted, curated, regulated news comment, from the BBC to the Guardian to the Manchester Evening News, news would end up being filtered by Google and Facebook. That would be a crushing blow to free speech, to which all noble Lords are absolutely committed.
So, instead of these clauses acting as a bulwark against disinformation by protecting content of democratic importance, the amendments would weaken the position of trusted news providers by introducing too much ambiguity into the system. As we all know, ambiguity brings with it legal challenge and constant controversy. This is especially so given that the exemptions that we are talking about already exist in statute elsewhere, which would cause endless confusion.
I understand the rationale behind many of the amendments, but I fear they would not work in practice. Free speech—and again I use the words advisedly—is a very delicate bloom, which can easily be swept away by badly drafted, uncertain or opaque laws. Its protection needs certainty, which is what the Bill, as it stands, provides. A general catch-all clause would be subject, I fear, to endless argument with the platforms, which are well known for such tactics and for endless legal wrangling.
I noted the remarks of the noble Lord, Lord Stevenson of Balmacara, in his superb speech on the opening day in Committee, when he said that one issue with the Bill is that it
“is very difficult to understand, in part because of its innate complexity and in part because it has been revised so often”. [Official Report, 19/4/23; col. 700.]
He added, in a welcome panegyric to clarity and concision, that given that it is a long and complex Bill, why would we add to it? I agree absolutely with him, but those are arguments for not changing the Bill in the way he proposes. I believe the existing provisions are clear and precise, practical and carefully calibrated. They do not leave room for doubt, and protect media freedom, investigative journalism and the citizen’s right to access authoritative news, which is why I support the Bill as it stands.
My Lords, given the lateness of the hour, I will make just three very brief points. The first is that I find it really fascinating that the amendments in the name of the noble Baroness, Lady Stowell, come from a completely different perspective but still demand transparency over what is going on. I fully support the formulation that she has found, and I think that in many ways her amendments are better than those which came from the other perspective. But what I urge the Minister to hear is that we all seek transparency over what is going on.
Secondly, in many of the amendments—I think I counted about 14 or 15 in the names of the noble Lords, Lord Moylan and Lord Kamall—there is absolutely nothing I disagree with. My problem with these amendments really goes back to the debate we had on the first day on Amendment 1, in the name of the noble Lord, Lord Stevenson. He set out the purposes of the Bill, and the Minister gave what most Members of your Lordships’ House considered to be the groundwork of an excellent alternative, in the language of government. It appears, as we go on, that many dozens of amendments could be dropped in favour of this purposive clause, which itself could include reference to human rights, children’s rights, the Equality Act, the importance of freedom of expression under the law, and so on. I urge the Minister to consider the feeling of the House: the things said at the Dispatch Box, again and again, to be implicit, the House requires to be explicit. This is one way we could do it, in short form, as the noble Lord, Lord Black, just urged us.
Thirdly, I do have to speak against Amendment 294. I would be happy to take the noble Lord, Lord Moylan, through dozens of studies that show the psychological impact of online harms: systems that groom users to gamble, that reward them for being online at any cost to their health and well-being, that profile them to offer harmful material, and more of the same whether they ask for it or not, and so on. I am also very happy to put some expert voices at his disposal, but I will just say this: the biggest clue as to why this amendment is wrongheaded is the number of behavioural psychologists that are employed by the tech sector. They are there, trying to get at our behaviours and thoughts; they anticipate our move and actually try to predict and create the next move. That is why we have to have psychological harm in the Bill.
Online Safety Bill Debate
Baroness Kidron (Crossbench - Life peer)
(1 year, 7 months ago)
Lords Chamber
I thank the noble Lord for his intervention. He has made me think of the fact that a particular area where this may be of grave concern is cosmetic procedures, which I think we debated during the passage of the Health and Care Act. These things are all interrelated, and it is important that we see them in an interrelated way as part of what is now the health system.
My Lords, I will speak to a number of amendments in this group. I want to make the point that misinformation and disinformation were probably the issues we struggled with most in the pre-legislative committee. We recognised the extraordinary harm they do, but also—as the noble Baroness, Lady Fox, said—that there is no one great truth. However, algorithmic spread and the drip, drip, drip of material that is not based on any search criteria or expression of an opinion, but simply gives you more of the same, particularly the most shocking, move very marginal views into the mainstream.
I am concerned that our debates over the last five days have concentrated so much on content, and that the freedom we seek does not take enough account of the way in which companies currently exercise control over the information we see. Correlations such as “men who like barbecues are also susceptible to conspiracy theories” are then exploited to spread toxic theories that end in real-world harm, or political tricks that show, for example, the Democrats as a paedophile group. Only last week I saw a series of pictures, presented as “evidence”, of President Biden caught in a compromising situation, lending credence to that lie. As Maria Ressa, who won the Nobel Peace Prize for her contribution to freedom of expression, said in her acceptance speech:
“Tech sucked up our personal experiences and data, organized it with artificial intelligence, manipulated us with it, and created behavior at a scale that brought out the worst in humanity”.
That is the background to this set of amendments that we must take seriously.
As the noble Lord, Lord Bethell, said, Amendment 52 will ensure that platforms undertake a health misinformation risk assessment and provide a clear policy on dealing with harmful, false and misleading information. I put it to the Committee that, without this requirement, we will keep the status quo in which clicks are king, not health information.
It is a particular pleasure to support the noble Lord, Lord Moylan, on his Amendments 59 and 107. Like him, I am instinctively against taking material down. There are content-neutral ways of marking or questioning material, offering alternatives and signposting to diverse sources—not only true but diverse. These can break this toxic drip feed for long enough for people to think before they share, post and make personal decisions about the health information that they are receiving.
I am not incredibly thrilled by a committee for every occasion but, since the Bill is silent on the issue of misinformation and disinformation—which will clearly be supercharged by the rise of large language models—it would be good to give a formal role to this advisory committee, so that it can make a meaningful and formal contribution to Ofcom as it develops not only this code of conduct but all codes of conduct.
Likewise, I am very supportive of Amendment 222, which seeks independence for the chair of the advisory body. I have seen at first hand how a combination of regulatory capture and a very litigious sector with deep pockets slows down progress and transparency. While the independence of the chair should be a given, our collective lived experience would suggest otherwise. This amendment would make that requirement clear.
Finally, and in a way most importantly, Amendment 224 would allow Ofcom to consider after the fact whether the code of conduct is necessary. This strikes a balance between adding to its current workload, which we are trying not to do, and tying one of its hands behind its back in the future. I would be grateful to hear from the Minister why we would not give Ofcom this option as a reasonable piece of future-proofing, given that this issue will become ever more important as AI creates layers of misinformation and disinformation at scale.
My Lords, I support Amendment 52, tabled by my noble friend Lady Merron. This is an important issue which must be addressed in the Bill if we are to make real progress in making the internet a safer space, not just for children but for vulnerable adults.
We have the opportunity to learn lessons from the pandemic, where misinformation had a devastating impact, spreading rapidly online like the virus and threatening to undermine the vaccine rollout. If the Government had kept their earlier promise to include protection from harmful false health content in their indicative list of harmful content that companies would have been required to address under the now removed adult safety duties, these amendments would not be necessary.
It is naive to think that platforms will behave responsibly. Currently, they are left to their own devices in how they tackle health misinformation, without appropriate regulatory oversight. They can remove it at scale or leave it completely unchecked, as illustrated by Twitter’s decision to stop enforcing its Covid-19 misinformation policies, as other noble Lords have pointed out.
It is not a question of maintaining free speech, as some might argue. It was the most vulnerable groups who suffered from the spread of misinformation online—pregnant women and the BAME community, who had higher illness rates. Studies have shown that proportionately more of them died, not just because they were front-line workers but because of rumours spread in the community which resulted in vaccine hesitancy, with devastating consequences. As other noble Lords have pointed out, in 2021 the Royal College of Obstetricians and Gynaecologists found that only 42% of women who had been offered the vaccine accepted it, and in October that year one in five of the most critically ill Covid patients were unvaccinated pregnant women. That is a heartbreaking statistic.
Unfortunately, it is not just vaccine fears that are spread on the internet. Other harmful theories can affect patients with cancer, mental health issues and sexual health issues, and, most worryingly, can affect children’s health. Rumours and misinformation play on the minds of the most vulnerable. The Government have a duty to protect people, and by accepting this amendment they would go some way to addressing this.
Platforms must undertake a health misinformation risk assessment and have a clear policy on dealing with harmful, false and misleading health information in their terms of service. They have the money and the expertise to do this, and Parliament must insist. As my noble friend Lady Merron said, I do not think that the Minister can say that the false communications offence in Clause 160 will address the problem, as it covers only a user sending a knowingly false communication with the intention of causing harm. The charity Full Fact has stated that this offence will exclude most health misinformation that it monitors online.
My Lords, this debate has demonstrated the diversity of opinion regarding misinformation and disinformation—as the noble Lord said, the Joint Committee gave a lot of thought to this issue—as well as the difficulty of finding the truth of very complex issues while not shutting down legitimate debate. It is therefore important that we legislate in a way that takes a balanced approach to tackling this, keeping people safe online while protecting freedom of expression.
The Government take misinformation and disinformation very seriously. From Covid-19 to Russia’s use of disinformation as a tool in its illegal invasion of Ukraine, it is a pervasive threat, and I pay tribute to the work of my noble friend Lord Bethell and his colleagues in the Department of Health and Social Care during the pandemic to counter the cynical and exploitative forces that sought to undermine the heroic effort to get people vaccinated and to escape from the clutches of Covid-19.
We recognise that misinformation and disinformation come in many forms, and the Bill reflects this. Its focus is rightly on tackling the most egregious, illegal forms of misinformation and disinformation, such as content which amounts to the foreign interference offence or which is harmful to children—for instance, that which intersects with named categories of primary priority or priority content.
That is not the only way in which the Bill seeks to tackle it, however. The new terms of service duties for category 1 services will hold companies to account over how they say they treat misinformation and disinformation on their services. However, the Government are not in the business of telling companies what legal content they can and cannot allow online, and the Bill should not and will not prevent adults accessing legal content. In addition, the Bill will establish an advisory committee on misinformation and disinformation to provide advice to Ofcom on how they should be tackled online. Ofcom will be given the tools to understand how effectively misinformation and disinformation are being addressed by platforms through transparency reports and information-gathering powers.
Amendment 52 from the noble Baroness, Lady Merron, seeks to introduce a new duty on platforms in relation to health misinformation and disinformation for adult users, while Amendments 59 and 107 from my noble friend Lord Moylan aim to introduce new proportionality duties for platforms tackling misinformation and disinformation. The Bill already addresses the most egregious types of misinformation and disinformation in a proportionate way that respects freedom of expression by focusing on misinformation and disinformation that are illegal or harmful to children.
I am curious as to what the Bill says about misinformation and disinformation in relation to children. My understanding of primary priority and priority harms is that they concern issues such as self-harm and pornography, but do they say anything specific about misinformation of the kind we have been discussing and whether children will be protected from it?
I am sorry—I am not sure I follow the noble Baroness’s question.
Twice so far in his reply, the Minister has said that this measure will protect children from misinformation and disinformation. I was just curious because I have not seen any sight of that, either in discussions or in the Bill. I was making a distinction regarding harmful content that we know the shape of—for example, pornography and self-harm, which are not, in themselves, misinformation or disinformation of the kind we are discussing now. It is news to me that children are going to be protected from this, and I am delighted, but I was just checking.
Yes, that is what the measure does—for instance, where it intersects with the named categories of primary priority or priority content in the Bill, although that is not the only way the Bill does it. This will be covered by non-designated content that is harmful to children. As we have said, we will bring forward amendments on Report—which is perhaps why the noble Baroness has not seen them in the material in front of us—regarding material harms to children, and they will provide further detail and clarity.
Returning to the advisory committee that the Bill sets up and the amendments from the noble Baroness, Lady Merron, and my noble friend Lord Moylan, all regulated service providers will be forced to take action against illegal misinformation and disinformation in scope of the Bill. That includes the new false communication offences in the Bill that will capture communications where the sender knows the information to be false but sends it intending to cause harm—for example, hoax cures for a virus such as Covid-19. The noble Baroness is right to say that that is a slightly different approach from the one taken in her amendment, but we think it an appropriate and proportionate response to tackling damaging and illegal misinformation and disinformation. If a platform is likely to be accessed by children, it will have to protect them from encountering misinformation and disinformation content that meets the Bill’s threshold for content that is harmful to children. Again, that is an appropriate and proportionate response.
Turning to the points made by my noble friend Lord Moylan and the noble Baroness, Lady Fox, services will also need to have particular regard to freedom of expression when complying with their safety duties. Ofcom will be required to set out steps that providers can take when complying with their safety duties in the codes of practice, including what is proportionate for different providers and how freedom of expression can be protected.
My Lords, it is a pleasure to follow the noble Baroness, Lady Prashar, and I join her in thanking the noble Lord, Lord Knight, for introducing this group very clearly.
In taking part in this debate, I declare a joint interest with the noble Baroness, Lady Fox, in that I was for a number of years a judge in the Debating Matters events to which she referred. Indeed, the noble Baroness was responsible for me ending up in Birmingham jail, during the time that such a debate was conducted with the inmates of Birmingham jail. We have a common interest there.
I want to pick up a couple of additional points. Before I joined your Lordships’ Committee today, I was involved in the final stages of the Committee debate on the economic crime Bill, where the noble Lord, Lord Sharpe of Epsom, provided a powerful argument—probably unintentionally—for the amendments we are debating here now. We were talking, as we have at great length in the economic crime Bill, about the issue of fraud. As the noble Lord, Lord Holmes of Richmond, highlighted, in the context of online harms, fraud is a huge aspect of people’s lives today and one that has been under-covered in this Committee, although it has very much been picked up in the economic crime Bill Committee. As we were talking about online fraud, the noble Lord, Lord Sharpe of Epsom, said that consumers have to be “appropriately savvy”. I think that is a description of the need for education and critical thinking online, equipping people with the tools to be, as he said, appropriately savvy when facing the risks of fraud and scams, and all the other risks that people face online.
I have attached my name to two amendments here: Amendment 91, which concerns the providers of category 1 and 2A services having a duty, and Amendment 236, which concerns an Ofcom duty. This joins together two aspects. The providers are making money out of the services they provide, which gives them a duty to make some contribution to combatting the potential harms that their services present to people. Ofcom as a regulator obviously has a role. I think it was the noble Lord, Lord Knight, who said that the education system also has a role, and there is some reference in here to Ofsted having a role.
What we need is a cross-society, cross-systems approach. This is where I also make the point that we need to think beyond the scope of the Bill—it is part of the whole package—about how the education system works, because media literacy is not a stand-alone thing that can be separated from critical thinking more broadly. We need to think about our education system which, far too often in schools in particular, gets pupils to learn and regurgitate a set of facts and then rewards them for that. We need to think about how our education system prepares children for the modern online world.
There is a great deal we can learn from the example—often cited but worth referring to—of Finland, which by various tests has been ranked as the country most resistant to fake news. A clearly built-in culture of questioning, scrutiny and challenge is encouraged among pupils, starting from the age of seven. That is something we need to transform our education system to achieve. However, many people using the internet now are not part of our education system, so this needs to happen across our society. A focus on the responsibilities of Ofcom and the providers has to be in the Bill.
My Lords, over the last decade, I have been in scores of schools, run dozens of workshops and spoken to literally thousands of children and young people. A lot of what I pass off as my own wisdom in this Chamber is, indeed, their wisdom. I have a couple of points, and I speak really from the perspective of children under 18 with regard to these amendments, which I fully support.
Media literacy—or digital literacy, as it is sometimes called—is not the same as e-safety. E-safety regimes concentrate on the behaviour of users. Very often, children say that what they learn in those lessons is focused on adult anxieties about predators and bullies, and when something goes wrong, they feel that they are to blame. It puts the responsibility on children. This response, which I have heard hundreds of times, normally comes up after a workshop in which we have discussed reward loops, privacy, algorithmic bias, profiling or—my own favourite—a game which reveals what is buried in terms and conditions; for example, that a company has a right to record the sound of a device or share their data with more than a thousand other companies. When young people understand the pressures that they are under and which are designed into the system, they feel much better about themselves and rather less enamoured of the services they are using. It is my experience that they then go on to make better choices for themselves.
Secondly, we have outsourced much of digital literacy to companies such as Google and Meta. They too concentrate on user behaviour, rather than looking at their own extractive policies focused on engagement and time spent. With many schools strapped for cash and expertise, this teaching is widespread. However, when I went to a Google-run assembly, children aged nine were being taught about features available only on services for those aged over 13—and nowhere was there a mention of age limits and why they are important. It cannot be right that the companies are grooming children towards their services without taking full responsibility for literacy, if that is the literacy that children are being given in school.
Thirdly, as the Government’s own 2021 media literacy strategy set out, good media literacy is one line of defence from harm. It could make a crucial difference in people making informed and safe decisions online and engaging in a more positive online debate, at the same time as understanding that online actions have consequences offline.
However, while digital literacy and, in particular, critical thinking are fundamental to a contemporary education and should be available throughout school and far beyond, they must not be used as a way of putting responsibility on the user for companies’ design decisions. I am specifically concerned that, in the risk-assessment process, digital literacy is one of the ways in which a company can say it has mitigated a potential risk or harm. I should like to hear from the Minister that this is an additional responsibility, not a substitute for responsibility.
Finally, over all these years I have always asked at the end of each session what the young people care about most. The second most important thing is that the system should be less addictive—that it should have less addiction built into it. Again, I point the Committee in the direction of the safety-by-design amendments in the name of my noble friend Lord Russell, which try to get to the crux of that. They are not very exciting amendments in this debate, but they get to the heart of it. However, the thing the young people most often say is, “Could you do something to get my parents to put down their phones?” I therefore ask the Minister whether he can slip something into the Bill, and indeed ask the noble Lord, Lord Grade, whether that could emerge somewhere in the guidance. That is what young people want.
My Lords, I strongly support the amendments in the name of my noble friend Lord Knight and others in this group.
We cannot entirely contain harmful, misleading and dangerous content on the internet, no matter how much we strengthen the Bill. Therefore, it is imperative that we put a new duty on category 1 and category 2A services to require them to put in place measures to promote the media literacy of users so that they can use the service safely.
I know that Ofcom takes the issue of media literacy seriously, but it is regrettable that the Government have dropped their proposal for a new media literacy duty for Ofcom. So far, I see no evidence that the platforms take media literacy seriously, so they need to be made to understand that they have corporate social responsibilities towards their clients.
Good media literacy is the first line of defence from bad information and the kind of misinformation we have discussed in earlier groups. Schools are trying to prepare their pupils to understand that the internet can peddle falsehoods as well as useful facts, but they need support, as the noble Baroness, Lady Kidron, just said. We all need to increase our media literacy, especially with the increasing use of artificial intelligence, as it can make the difference between decisions based on sound evidence and decisions based on poorly informed opinions that can harm health and well-being, social cohesion and democracy.
In 2022, Ofcom found that a third of internet users are unaware of the potential for inaccurate or biased information online, and that 61% of social media users who say they are confident in judging whether online content is true or false actually lack the skills to do so, as my noble friend Lord Knight has pointed out.
Amendment 91 would mean that platforms have to instigate measures to give users an awareness and understanding of the nature and characteristics of the content that may be on the service, its potential impact and how platforms operate. That is a sensible and practical request that is not beyond the ability of companies to provide, and it will be to everyone’s benefit.
My Lords, this has been a good debate. I am glad that a number of noble Lords mentioned Lord Puttnam and the committee that he chaired for your Lordships’ House on democracy and digital technologies. I responded to the debate that we had on that; sadly, it was after he had already retired from your Lordships’ House, but he participated from the steps of the Throne. I am mindful of that report and the lessons learned in it in the context of the debate that we have had today.
We recognise the intent behind the amendments in this group to strengthen the UK’s approach to media literacy in so far as it relates to services that will be regulated by the Bill. Ofcom has a broad duty to promote media literacy under the Communications Act 2003. That is an important responsibility for Ofcom, and it is right that the regulator is able to adapt its approach to support people in meeting the evolving challenges of the digital age.
Amendments 52A and 91 from the noble Lord, Lord Knight, and Amendment 91A from the noble Lord, Lord Holmes of Richmond, seek to introduce duties on in-scope services, requiring them to put in place measures that promote users’ media literacy, while Amendment 98 tabled by the noble Lord, Lord Knight, would require Ofcom to issue a code of practice in relation to the new duty proposed in his Amendment 91. While we agree that the industry has a role to play in promoting media literacy, the Government believe that these amendments could lead to unintended, negative consequences.
I shall address the role of the industry and media literacy, which the noble Baroness, Lady Kidron, dwelt on in her remarks. We welcome the programmes that it runs in partnership with online safety experts such as Parent Zone and Internet Matters and hope they continue to thrive, with the added benefit of Ofcom’s recently published evaluation toolkit. However, we believe that platforms can go further to empower and educate their users. That is why media literacy has been included in the Bill’s risk assessment duties, meaning that regulated services will have to consider measures to promote media literacy to their users as part of the risk assessment process. Additionally, through work delivered under its existing media literacy duty, Ofcom is developing a set of best-practice design principles for platform-based media literacy measures. That work will build an evidence base of the most effective measures that platforms can take to build their users’ media literacy.
In response to the noble Baroness’s question, I say: no, platforms will not be able to avoid putting in place protections for children by using media literacy campaigns. Ofcom would be able to use its enforcement powers if a platform was not achieving appropriate safety outcomes. There are a range of ways in which platforms can mitigate risks, of which media literacy is but one, and Ofcom would expect platforms to consider them all in their risk assessments.
Let me say a bit about the unintended consequences we fear might arise from these amendments. First, the resource demands to create a code of practice and then to regulate firms’ compliance with this type of broad duty will place an undue burden on the regulator. It is also unclear how the proposed duties in Amendments 52A, 91 and 91A would interact with Ofcom’s existing media literacy duty. There is a risk, we fear, that these parallel duties could be discharged in conflicting ways. Amendment 91A is exposed to broad interpretation by platforms and could enable them to fulfil the duty in a way that lacked real impact on users’ media literacy.
The amendment in the name of my noble friend Lord Holmes proposes a duty to promote awareness of financial deception and fraud. The Government are already taking significant action to protect people from online fraud, including through their new fraud strategy and other provisions in this Bill. I know that my noble friends Lord Camrose, Lord Sharpe of Epsom and Lady Penn met noble Lords to talk about that earlier this week. We believe that measures such as prompts for users before they complete financial transactions sit more logically with financial service providers than with services in scope of this Bill.
Amendment 52A proposes a duty on carriers of journalistic content to promote media literacy to their users. We do not want to risk requiring platforms to act as de facto press regulators, assessing the quality of news publishers’ content. That would not be compatible with our commitment to press freedom. Under its existing media literacy duty, Ofcom is delivering positive work to support people to discern high-quality information online. It is also collaborating with the biggest platforms to design best practice principles for platform-based media literacy measures. It intends to publish these principles this year and will encourage platforms to adopt them.
It is right that Ofcom is given time to understand the benefits of these approaches. The Secretary of State’s post-implementation review will allow the Government and Parliament to establish the effectiveness of Ofcom’s current approach and to reconsider the role of platforms in enhancing users’ media literacy, if appropriate. In the meantime, the Bill introduces new transparency-reporting and information-gathering powers to enhance Ofcom’s visibility of platforms’ delivery and evaluation of media literacy activities. We would not want to see amendments that inadvertently dissuaded platforms from delivering these activities in favour of less costly and less effective measures.
My noble friend Lord Holmes asked about the Online Media Literacy Strategy, published in July 2021, which set out the Government’s vision for improving media literacy in the country. Alongside the strategy, we have committed to publishing annual action plans each financial year until 2024-25, setting out how we meet the ambition of the strategy. In April 2022 we published the Year 2 Action Plan, which included extending the reach of media literacy education to those who are currently disengaged, in consultation with the media literacy task force—a body of 17 cross-sector experts—expanding our grant funding programme to provide nearly £2.5 million across two years for organisations delivering innovative media literacy activities, and commissioning research to improve our understanding of the challenges faced by the sector. We intend to publish the research later this year, for the benefit of civil society organisations, technology platforms and policymakers.
The noble Lord, Lord Knight, in his Amendment 186, would stipulate that Ofcom must levy fees on regulated firms sufficient to fund the work of third parties involved in supporting it to meet its existing media literacy duties. The Bill already allows Ofcom to levy fees sufficient to fund the annual costs of exercising its online safety functions. This includes its existing media literacy duty as far as it relates to services regulated by this Bill. As such, the Bill already ensures that these media literacy activities, including those that Ofcom chooses to deliver through third parties, can be funded through fees levied on industry.
I turn to Amendments 188, 235, 236, 237 and 238. The Government recognise the intent behind these amendments, which is to help improve the media literacy of the general public. Ofcom already has a statutory duty to promote media literacy with regard to the publication of anything by means of electronic media, including services in scope of the Bill. These amendments propose rather prescriptive objectives, either as part of a new duty for Ofcom or through updating its existing duty. They reflect current challenges in the sector but run the risk of becoming obsolete over time, preventing Ofcom from adapting its work in response to emerging issues.
Ofcom has demonstrated flexibility in its existing duty through its renewed Approach to Online Media Literacy, launched in 2021. This presented an expanded media literacy programme, enabling it to achieve almost all the objectives specified in this group. The Government note the progress that Ofcom has already achieved under its renewed approach in the annual plan it produced last month. The Online Safety Bill strengthens Ofcom’s functions relating to media literacy, which is included in Ofcom’s new transparency-reporting and information-gathering powers; these will give it enhanced oversight of industry activity by enabling it to require regulated services to share or publish information about the work that they are doing on media literacy.
The noble Baroness, Lady Prashar, asked about the view expressed by the Joint Committee on minimum standards for media literacy training. We agree with the intention behind that, but, because of the broad and varied nature of media literacy, we do not believe that introducing minimum standards is the most effective way of achieving that outcome. Instead, we are focusing efforts on improving the evaluation practices of media literacy initiatives to identify which ones are most effective and to encourage their delivery. Ofcom has undertaken extensive work to produce a comprehensive toolkit to support practitioners to deliver robust evaluations of their programmes. This was published in February this year and has been met with praise from practitioners, including those who received grant funding from the Government’s non-legislative media literacy work programme. The post-implementation review of Ofcom’s online safety regime, which covers its existing media literacy duty in so far as it relates to regulated services, will provide a reasonable point at which to establish the effectiveness of Ofcom’s new work programme, after giving it time to take effect.
Noble Lords talked about the national curriculum and media literacy in schools. Media literacy is indeed a crucial skill for everyone in the digital age. Key media literacy skills are already taught through a number of compulsory subjects in the national curriculum. Digital literacy is included in the computing national curriculum in England, which equips pupils with the knowledge, understanding and skills to use information and communication technology creatively and purposefully. I can reassure noble Lords that people such as Monica are being taught not about historic things like floppy disks but about emerging and present challenges; the computing curriculum ensures that pupils are taught how to design programs and systems and to accomplish goals such as collecting, analysing, evaluating and presenting data.
Does the Minister know how many children are on computing courses?
My Lords, I had the great privilege of serving as a member of this House’s Fraud Act 2006 and Digital Fraud Committee under the excellent chairing of the noble Baroness, Lady Morgan. She has already told us of the ghastly effects that fraud has on individuals and indeed its adverse effects on businesses. We heard really dramatic statistics, such as when Action Fraud told us that 80% of fraud is cyber-enabled.
Many of us here will have been victims of fraud—I have been a victim—or know people who have been victims of fraud. I was therefore very pleased when the Government introduced the fraudulent advertising provisions into the Bill, which will go some way to reducing the prevalence of online fraud. It seems to me that it requires special attention, which is what these amendments should do.
We heard in our inquiry about the problems that category 1 companies had in taking down fraudulent advertisements quickly. Philip Milton, the public policy manager at Meta, told us that it takes between 24 and 48 hours to review possibly harmful content after it has been flagged to the company. He recognised that, due to the deceptive nature of fraudulent advertising, Meta’s systems do not always recognise that advertising is fraudulent and, therefore, take-down rates would be variable. That is one of the most sophisticated tech platforms—if it has difficulties, just imagine the difficulty that other companies have in both recognising and taking down fraudulent advertising.
Again and again, the Bill recognises the difficulties that platforms have in systematising the protections provided in the Bill. Fraud has an ever-changing nature and is massively increasing—particularly so for fraudulent advertising. It is absolutely essential that the highest possible levels of transparency are placed upon the tech companies to report their response to fraudulent advertising. Both Ofcom and users need to be assured that not only do the companies have the most effective reporting systems but, just as importantly, they have the most effective transparency to check how well they are performing.
To do this, the obligations on platforms must go beyond the transparency-reporting requirements in the Bill. These amendments would ensure that they include obligations to provide information on the incidence of fraudulent advertising, in line with other types of priority illegal content. These increased obligations are part of checking the effectiveness of the Bill as it is implemented.
The noble Baroness, Lady Stowell, told us on the fifth day of Committee, when talking about the risk-assessment amendments she had tabled:
“They are about ensuring transparency to give all users confidence”.—[Official Report, 9/5/23; col. 1755.]
Across the Bill, noble Lords have repeatedly stated that there needs to be a range of ways to judge how effectively the protections provided are working. I suggest to noble Lords that these amendments are important attempts to help make the Bill more accountable and provide the data to future-proof the harms it is trying to deal with. As we said in the committee report:
“Without sufficient futureproofing, technology will most likely continue to create new opportunities for fraudsters to target victims”.
I ask the Minister to at least look at some of these amendments favourably.
My Lords, I shall say very briefly in support of these amendments that in 2017, the 5Rights Foundation, of which I am the chair, published the Digital Childhood report, which in a way was the thing that put the organisation on the map. The report looked at the evolving capacity of children through childhood, what technology they were using, what happened to them and what the impact was. We are about to release the report again, in an updated version, and one of the things that is most striking is the introduction of fraud into children’s lives. At the point at which they are evolving into autonomous people, when they want to buy presents for their friends and parents on their own, they are experiencing what the noble Baroness, Lady Morgan, expressed as embarrassment, loss of trust and a sense of deserting confidence—I think that is probably the phrase. So I just want to put on the record that this is a problem for children also.
My Lords, this has been an interesting short debate and the noble Baroness, Lady Morgan, made a very simple proposition. I am very grateful to her for introducing this so clearly and comprehensively. Of course, it is all about the way that platforms will identify illegal, fraudulent advertising and attempt to align it with other user-to-user content in terms of transparency, reporting, user reporting and user complaints. It is a very straightforward proposition.
First of all, however, we should thank the Government for acceding to what the Joint Committee suggested, which was that fraudulent advertising should be brought within the scope of the Bill. But, as ever, we want more. That is what it is all about and it is a very straightforward proposition which I very much hope the Minister will accede to.
We have heard from around the Committee about the growing problem and I will be very interested to read the report that the noble Baroness, Lady Kidron, was talking about, in terms of the introduction of fraud into children’s lives—that is really important. The noble Baroness, Lady Morgan, mentioned some of the statistics from Clean Up the Internet, Action Fraud and so on, as did the noble Viscount, Lord Colville. And, of course, it is now digital. Some 80% of fraud, as he said, is cyber-enabled, and 23% of all reported frauds are initiated on social media—so this is bang in the area of the Bill.
It has been very interesting to see how some of the trade organisations, the ABI and others, have talked about the impact of fraud, including digital fraud. The ABI said:
“Consumers’ confidence is being eroded by the ongoing proliferation of online financial scams, including those predicated on impersonation of financial service providers and facilitated through online advertising. Both the insurance and long-term savings sectors are impacted by financial scams perpetrated via online paid-for advertisements, which can deprive vulnerable consumers of their life savings and leave deep emotional scars”.
So, this is very much a cross-industry concern and very visible to the insurance industry and no doubt to other sectors as well.
I congratulate the noble Baroness, Lady Morgan, on her chairing of the fraud committee and on the way it came to its conclusions and scrutinised the Bill. Paragraphs 559, 560 and 561 all set out where the Bill needs to be aligned with the other content that it covers. As she described, there are two areas where the Bill can be improved. If they are not addressed, they will substantially undermine its ability to tackle online fraud effectively.
This has the backing of Which? As the Minister will notice, it is very much a cross-industry and consumer body set of amendments, supporting transparency reporting and making sure that those platforms with more fraudulent advertising make proportionately larger changes to their systems. That is why there is transparency reporting for all illegal harms that platforms are obliged to prevent. There is no reason why advertising should be exempt. On user reporting and complaints, it is currently unclear whether this applies only to illegal user-generated content and unpaid search content or if it also applies to illegal fraudulent advertisements. At the very least, I hope the Minister will clarify that today.
Elsewhere, the Bill requires platforms to allow users to complain if the platform fails to comply with its duties to protect users from illegal content and with regard to the content-reporting process. I very much hope the Minister will accede to including that as well.
Some very simple requests are being made in this group. I very much hope that the Minister will take them on board.
Online Safety Bill Debate
Baroness Kidron (Crossbench - Life peer)
(1 year, 7 months ago)
Lords Chamber
My Lords, I will speak in support of Amendments 250A and 250B; I am not in favour of Amendment 56, which is the compromise amendment. I thank the noble Baroness, Lady Newlove, for setting out the reasons for her amendments in such a graphic form. I declare an interest as a member of the Expert Group on an Individual Complaints Mechanism for the Government of Ireland.
The day a child or parent in the UK has a problem with an online service and realises that they have nowhere to turn is the day that the online safety regime will be judged to have failed in the eyes of the public. Independent redress is a key plank of any regulatory system. Ombudsmen and independent complaint systems are available across all sectors, from finance and health to utilities and beyond. As the noble Lord, Lord Stevenson, set out, they are part of all the tech regulation that has been, or is in the process of being, introduced around the world.
I apologise in advance if the Minister is minded to agree to the amendment but, given that the Government have so far conceded only a single word in a full six days in Committee, I dare to anticipate that this is not the case, and I suggest three things that he may say against the amendment: first, that any complaints system will be overwhelmed; secondly, that it will offer companies a get-out clause from putting their own robust systems in place; and, thirdly, that it will be too expensive.
The expert group of which I was a member looked very carefully at each of these questions and, after taking evidence from all around the globe, it concluded that the system need not be overwhelmed if it had the power to set clear priorities. In the case of Ireland, those priorities were complaints that might result in real-world violence and complaints from or on behalf of children. The expert group also determined that the individual complaints system should be
“afforded the discretion to handle and conclude complaints in the manner it deems most appropriate and is not unduly compelled toward or statutorily proscribed to certain courses of action in the Bill”.
For example, there was a lot of discussion on whether it could decide not to deal with copycat letters, treat multiple complaints on the same or similar issue as one, and so on.
Also, from evidence submitted during our deliberations, it became clear that many complainants have little idea of the law and that many complaints should be referred to other authorities, so among the accepted recommendations was that the individual complaints system should be
“provided with a robust legal basis for transferring or copying complaints to other bodies as part of the triage process”—
for example, to the data regulator, police, social services and other public bodies. The expert group concluded that this would actually result in better enforcement and compliance in the ecosystem overall.
On the point that the individual complaints mechanism may have the unintended consequence of making regulated services lazy, the expert group—which, incidentally, comprised a broad range of specialisms, including ombudsmen, regulators and legal counsel—concluded that it was important for the regulator to set a stringent reporting and redress code of practice for regulated companies, so that it was not possible for any company simply to sit back until people were so fed up that they went to the complaints body. The expert group specifically said in its report that it
“is acutely aware of the risk of … the Media Commission … drawing criticism for the failings of the regulated entities to adequately comply with systemic rules. In this regard, an individual complaints mechanism should not be viewed as a replacement for the online platforms’ complaint handling processes”.
Indeed, the group felt that an individual complaints system complemented the powers given to the regulator, which could and should take enforcement against those companies that persistently fail to introduce an adequate complaints system—not least because the flow of complaints would act as an early warning system of emerging harms, which is of course one of the regulator’s duties under the Bill.
When replying to a question from the noble Lord, Lord Knight of Weymouth, last week about funding digital literacy, the Minister made it clear that the online safety regime would be self-financing via the levy. In that case, it does not seem disproportionate to have a focused and lean system in which the urgent, the vulnerable and the poorly served have somewhere to turn.
The expert group’s recommendation was accepted in full by Ireland’s Minister for Media, Culture and Tourism, Catherine Martin, who said she would
“always take the side of the most vulnerable”
and the complaint system would deal with people who had
“exhausted the complaints handling procedures by any online services”.
I have had the pleasure of talking to its new leadership in recent weeks, and it is expected to be open for business in 2024.
I set that out at length just to prove that it is possible. It was one of the strong recommendations of the pre-legislative committee, and had considerable support in the other place, as we have heard. I think both Ofcom and DSIT should be aware that many media outlets have not yet clocked that this complicated Bill is so insular that the users of tech have no place to go and no voice.
While the Bill can be pushed through without a complaints system, this leaves it vulnerable. It would take only one incident, or a sudden copycat rush of horrors ignored or trivialised by the sector, with complainants finding themselves nowhere to go but the press, to undermine confidence in the whole regulatory edifice.
My Lords, I had to miss a few sessions of the Committee but I am now back until the end. I remind fellow Members of my interests: I worked for one of the largest platforms for a decade, but I have no current interests. It is all in the register if people care to look. I want to contribute to this debate on the basis of that experience of having worked inside the platforms.
I start by agreeing with the noble Baroness, Lady Kidron, the noble Lord, Lord Stevenson, and my noble friend Lord Clement-Jones. The thrust of their amendments—the idea that something will be needed here—is entirely correct. We have created in the Online Safety Bill a mechanism that we in this Committee know is intended primarily to focus on systems and how Ofcom regulates them, but what the public out there hear is that we are creating a mechanism that will meet their concerns—and their concerns will not end with systems. As the noble Baroness, Lady Newlove, eloquently described, their concerns in some instances will be about specific cases and the question will be: who will take those up?
If there is no other mechanism and no way to signpost people to a place where they can seek redress, they will come to Ofcom. That is something we do not want. We want Ofcom to be focused on the big-ticket items of dealing with systems, not bogged down in dealing with thousands of individual complaints. So we can anticipate a situation in which we will need someone to be able to deal with those individual complaints.
I want to focus on making that workable, because the volume challenge might not be as people expect. I have seen from having worked on the inside that there is a vast funnel of reports, where people report content to platforms. Most of those reports are spurious or vexatious; that is the reality. Platforms have made their reporting systems easy, as we want them to do —indeed, in the Bill we say, “Make sure you have really easy-to-use reporting systems”—but one feature of that is that people will use them simply to express a view. Over the last couple of weeks, all the platforms will have been inundated with literally millions of reports about Turkish politicians. These will come from the supporters of either side, reporting people on the other side—claiming that they are engaged in hate speech or pornography or whatever. They will use whatever tool they can. That is what we used to see day in, day out: football teams or political groups that report each other. The challenge is to separate out the signal—the genuinely serious reports of where something is going wrong—from the vast amount of noise, of people simply using the reporting system because they can. For the ombudsman, the challenge will be that signal question.
Breaking that down, from the vast funnel of complaints coming in, we have a smaller subset that are actionable. Some of those will be substantive, real complaints, where the individual simply disagrees with the decision. That could be primarily for two reasons. The first is that the platform has made a bad decision and failed to enforce its own policies. For example, you reported something as being pornographic, and it obviously was, but the operator was having a bad day—they were tired, it was late in the day and they pressed “Leave up” instead of “Take down”. That happens on a regular basis, and 1% of errors like that across a huge volume means a lot of mistakes being made. Those kinds of issues, where there is a simple operator error, should get picked up by the platforms’ own appeal mechanisms. That is what they are there for, and the Bill rightly points to that. A second reviewer should look at it. Hopefully they are a bit fresher, understand that a mistake was made and can simply reverse it. Those operator error reports can be dealt with internally.
The second type would be where the platform enforces policies correctly but, from the complainant’s point of view, the policies are wrong. It may be a more pro-free speech platform where the person says, “This is hate speech”, but the platform says, “Well, according to our rules, it is not. Under our terms of service, we permit robust speech of this kind. Another platform might not, but we do”. In that case, the complainant is still unhappy but the platform has done nothing wrong—unless the policies the platform is enforcing are out of step with the requirements under the Online Safety Bill, in which case the complaint should properly come to Ofcom. Based on the individual complaint, a complainant may have something material for Ofcom. They are saying that they believe the platform’s policies and systems are not in line with the guidance issued by Ofcom—whether on hate speech, pornography or anything else. That second category of complaint would come to Ofcom.
The third class concerns the kind of complaint that the noble Baroness, Lady Newlove, described. In some ways, this is the hardest. The platform has correctly enforced its policies but, in a particular case, the effect is deeply unfair, problematic and harmful for an individual. The platform simply says, “Look, we enforced the policies. They are there. This piece of content did not violate them”. Any outsider looking at it would say, “There is an injustice here. We can clearly see that an individual is being harmed. A similar piece of content might not be harmful to another individual, but to this individual it is”. In those circumstances, groups such as the South West Grid for Learning, with which I work frequently, perform an invaluable task. We should recognise that there is a network of non-governmental organisations in the United Kingdom that do this day in, day out. Groups such as the Internet Watch Foundation and many others have fantastic relations and connections with the platforms and regularly bring exceptional cases to them.
We are glad to have the noble Lord back. I want also to put on the record that the South West Grid for Learning is very supportive of this amendment.
It has let me know as well. In a way, the amendment seeks to formalise what is already an informal mechanism. I was minded initially to support Amendment 56 in the name of my noble friend Lord Clement-Jones and the noble Lord, Lord Stevenson.
This landscape is quite varied. We have to create some kind of outlet, as the noble Baroness, Lady Kidron, rightly said. That parent or individual will want to go somewhere, so we have to send them somewhere. We want that somewhere to be effective, not to get bogged down in spurious and vexatious complaints. We want it to have a high signal-to-noise ratio—to pull out the important complaints and get them to the platforms. That will vary from platform to platform. In some ways, we want to empower Ofcom to look at what is and is not working and to be able to say, “Platform A has built up an incredible set of mechanisms. It’s doing a good job. We’re not seeing things falling through the cracks in the same way as we are seeing with platform B. We are going to have to be more directive with platform B”. That very much depends on the information coming in and on how well the platforms are doing their job already.
I hope that the Government are thinking about how these individual complaints will be dealt with and about the demand that will be created by the Bill. How can we have effective mechanisms for people in the United Kingdom who genuinely have hard cases and have tried, but where there is no intermediary for the platform they are worried about? In many cases, I suspect that these will be newer or smaller platforms that have arrived on the scene and do not have established relationships. Where are these people to go? Who will help them, particularly in cases where the platform may not systemically be doing anything wrong? Its policies are correct and it is enforcing them correctly, but any jury of peers would say that an injustice is being done. Either an exception needs to be made or there needs to be a second look at that specific case. We are not asking Ofcom to do this in the rest of the legislation.
Following on from my friend the noble Lord, Lord Russell, can I just say to the Minister that I would really welcome all of us having a meeting? As I listen to this, I am thinking that three to five years is just horrific for the families. This Bill has gone on for so long to get to where we are today. We are losing sight of humanity here and of the moral compass of protecting human lives. For whichever Government is in place in three to five years to decide that it does not work would be absolutely shameful. Nobody in the Government will be accountable, and yet for that family, that single person may commit suicide. We have met the bereaved families, so I say to the Minister that we need to get round the table and look at this again. I do not think it is acceptable to talk of this timeline, this review, for the Secretary of State when we are dealing with young lives. It is in the public interest to get this Bill correct as it navigates its way back to the House of Commons in a far better state than that in which it arrived.
I would love the noble Viscount to answer my very specific question about who the Government think families should turn to when they have exhausted the complaints system in the next three to five years. I say that as someone who has witnessed successive Secretaries of State promising families that this Bill would sort this out. Yes?
I stress again that the period in question is two years not three.
It is between two and five years: it can be two; it can be five. I am very happy to meet my noble friend and to carry on doing so. The complaints procedure set up for families is first to approach the service provider, whose duties are enforceable, and, should the provider fail to meet those enforceable duties, then to revert to Ofcom before the courts.
I am sorry but that is exactly the issue at stake. The understanding of the Committee currently is that there is then nowhere to go if they have exhausted that process. I believe that complainants are not entitled to go to Ofcom in the way that the noble Viscount just suggested.
Considerably more rights are provided than they have today, with the service provider. Indeed, Ofcom would not necessarily deal with individual complaints—
I have offered a meeting; I am very happy to host the meeting to bottom out these complaints.
I understand that the Minister has been given a sticky wicket of defending the indefensible. I welcome a meeting, as I think the whole Committee does, but it would be very helpful to hear the Government say that they have chosen to give individuals no recourse under the Bill—that this is the current situation, as it stands, and that there is no concession on the matter. I have been in meetings with people who have been promised such things, so it is really important, from now on in Committee, that we actually state at the Dispatch Box what the situation is. I spent quite a lot of the weekend reading circular arguments, and we now need to get to an understanding of what the situation is. We can then decide, as a Committee, what we do in relation to that.
As I said, I am very happy to hold the meeting. We are giving users greater protection through the Bill, and, as agreed, we can discuss individual routes to recourse.
I hope that, on the basis of what I have said and the future meeting, noble Lords have some reassurance that the Bill’s complaint mechanisms will, eventually, be effective and proportionate, and feel able not to press their amendments.
Yes, that would be a sensible way to view it. We will work on that and allow noble Lords to see it before they come to talk to us about it.
I put on record that the withdrawal of Part 3 of the Digital Economy Act 2017 will be greeted with happiness only should the full schedule of AV and harms be put into the Bill. I must say that because the noble Baroness, Lady Benjamin, is not in her place. She worked very hard for that piece of legislation.
My Lords, I thank the Minister for his response. I take it as a win that we have been offered a meeting and further discussion, and that the noble Lord, Lord Foulkes, agreed with every word I said. I hope we can continue in this happy vein during my time in this House.
The suggestion from the noble Lord, Lord Stevenson, of a table is a welcome one. Something that has interested me is that some of the offences the Minister mentioned were open goals: there were gaps where conduct was an offence in Northern Ireland but not in England and Wales, or the other way round. For example, epilepsy trolling is already a criminal offence in Scotland, but I am not sure that was appreciated when we started this discussion.
I look forward to the meeting and I thank the Minister for his response. I am still unconvinced that we have the right consultation process for a devolved Administration wanting to apply for one of its offences to be included under this regime through subordinate legislation.
It concerns me that the Minister talked about leaving it to Ofcom to request the data that it deemed appropriate. The feeling on the ground is that Ofcom, which is based in London, may not understand what is or is not appropriate in the devolved Administrations. The fact that in other legislation—for example, on broadcasting—it is mandated that data is broken down nation by nation is really important. It is even more important because of the interplay between devolved and reserved matters. The fact that there is no equivalent Minister in the Scottish Government with whom to discuss digital and online safety matters means that a whole raft of different people will need to build relationships with Ofcom who have not had them hitherto.
I thank the Minister. On that note, I withdraw my amendment.
My Lords, I also support the amendments in the name of my noble friend Lady Finlay. I want to address a couple of issues raised by the noble Lord, Lord Allan. He made a fantastic case for adequate redress systems, both at platform level and at independent complaint level, to really make sure that, at the edge of all the decisions we make, there is sufficient discussion about where that edge lies.
The real issue is not so much the individuals who are in recovery and seeking to show leadership but those who are sent down the vortex of self-harm and suicide material that comes in its scores—in its hundreds and thousands—and completely overwhelms them. We must not, in trying to get the edge case right, fail to deal with the issue at hand.
There is absolutely not enough signposting. I have seen at first hand—I will not go through it again; I have told the Committee already—companies justifying material that it was inconceivable to justify as being a cry for help. A child with cuts and blood running down their body is not a cry for help; that is self-harm material.
From experience, I think it is true that companies get defensive and seek to defend the indefensible on occasion. I agree with the noble Baroness on that, but I will balance it a little as I also work with people who were agonising over not wanting to make a bad situation worse. They were genuinely struggling and seeking to do the right thing. That is where the experts come in. If someone would say to them, “Look, take this stuff down; that is always better”, it would make their lives easier. If they said, “Please leave it up”, they could follow that advice. Again, that would make their lives easier. On the excuses, I agree that sometimes they are defending the indefensible, but also there are people agonising over the right thing to do and we should help them.
I absolutely agree. Of course, good law is a good system, not a good person.
I turn to the comments that I was going to make. Uncharacteristically, I am a little confused about this issue and I would love the Minister’s help. My understanding, on reading the Bill very closely, is that self-harm and suicide content that meets a legal definition will be subject to the priority illegal content duties. In the case of children, we can safely anticipate that content of this kind will be named primary priority content. Additionally, if such content is against the terms of service of a regulated company, that company can be held to those terms. Category 1 services will have to provide a user empowerment tool so that such content can be toggled off if an adult user wishes. That is my understanding of where this content is already dealt with in the Bill. To my mind, this leaves the following ways in which suicide and self-harm material, which is the subject of this group of amendments, is not covered by the Bill. That is what I would like the Minister to confirm, and I absolutely stand ready to be corrected.
In the case of adults, if self-harm and suicide material does not meet the bar of illegal content and the service is not category 1, there is no mechanism to toggle it off. Ofcom has no power to require a service to provide tools that toggle self-harm and suicide material off by default. This means that self-harm and suicide material can be as prevalent as a service likes—pushed, promoted and recommended, as I have just explained—provided that it is not contrary to the terms of service and does not reach the bar of illegal content.
Search services are not subject to these clauses—I am unsure about that. In the case of both children and adults, if self-harm and suicide material is on blogs or services with limited functionality, it is out of scope of the Bill and there is absolutely nothing Ofcom can do. For non-category 1 services—the majority of services which claim that an insignificant number of children access their site and thus that they do not have to comply with the child safety duties—there are no protections for a child against this content.
I put it like that because I believe that each of the statements I just made could have been fixed by amendments already discussed during the past six days in Committee. We are currently planning to leave many children without the protection of the safety duties, to leave vulnerable adults without even the cover of default protections against material that has absolutely no public interest and to leave companies to decide whether to promote or use this material to fuel user engagement—even if it costs well-being and lives.
I ask the Minister to let me know if I have misunderstood, but I think it is really quite useful to see what is left once the protections are in place, rather than always concentrating on the protections themselves.
My Lords, I support the noble Baroness, Lady Finlay of Llandaff, in her Amendment 96 and the others in this group. The internet is fuelling an epidemic of self-harm, often leading to suicide among young people. Thanks to the noble Baroness, Lady Kidron, I have listened to many grieving families explaining the impact that social media had on their beloved children. Content that provides detailed instructions for methods of suicide, or challenges or pacts that seek agreement to undertake mutual acts of suicide or deliberate self-injury, must be curtailed, or platforms must be made to warn and protect vulnerable adults.
I recognise that the Government acknowledge the problem and have attempted to tackle it in the Bill with the new offence of encouraging or assisting serious self-harm and suicide and by listing it as priority illegal content. But I agree with charities such as Samaritans, which says that the Government are taking only a partial approach by not accepting this group of amendments. Samaritans considers that the types of suicide and self-harm content that are legal but unequivocally harmful include: information, depictions, instructions and advice on methods of self-harm or suicide; content that portrays self-harm and suicide as positive or desirable; and graphic descriptions or depictions of self-harm and suicide.
With the removal of regulation of legal but harmful content, much suicide and self-harm content can remain easily available, and platforms will not even need to consider the risk that such content could pose to adult users. These amendments aim to ensure that harmful self-harm and suicide content is addressed across all platforms and search services, regardless of their functionality or reach, and, importantly, for all persons regardless of age.
In 2017 an inquiry into suicides of young people found suicide-related internet use in 26% of deaths in under-20s and 13% of deaths in 20 to 24 year-olds. Three-quarters of people who took part in Samaritans’ research with Swansea University said that they had harmed themselves more severely after viewing self-harm content online, as the noble Baroness, Lady Finlay, pointed out. People of all ages can be susceptible to harm from this dangerous content. There is shocking evidence that between 2011 and 2015, 151 patients who died by suicide were known to have visited websites that encouraged suicide or shared information about methods of harm, and 82% of those patients were over 25.
Suicide is complex and rarely caused by one thing. However, there is strong evidence of associations between financial difficulties, mental health and suicide. People on the lowest incomes have a higher suicide risk than those who are wealthier, and people on lower incomes are also the most affected by rising prices and other types of financial hardship. In January and February this year the Samaritans saw the highest percentage of first-time phone callers concerned about finance or unemployment—almost one in 10 calls for help in February. With the cost of living crisis and growing pressure on adults to cope with stress, it is imperative that the Government urgently bring in these amendments to help protect all ages from harmful suicide and self-harm content by putting a duty on providers of user-to-user services to properly manage such content.
A more comprehensive online safety regime for all ages will also increase protections for children, as research has shown that age verification and restrictions across social media and online platforms are easily bypassed by them. As the Bill currently stands, there is a two-tier approach to safety which can still mean that children circumvent safety controls and find this harmful suicide and self-harm content.
Finally, user empowerment duties that we debated earlier are no substitute for regulation of access to dangerous suicide and self-harm online content through the law that these amendments seek to achieve.
My initial response is, yes, I think so, but it is the role of Ofcom to look at whether those terms of service are enforced and to act on behalf of internet users. The noble Lord is right to point to the complexity of some marginal cases with which companies have to deal, but the whole framework of the Bill is to make sure that terms of service are being enforced. If they are not, people can turn to Ofcom.
I am sorry to enter the fray again on complaints, but how will anyone know that they have failed in this way if there is no complaints system?
I refer to the meeting my noble friend Lord Camrose offered; we will be able to go through and unpick the issues raised in that group of amendments, rather than looping back to that debate now.
I thank the noble Lord for the advance notice to think about that; it is helpful. It is difficult to talk in general terms about this issue, so, if I can, I will give examples that do, and do not, meet the threshold.
The Bill goes even further for children than it does for adults. In addition to the protections from illegal material, the Government have indicated, as I said, that we plan to designate content promoting suicide, self-harm or eating disorders as categories of primary priority content. That means that providers will need to put in place systems designed to prevent children of any age encountering this type of content. Providers will also need specifically to assess the risk of children encountering it. Platforms will no longer be able to recommend such material to children through harmful algorithms. If they do, Ofcom will hold them accountable and will take enforcement action if they break their promises.
It is right that the Bill takes a different approach for children than for adults, but it does not mean that the Bill does not recognise that young adults are at risk or that it does not have protections for them. My noble friend Lady Morgan was right to raise the issue of young adults once they turn 18. The triple shield of protection in the Bill will significantly improve the status quo by protecting adults, including young adults, from illegal suicide content and legal suicide or self-harm content that is prohibited in major platforms’ terms and conditions. Platforms also have strong commercial incentives, as we discussed in previous groups, to address harmful content that the majority of their users do not want to see, such as legal suicide, eating disorder or self-harm content. That is why they currently claim to prohibit it in their terms and conditions, and why we want to make sure that those terms and conditions are transparently and accountably enforced. So, while I sympathise with the intention from the noble Baroness, Lady Finlay, her amendments raise some wider concerns about mandating how providers should deal with legal material, which would interfere with the careful balance the Bill seeks to strike in ensuring that users are safer online without compromising their right to free expression.
The noble Baroness’s Amendment 240, alongside Amendment 225 in the name of the noble Lord, Lord Stevenson, would place new duties on Ofcom in relation to suicide and self-harm content. The Bill already has provisions to provide Ofcom with broad and effective information-gathering powers to understand how this content affects users and how providers are dealing with it. For example, under Clause 147, Ofcom can already publish reports about suicide and self-harm content, and Clauses 68 and 69 empower Ofcom to require the largest providers to publish annual transparency reports.
Ofcom may require those reports to include information on the systems and processes that providers use to deal with illegal suicide or self-harm content, with content that is harmful to children, or with content which providers’ own terms of service prohibit. Those measures sit alongside Ofcom’s extensive information-gathering powers. It will have the ability to access the information it needs to understand how companies are fulfilling their duties, particularly in taking action against this type of content. Furthermore, the Bill is designed to provide Ofcom with the flexibility it needs to respond to harms—including in the areas of suicide, self-harm and eating disorders—as they develop over time, in the way that the noble Baroness envisaged in her remarks about the metaverse and new emerging threats. So we are confident that these provisions will enable Ofcom to assess this type of content and ensure that platforms deal with it appropriately. I hope that this has provided sufficient reassurance to the noble Baroness for her not to move her amendment.
I asked a number of questions on specific scenarios. If the Minister cannot answer them straight away, perhaps he could write to me. They all rather called for “yes/no” answers.
The noble Baroness threw me off with her subsequent question. She was broadly right, but I will write to her once I have refreshed my memory of what she said by looking at the Official Report.
My Lords, I have added my name to Amendments 97 and 304, and I wholeheartedly agree with all that the noble Baroness, Lady Morgan, said by means of her excellent introduction. I look forward to hearing what the noble Baroness, Lady Kidron, has to say as she continues to bring her wisdom to the Bill.
Let me say from the outset, if it has not been said strongly enough already, that violence against women and girls is an abomination. If we allow a culture of intimidation and misogyny to exist online, it will spill over to offline experiences. According to research by Refuge, almost one in five domestic abuse survivors who experienced abuse or harassment from their partner or former partner via social media said they felt afraid of being attacked or being subjected to physical violence as a result. Some 15% felt that their physical safety was more at risk, and 5% felt more at risk of so-called honour-based violence. Shockingly, according to Amnesty International, 41% of women who experienced online abuse or harassment said that these experiences made them feel that their physical safety was threatened.
Throughout all our debates, I hesitate to differentiate between the real and virtual worlds, because that is simply not how we live our lives. Interactions online are informed by face-to-face interactions, and vice versa. To think otherwise is to misunderstand the lived experience of the majority—particularly, dare I say, the younger generations. As Anglican Bishop for HM Prisons, I recognise the complexity of people’s lives and the need to tackle attitudes underpinning behaviours. Tackling the root causes of offending should always be a priority; there is potential for much harm later down the line if we ignore warning signs of hatred and misogyny. Research conducted by Refuge found that one in three women has experienced online abuse or harassment perpetrated on social media or another online platform at some point in their lives. That figure rises to almost two in three, or 62%, among young women. This must change.
We did some important work in your Lordships’ House during the passage of the Domestic Abuse Act to ensure that all people, including women and girls, are safe on our streets and in their homes. As has been said, introducing a code of practice as outlined will help the Government meet their aim of making the UK the safest place in the world to be online, and it will align with the Government’s wider priority to tackle violence against women and girls as a strategic policing requirement. Other strategic policing requirements, including terrorism and child sexual exploitation, have online codes of practice, so surely it follows that there should be one for VAWG to ensure that the Bill aligns with the Government’s position elsewhere and that there is not a gap left online.
I know the Government care deeply about tackling violence against women and girls, and I believe they have listened to some concerns raised by the sector. The inclusion of the domestic abuse and victims’ commissioners as statutory consultees is welcomed, as is the Government’s amendment to recognise controlling and coercive behaviour as a priority offence. However, without this code of conduct, the Bill will fail to address duties of care in relation to preventing domestic abuse and violence against women and girls in a holistic and encompassing way. The onus should not be on women and girls to remove themselves from online spaces; we have seen plenty of that in physical spaces over the years. Women and girls must be free to appropriately express themselves online and offline without fear of harassment. We must do all we can to prevent expressions of misogyny from transforming into violent actions.
My Lords, I have added my name to Amendments 97 and 304, and I support the others in this group. It seems to be a singular failure of any version of an Online Safety Bill if it does not set itself the task of tackling known harms—harms that are experienced daily and for which we have a phenomenal amount of evidence. I will not repeat the statistics given in the excellent speeches made by the noble Baroness, Lady Morgan, and the right reverend Prelate, but will instead add two observations.
Online Safety Bill Debate
(1 year, 7 months ago)
Lords Chamber
I shall speak briefly to Amendments 220E and 226. On Amendment 220E, I say simply that nothing should be left to chance on IWF. No warm words or good intentions replace the requirement for its work to be seamlessly and formally integrated into the OSB regime. I put on record the extraordinary debt that every one of us owes to those who work on the front line of child sexual abuse. I know from my own work how the images linger. We should all do all that we can to support those who spend every day chasing down predators and finding and supporting victims and survivors. I very much hope that, in his response, the Minister will agree to sit down with the IWF, colleagues from Ofcom and the noble Lords who tabled the amendment and commit to finding a language that will give the IWF the reassurance it craves.
More generally, I raise the issue of why the Government did not accept the pre-legislative committee’s recommendation that the Bill provide a framework for how bodies will work together, including when and how they will share powers, take joint action and conduct joint investigations. I have a lot of sympathy with the Digital Regulation Co-operation Forum in its desire to remain an informal body, but that is quite different from the formal power to share sensitive data and undertake joint action or investigation.
If history repeats itself, enforcing the law will take many years and very likely will cost a great deal of money and require expertise that it makes no sense for Ofcom to reproduce. It seems obvious that it should have the power to co-designate efficiently and effectively. I was listening to the Minister when he set out his amendment, and he went through the process that Ofcom has, but it did not seem to quite meet the “efficiently and effectively” model. I should be interested to know why there is not more emphasis on co-regulation in general and the sharing of powers in particular.
In the spirit of the evening, I turn to Amendment 226 and make some comments before the noble Baroness, Lady Merron, has outlined the amendment, so I beg her indulgence on that. I want to support and credit the NSPCC for its work in gathering the entire child rights community behind it. Selfishly, I have my own early warning system, in the form of the 5Rights youth advisory group, made up of the GYG—gifted young generation—from Gravesend. It tells us frequently exactly what it does not like and does like about the online world. More importantly, it reveals very early on in our interactions the features or language associated with emerging harms.
Because of the lateness of the hour, I will not give your Lordships all the quotes, but capturing and reflecting children’s insight and voices is a key part of future-proofing. It allows us to anticipate new harms. Where new features pop up that are having a positive or negative impact, asking user groups how they themselves are experiencing those features and that language is standard practice across all consumer groups, so, if this is a children’s Bill, why are children not included in this way?
In the work that I do with companies, they often ask what emerging trends we are seeing. For example, they say that they will accept any additions to the list of search words that can lead to self-harm content, or they ask, “What do we know about the emoji language that is happening now that was not happening last week?” I am always surprised at their surprise when we say that a particular feature is causing anxiety for children. Rather than being hostile, their response is almost always, “I have never thought about it that way before”. That is the value of consulting your consumer—in this case, children.
I acknowledge what the Minister said and I welcome the statutory consultees—the Children’s Commissioner, the Victims’ Commissioner and so on. It is a very welcome addition, but this role is narrowly focused on the codes of practice at the very start of the regulatory cycle, rather than the regulatory system as a whole. It does not include the wider experience of those organisations that deal with children in real time, such as South West Grid for Learning or the NSPCC, or the research work done by 5Rights, academics across the university sector or research partners such as Revealing Reality—ongoing, real-time information and understanding of children’s perspectives on their experience.
Likewise, super-complaints and Ofcom’s enforcement powers are what happen after harms take place. I believe that we are all united in thinking that the real objective of the exercise is to prevent harm. That means including children’s voices not only because it is their right but because, so often in my experience, they know exactly what needs to happen, if only we would listen.
My Lords, I speak mainly to support Amendment 220E, to which I have added my name. I am also delighted to support government Amendment 98A and I entirely agree with the statutory consultees listed there. I will make a brief contribution to support the noble Lord, Lord Clement-Jones, who introduced Amendment 220E. I thank the chief executive at Ofcom for the discussions that we have had on the designation and the Minister for the reply he sent me on this issue.
I have a slight feeling that we are dancing on the head of a pin a little, as we know that we have an absolutely world-leading organisation in the form of the Internet Watch Foundation. It plays an internationally respected role in tackling child sexual abuse. We should be, and I think we are, very proud to have it in the United Kingdom, and the Government want to enhance and further build on the best practice that we have seen. As we have already heard and all know, this Bill has been a very long time in coming, and organisations such as the Internet Watch Foundation, which, given their expertise and the good work they have done already, are pretty certain candidates, should be designated.
However, without knowing that and without having a strong steer of support from the Minister, it becomes harder for them to operate, as they are in a vacuum. Things such as funding and partnership working become harder and harder, as well, which is what I mean by dancing on the head of a pin—unless the Minister says something about another organisation.
The IWF was founded in 1996, when 18% of the world’s known child sexual abuse material was hosted in the UK. Today that figure is less than 1% and has been since 2003, thanks to the work of the IWF’s analysts and the partnership approach the IWF takes. We should say thank you to those who are at the front line of the grimmest material imaginable and who do this to keep our internet safe.
I mentioned, in the previous group, the IWF’s research on girls. It says that it has seen more girls appearing in this type of imagery. Girls now appear in 96% of the imagery it removes from the internet, up almost 30 percentage points from a decade ago. That is another good reason why we want the internet and online to be a safe place for women and girls. As I say, any delay in establishing the role and responsibility of an expert organisation such as the IWF in working with Ofcom risks leaving a vacuum in which the risk is to children. That is really the ultimate thing; if there is a vacuum left and the IWF is not certain about its position, then what happens is that the children who are harmed most by this awful material are the ones who are not being protected. I do not think that is what anybody wants to see, however much we might argue about whether an order should be passed by Parliament or by Ofcom.
Online Safety Bill Debate
(1 year, 7 months ago)
Lords Chamber
My Lords, I rise very briefly to support the amendments in the name of the noble Baroness, Lady Stowell, and the noble Lord, Lord Stevenson. Like other speakers, I put on record my support for the regulator being offered independence and Parliament having a role.
However, I want to say one very brief and minor thing about timing—I feel somewhat embarrassed after the big vision of the noble Baroness, Lady Stowell. Having had quite a lot of experience of code-making over the last three years, I have seen the amount of time that the department is able to take in responding to the regulator used as a point of power, a point of lobbying, as others have said, and a point of huge distraction. Those of us who have followed the Bill for five years, and through as many Secretaries of State, should be concerned that none of the amendments has quite tackled the question of time.
The idea of acting within a timeframe is not without precedent; the National Security and Investment Act 2021 is just one recent example. What was interesting about that Act was that the reason given for the Secretary of State’s powers being necessary was national security—that is, a reason we all agree should carry weight—but the reason for the time restriction was business stability. I put it to the Committee that the real prospect of children and other users being harmed requires the same consideration as business stability. Without a time limit, it is possible for inaction to be used to control the process or simply to fritter it away.
My Lords, I will make a short contribution on this substantive question of whether concerns about ministerial overreach are legitimate. Based on a decade of being on the receiving end of representations from Ministers, the short answer is yes. I want to expand on that with some examples.
My experience of working on the other side, inside a company, was that you often got what I call the cycle of outrage: something is shared on social media that upsets people; the media write a front-page story about it; government Ministers and other politicians get involved; that then feeds back into the media and the cycle spins up to a point where something must be done. The “something” is typically that the Minister summons people, such as me in my old job, and brings them into an office. That itself often becomes a major TV moment, where you are brought in, browbeaten and sent out again with your tail between your legs, and the Minister has instructed you to do something. That entire process takes place in the political rather than the regulatory domain.
I readily concede that, in many cases, something of substance needed to be addressed and there was a genuine problem. It is not that this was illegitimate, but these amendments are talking about the process for what we should do when that outrage is happening. I agree entirely with the tablers of the amendments that, to the extent that that process can be encapsulated within the regulator rather than a Minister acting on an ad hoc basis, it would be a significant improvement.
I also note that this is certainly not UK-specific, and it would happen in many countries with varying degrees of threat. I remember being summoned to the Ministry of the Interior in Italy to meet a gentleman who has now sadly passed. He brought me into his office, sat me down, pointed to his desk and said “You see that desk? That was Mussolini’s desk”. He was a nice guy and I left with a CD of his rhythm and blues band, but it was clear that I was not supposed to say no to him. He made a very clear and explicit political direction about content that was on the platform.
One big advantage of this Bill is that it has the potential to move beyond that world. It could move us on from individual people in companies—the noble Baroness, Lady Stowell of Beeston, made this point very powerfully—and change the accountability model, away from platforms either being entirely accountable to themselves or doing deals with others, including Ministers, that will have an impact, as the noble Baroness, Lady Fox, and the noble Viscount, Lord Colville, said, on the freedom of expression of people across the country. We do not want that.
We want to move on in the Bill and I think we have a model which could work. The regulator will take on the outrage and go as far as it can under the powers granted in the Bill. If the regulator believes that it has insufficient powers, it will come back to Parliament and ask for more. That is the way in which the system can and should work. I think I referred to this at Second Reading; we have an opportunity to create clear accountability. Parliament instructs Ofcom, which instructs the platforms. The platforms do what Ofcom says, or Ofcom can sanction them. If Ofcom feels that its powers are deficient, it comes back to Parliament. The noble Lord, Lord Stevenson, and others made the point about scrutiny and us continually testing whether Ofcom has the powers and is exercising them correctly. Again, that is entirely beneficial and the Government should certainly be minded to accept those amendments.
With the Secretary of State powers, as drafted in the Bill and without the amendments we are considering today, we are effectively taking two steps forward and one step back on transparency and accountability. We have to ask: why take that step back when we are able to rely on Ofcom to do the job without these directions?
The noble Baroness, Lady Stowell of Beeston, made the point very clearly that there are other ways of doing this. The Secretary of State can express their view. I am sure that the Minister will be arguing that the Secretary of State’s powers in the Bill are better than the status quo because at least what the Secretary of State says will be visible; it will not be a back-room deal. The noble Baroness, Lady Stowell of Beeston, has proposed a very good alternative, where the Secretary of State makes visible their intentions, but not in the form of an order—rather in the form of advice. The public—it is their speech we are talking about—then have the ability to see whether they agree with Ofcom, the companies or the Secretary of State if there is any dispute about what should happen.
It is certainly the case that visible instructions from the Secretary of State would be better, but the powers as they are still leave room for arm-twisting. I can imagine a future scenario in which future employees of these platforms are summoned to the Secretary of State. But now the Secretary of State would have a draft order sitting there. The draft order is Mussolini’s desk. They say to the people from the platforms, “Look, you can do what I say, or I am going to send an order to Ofcom”. That takes us back to this world in which the public are not seeing the kind of instructions being given.
I hope that the Government will accept that some amendment is needed here. All the ones that have been proposed suggest different ways of achieving the same objective. We are trying to protect future Secretaries of State from an unhealthy temptation to intervene in ways that they should not.
My Lords, it is a privilege to introduce Amendments 123A, 142, 161 and 184 in my name and those of the noble Lords, Lord Bethell and Lord Stevenson, and the right reverend Prelate the Bishop of Oxford. These amendments represent the very best of your Lordships’ House and, indeed, the very best of Parliament and the third sector because they represent an extraordinary effort to reach consensus between colleagues across the House including both opposition parties, many of the Government’s own Benches, a 40-plus group of Back-Bench Conservatives and the Opposition Front Bench in the other place. Importantly, they also enjoy the support of the commercial age check sector and a vast array of children’s charities and, in that regard, I must mention the work of Barnardo’s, CEASE and 5Rights, which have really led the charge.
I will spend the bulk of my time setting out in detail the amendments themselves, and I will leave my co-signatories and others to make the arguments for them. Before I do, I once again acknowledge the work of the noble Baroness, Lady Benjamin, who has been fighting this fight for many years, and the noble Baroness, Lady Harding, whose characteristic pragmatism was midwife to the drafting process. I also acknowledge the time spent talking about this issue with the Secretary of State, the noble Lord the Minister and officials at DSIT. I thank them for their time and their level of engagement.
Let me first say a few words about age assurance and age verification. Age assurance is the collective term for all forms and levels of age verification, which means an exact age, and age estimation, which is an approximate or probable age. Age assurance is not a technology; it is any system that seeks to achieve a level of certainty about the age or age range of a person. Some services with restricted products and services have no choice but to have the very highest level of assurance or certainty—others less so.
To be clear at the outset, checking someone’s age, whether by verification or estimation, is not the same as establishing identity. While it is absolutely the case that you can establish age as a subset of establishing someone’s identity, the reverse is not necessarily true. Checking someone’s age does not need to establish their identity.
Age assurance strategies are multifaceted. As the ICO’s guidance in the age-appropriate design code explains, online services can deploy a range of methods to achieve the necessary level of certainty about age or age range. For example, self-verification, parental authentication, AI estimation and/or the use of passports and other hard identifiers may all play a role in a single age assurance strategy, or any one of them may be a mechanism in itself in other circumstances. This means that the service must consider its product and make sure that the level of age assurance meets the level of risk.
Since we first started debating these issues in the context of the Digital Economy Act 2017, the technology has been transformed. Today, age assurance might just as effectively be achieved by assessing the fluidity of movement of a child dancing in a virtual reality game as by collecting their passport. The former is over 94% accurate within five seconds and is specific to that particular child, while a passport may be absolute but less reliable in associating the check with a particular child. So, in the specific context of that dancing child, it is likely that the former gives the greater assurance. When a service’s risk profile requires absolute or near absolute certainty—for example, any of the risks that are considered primary priority harms, including, but not limited to, pornography—having the highest possible level of assurance must be a precondition of access.
Age assurance can also be used to ensure that children who are old enough to use a service have an age-appropriate experience. This might mean disabling high-risk features such as hosting, livestreaming or private messaging for younger children, or targeting child users or certain age groups with additional safety, privacy and well-being interventions and information. These amendments, which I will get to shortly, are designed to ensure both. To achieve the levels of certainty and privacy which are widely and rightly demanded, the Bill must both reflect the current state of play and anticipate nascent and emerging technology that will soon be considered standard.
That was a long explanation, for which I apologise, but I hope it makes clear that there is no single approach but, rather, a need to dictate clearly a high bar of certainty for high-risk services. A mixed economy of approaches, all geared towards providing good outcomes for children, is what we should be promoting. Today we have the technology, the political will and the legislative mechanism to make good on our adult responsibilities to protect children online. While age assurance is eminently achievable, those responsible for implementing it and, even more importantly, those subject to it need clarity on standards; that is to say, rules of the road. In an era when data is a global currency, services have shown themselves unable to resist the temptation to repurpose information gleaned about the age of their users, or to facilitate children’s access to industrial quantities of harmful material for commercial gain. As with so many of tech’s practices, this has eroded trust and heightened the need for absolute clarity on how services build their age-assurance systems, on what they do—and do not do—with the information they gather, and on the efficacy and security of the judgments they make.
Amendment 125A simply underlines the point made frequently in Committee by the noble Baroness, Lady Ritchie of Downpatrick, that the Bill should make it clear that pornography should not be judged by where it is found but by the nature of the material itself. It would allow Ofcom to provide guidance on pornographic material that should be behind an age gate, either in Part 3 or Part 5.
Amendment 142 seeks to insert a new clause setting out matters that Ofcom must reflect in its guidance for effective age assurance; these are the rules of the road. Age assurance must be secure and maintain the highest levels of privacy; this is paramount. I do not believe I need to give examples of the numerous data leaks, but I note the excessive data harvesting undertaken by some of the major platforms. Age assurance must not be an excuse to collect users’ personal and sensitive information unnecessarily, and that information should not be sold, stored or used for other purposes, such as advertising, or offered to third parties.
Age assurance must be proportionate to the risk, as per the results of the child risk assessment, and let me say clearly that proportionality is not a route to allow a little bit of porn or a medium amount of self-harm, or indeed a lot of both, to a small number of children. In the proposed new clause, proportionality means that if a service is high-risk, it must have the highest levels of age assurance. Equally, if a service is low-risk or no-risk, it may be that no age assurance is necessary, or it should be unobtrusive in order to be proportionate. Age-assurance systems must provide mechanisms to challenge or change decisions, to ensure that everyone can have confidence in their use and that they do not keep individuals—adults or children—out of spaces they have the right to be in. Age assurance must be inclusive and accessible, so that children with specific accessibility needs are considered at the point of design, and it must provide meaningful information so that users can understand how it operates. I note that the point about accessibility is of specific concern to the 5Rights young advisers. Systems must be effective. It sounds foolish to say so, but look at where we are now, when laws in the US, Europe, the UK and beyond stipulate age restrictions and they are ignored to the tune of tens of millions of children.
Age assurance must not rely solely on the user to provide information; a tick box confirming “I am 18” is not sufficient for any service that carries a modicum of risk. It must be compatible with the following laws: the Data Protection Act, the Human Rights Act, the Equality Act and the UNCRC. It must have regard to the risks and opportunities of interoperable age assurance, which will in future see these systems seamlessly integrated into our services, just as opening your phone with your face, or using two-factor authentication when transferring funds, is already normalised. In producing its guidance, Ofcom must consult the Information Commissioner and other persons with relevant technological expertise and an understanding of child development.
On that point, I am in full support of the proposal from the noble Lord, Lord Allan, to require Ofcom to produce regular reports on age-assurance technology, and see his amendment as a necessary companion piece to these amendments. Importantly, the amendment stipulates that the guidance should come forward in six months and that all systems of age assurance, whether estimated or verified, whether operated in-house or by third-party providers, and all technologies must adhere to the same principles. It allows Ofcom to point to technical standards in its guidance, which I know that the ISO and the IEEE are currently drafting with this very set of principles in mind.
My Lords, I thank everyone for their contributions this evening. As the noble Lord, Lord Stevenson, said, it is very compelling when your Lordships’ House gets itself together on a particular subject and really agrees, so I thank noble Lords very much for that.
I am going to do two things. One is to pick up on a couple of questions and, as a number of noble Lords have said, to concentrate on outcomes rather than contributions. On the issues that came up: I feel that the principle of pornography being treated in the same way in Parts 3 and 5 is absolute. We believe our amendments achieve that; after Committee, we will discuss it with noble Lords who feel it is not clear, to make sure they are comfortable that it is so. I did not quite hear in the Minister’s reply that pornography was being treated in exactly the same way in Parts 3 and 5. When I say “exactly the same way”, like the noble Lord, Lord Allan, I mean not necessarily by the same technology but to the same level of outcome. That is one thing I want to emphasise, because a number of noble Lords, including the noble Baroness, Lady Ritchie, the noble Lord, Lord Farmer, and others, are rightly concerned that we should have an outcome on pornography, not concentrate on how to get there.
The second thing I want to pick up very briefly, because it was received so warmly, is the question of devices and on-device age assurance. I believe that is one method, and I know that at least one manufacturer is thinking about it as we speak. However, it is an old battle in which companies that do not want to take responsibility for their services say that people over here should do something different. It is very important that devices, app stores or any of the supposed gatekeepers are not given an overly large responsibility. It is the responsibility of everyone to make sure that age assurance is adequate.
I hope that what the noble Baroness is alluding to is that we need to include gatekeepers, app stores, device level and sideloading in another part of the Bill.
But of course—would I dare otherwise? What I am saying is that these are not silver bullets; we must have a mixed economy, not only for what we know already but for what we do not yet know, and we must not make any one platform of age assurance overly powerful. That is incredibly important, so I wanted to pick up on that.
I also want to pick up on user behaviour and unintended consequences. I think there was a slight reference to an American law, which is called COPPA and is the reason that every website says 13. That is a very unhelpful entry point. It would be much better if children had an age-appropriate experience from five all the way to 18, rather than on and off at 13. I understand that issue, but that is why age assurance has to be more than one thing. It is not only a preventive thing but an enabling thing. I tried to make that very clear so I will not detain the Committee on that.
On the outcome, I say to the Minister, who has indeed given a great deal of time to this, that more time is needed because we want a bar of assurance. I speak not only for all noble Lords who have made clear their rightful anxiety about pornography but also on behalf of the bereaved parents and other noble Lords who raised issues about self-harming of different varieties. We must have a measurable bar for the things that the Bill says that children will not encounter—the primary priority harms. In the negotiation, that is non-negotiable.
On the time factor, I am sorry to say that we are all witness to what happened to Part 3. It was pushed and pushed for years, and then it did not happen—and then it was whipped out of the Bill last week. This is not acceptable. I am happy, as I believe other noble Lords are, to negotiate a suitable time that gives Ofcom comfort, but it must be possible, with this Bill, for a regulator to bring something in within a given period of time. I am afraid that history is our enemy on this one.
The third thing is that I accept the idea that there has to be more than principles, which is what I believe Ofcom will provide. But the principles have to be 360 degrees, and the questions that I raised about security, privacy and accessibility should be in the Bill so that Ofcom can go away and make some difficult judgments. That is its job; ours is to say what the principle is.
I will tell one last tiny story. About 10 years ago, I met in secret with one of the highest-ranking safety officers in one of the companies that we always talk about. They said to me, “We call it the ‘lost generation’. We know that regulation is coming, but we know that it is not soon enough for this generation”. On behalf of all noble Lords who spoke, I ask the Government to save the next generation. With that, I withdraw the amendment.
Online Safety Bill Debate
(1 year, 6 months ago)
Lords Chamber
My Lords, first, I want to recognise the bravery of the families of Olly, Breck, Molly, Frankie and Sophie in campaigning for the amendments we are about to discuss. I also pay tribute to Mia, Archie, Isaac, Maia and Aime, whose families I met this morning on their way to the House. It is a great privilege to stand alongside them and witness their courage and dignity in the face of unimaginable grief. On behalf of myself, my co-signatories—the noble Lords, Lord Stevenson and Lord Clement-Jones, and the noble Baroness, Lady Morgan—and the huge number of Peers and MPs who have supported these amendments, I thank them for their work and the selflessness they have shown in their determination to ensure that other families do not suffer as they have.
This group includes Amendments 198, 199, 215 and 216, which, together, would create a pathway for coroners and, by extension, families to get access to information relevant to the death of a child from technology services. The amendments would put an end to the inhumane situation whereby coroners and families in crisis are forced to battle faceless corporations to determine whether a child’s engagement with a digital service contributed to their death. Bereaved families have a right to know what happened to their children, and coroners have a duty to ensure that lessons are learned and that those who have failed in their responsibilities are held accountable.
Since the Minister is going to be the bearer of good news this afternoon, I will not take up time making detailed arguments for the amendments as they stand. I simply say that, while parents have been fighting for access to information, those same companies have continued to suggest friends, material and behaviours that drive children into places and spaces in which they are undermined, radicalised into despair and come to harm. In no other circumstance would it be acceptable to withhold relevant information from a court procedure. It is both immoral and a failure of justice if coroners cannot access and review all relevant evidence. For the families, it adds pain to heartbreak as they are unable to come to terms with what has happened because there is still so much that they do not know.
I am grateful to the Government for agreeing to bring forward on Report amendments that will go a very long way towards closing the loopholes that allow companies to refuse coroners’ demands and ignore parents’ entreaties. The Government’s approach is somewhat different from that in front of us, but it covers the same ground. These amendments are the result of the considerable efforts of Ministers and officials from DSIT and the Ministry of Justice, with the invaluable support of the right honourable Sajid Javid MP. I wish to note on the record the leadership of the Secretary of State, who is currently on leave, and the Minister here, the noble Lord, Lord Parkinson.
The Government’s amendments will create an express power for Ofcom to require information from services about a deceased child user’s online activity following the receipt of a Schedule 5 request from a coroner. This will vastly increase the reach and power of that coroner. Information that Ofcom can request from regulated companies under the Online Safety Bill is extremely wide and includes detailed data on what is recommended; the amount of time the child spent on the service when they accessed it; their user journey; what content they liked, shared, rewatched, paused and reported; and whether other users raised red flags about the child’s safety or well-being before their death.
Information notices prompted by a Schedule 5 request from a coroner will be backed by Ofcom’s full enforcement powers and will apply to all regulated companies. If a service fails to comply, it may be subject to enforcement action, including senior management liability and fines of up to £18 million or 10% of global turnover—vastly different from the maximum fine of £1,000 under the Coroners and Justice Act 2009. Moreover, these amendments will give coroners access to Ofcom’s expertise and understanding of how online services work and of online services’ safety duties to children. Also, there will be provisions empowering Ofcom to share information freely to assist coroners in their inquiries. Companies must provide a dedicated means of communication to manage requests for information from bereaved parents and provide written responses to those requests. I look forward to the Minister setting out that these will be operated by a team of experts and backed up by Ofcom in ensuring that the communication is adequate, timely and not obstructive. Importantly, if the communication is not adequate, bereaved families will be able to notify Ofcom.
There are a small number of outstanding questions. We remain concerned that only larger companies will be required to set out their policies on disclosure. Sadly, children are often coerced and nudged into smaller sites that have less robust safety mechanisms. Small is not safe. A further issue is to ensure that a coroner is able, via a Schedule 5 notice given to Ofcom, to compel senior management to appear at an inquest. This is a crucial ask of the legal community, who battled and failed to get companies to attend inquests, notably Wattpad at the Frankie Thomas inquest and Snap Inc at Molly Russell’s inquest. Can the Minister undertake to close these gaps before Report?
A number of matters sit outside the scope of the Online Safety Bill. I am particularly grateful to the Secretary of State for committing in writing to further work beyond the Bill to ensure that the UK’s approach is comprehensive and watertight. The Government will be exploring ways in which the Data Protection and Digital Information (No. 2) Bill can support and complement these provisions, including the potential for a code that requires data preservation if a parent or enforcement officer contacts a helpline or if there is constructive knowledge, such as when a death has been widely reported, even before a Schedule 5 notice has been delivered.
The Government are engaging with the Chief Coroner to provide training in order to ensure that coroners have the knowledge they need to carry out inquests where children’s engagement with online services is a possible factor in their death. I am concerned about the funding of this element of the Government’s plans and urge the Minister to indicate whether this could be part of Ofcom’s literacy duties and therefore benefit from the levy. Possibly most importantly, the Secretary of State has undertaken to approach the US Government to ensure that coroners can review private messages that fall outside the scope of this Bill in cases where a child’s death is being investigated. I am grateful to the noble Lord, Lord Allan, for his support in articulating the issue, and accept the invitation to work alongside the department to achieve this.
There are only two further things to say. First, delivery is in the drafting, and I hope that when he responds, the Minister will assure the House that we will see the proposed amendments well before Report so that we can ensure that this works as we have all agreed. Secondly, the Government are now looking very carefully at other amendments which deal with prevention of harm in one way or another. I share the gratitude of Bereaved Parents for Online Safety for the work that has gone into this set of amendments. However, we want to see safety by design; a comprehensive list of harms to children in the Bill, including harms caused or amplified by the design of a service; principles for age assurance which ensure that the systems put in place by regulated services are measurable, secure and fit for purpose; and a proper complaints service, so that children have somewhere to turn when things go wrong. What we have been promised is a radical change of status for the coroner and for the bereaved families. What we want is fewer dead children. I beg to move.
My Lords, some of the issues that we have been dealing with in this Bill are more abstract or generic harms, but here we are responding to a specific need of families in the UK who are facing the most awful of circumstances.
I want to recognise the noble Baroness, Lady Kidron, for her direct support for many of those families, and for her persistent efforts to use policy and the tools we have available to us here to improve the situation for families who, sadly, will face similar tragedies in future. I appreciate the time that she has spent with me in the spirit of finding workable solutions. It is an alliance that might seem improbable, given our respective responsibilities, which have sometimes placed us in publicly adversarial roles. However, one of the strengths of this Committee process is that it has allowed us to focus on what is important and to find that we have more in common than separates us. Nothing could be more important than the issue we are dealing with now.
I am pleased that it looks like we will be able to use this Bill to make some significant improvements in this area to address the challenges faced by those families, some of whom are here today, challenges which add to their already heart-wrenching distress. The first challenge these families face is to find someone at an online service who is willing and able to answer their questions about their loved one’s use of that platform. This question about contacts at online platforms is not limited to these cases but comes up in other areas.
As noble Lords will know, I used to work for Facebook, where I was often contacted by all sorts of Governments asking me to find the right people in companies, often smaller companies, concerning very serious issues such as terrorism. Even when the issue was the distribution of terrorist content, they would find it very challenging to get hold of anyone. There is a generic problem around getting hold of people at platforms. A real strength of the Online Safety Bill is that it will necessarily require Ofcom to develop contacts at all online services that offer user-to-user and search services to people in the UK. The Government estimate that 25,000 entities are involved. We are talking about Ofcom building a comprehensive database of pretty much any service that matters to people in the UK.
Primarily, these contacts will be safety focused, as their main responsibility will be to provide Ofcom with evidence that the service is meeting its duties of care under the Bill, so again, they will have the right people in the right companies on their database in future. Importantly, Ofcom will have a team of several hundred people, paid for by a levy on these regulated services, to manage the contacts at the right level. We can expect that, certainly for the larger services, there may be a team of several people at Ofcom dedicated to working with them, whereas for the smaller services it may be a pooled arrangement whereby one Ofcom staff member deals with a group. However, in all cases there will be someone at the regulator with a responsibility for liaising with those companies. We do not expect Ofcom to use those contacts to resolve questions raised by individuals in the UK as a matter of course, but it makes sense to make this channel available where there is a relatively small number of highly impactful cases such as we are dealing with here.
I do indeed welcome it. I do not feel I can do justice to all the speakers; I think I will cry, as I did when the noble Baroness, Lady Newlove, was speaking. I shall not do that, but I will thank all noble Lords from the bottom of my heart and will speak to just a couple of technical matters.
First, I accept the help of the noble Lord, Lord Allan, on the progress of the data protection negotiations with the US Government. That will be very helpful. I want to put on the record that there has been a lot of discussion about the privacy of other users and ensuring that it is central, particularly because other young people are in these interactions and we have to protect them, too. That is very much in our mind.
I welcome and thank the Minister. He said a couple of things, including that he hoped that what he will bring forward will rise to the expectation—so do I. The expectation is set high, and I hope that the Government rise to it. In relation to that, I note that a number of noble Lords carefully planted their expectations in Hansard. I will be giving the noble Lord a highlighter so that he can find them. I note that laying down the things she expected to see was a particular skill of the ex-Secretary of State for DCMS.
I understood “exploring” and “in our mind”; the Government have certain things in their mind. I understand the context of that because we are talking about other Bills and things that are yet to come. I want to make a statement—I do not know whether it is a promise or a threat; I rather suspect it is both. I will not rest until this entire ecosystem is sorted. This is not about winning an amendment or a concession. This is about putting it right for families and, indeed, for coroners, who are unable to do a good job under the current regime.
Finally, I echo those who have pointed out the other amendments that we are seeking on safety by design, age assurance and having the harms in the Bill. I believe I speak for Bereaved Parents for Online Safety; that is what they wish to see come from their pain. It has been the privilege of my life to deal with these parents and these families and I thank the Committee for its support. With my conditions set out, I wish to withdraw my amendment.
My Lords, I will speak briefly to Amendment 218JA, spoken to by the noble Lord, Lord Allan. My name is attached to it online but has not made it on to the printed version. He introduced it so ably and comprehensively that I will not say much more, but I will be more direct with my noble friend the Minister.
This amendment would remove Clause 133(11). The noble Lord, Lord Allan, mentioned that BT has raised with us—I am sure that others have too—that the subsection gives examples of access facilities, such as ISPs and application stores. However, as the noble Lord said, there are other ways that services could use operating systems, browsers and VPNs to evade these access restriction orders. While it is convention for me to say that I would support this amendment should it be moved at a later stage, this is one of those issues that my noble friend the Minister could take off the table this afternoon—he has had letters about it to which there have not necessarily been replies—just by saying that subsection (11) does not give the whole picture, that there are other services and that it is misleading to give just these examples. Will he clarify at the Dispatch Box and on the record, for the benefit of everyone using the Bill now and in future, what broader services are caught? We could then take the issue off the table on this 10th day of Committee.
My Lords, I will be even more direct than the noble Baroness, Lady Morgan, and seek some confirmation. I understood from our various briefings in Committee that, where content is illegal, it is illegal anywhere in the digital world—it is not restricted simply to user to user, search and Part 5. Can the Minister say whether I have understood that correctly? If I have, will he confirm that Ofcom will be able to use its disruption powers on a service out of scope, as it were, such as a blog or a game with no user-to-user aspect, if it were found to be persistently hosting illegal content?
My Lords, this has been an interesting debate, though one of two halves, if not three.
The noble Lord, Lord Bethell, introduced his amendment in a very measured way. My noble friend Lady Benjamin really regrets that she cannot be here, but she strongly supports it. I will quote her without taking her speech entirely on board, as we have been admonished for that previously. She would have said that
“credit card companies have claimed ignorance using the excuse of how could they be expected to know they are supporting porn if they were not responsible for maintaining porn websites … This is simply not acceptable”.
Noble Lords must forgive me—I could not possibly have delivered that in the way that my noble friend would have done. However, I very much took on board what the noble Lord said about how this makes breaches transparent to the credit card companies. It is a right to be informed, not an enforcement power. The noble Lord described it as a simple and proportionate measure, which I think is fair. I would very much like to hear from the Minister why, given the importance of credit card companies in the provision of pornographic content, this is not acceptable to the Government.
The second part of this group is all about effective enforcement, which the noble Lord, Lord Bethell, spoke to as well. This is quite technical; it is really important that these issues have been raised, in particular by the noble Lord. The question is whether Ofcom has the appropriate enforcement powers. I was very taken by the phrase
“pre-empt a possible legal challenge”,
as it is quite helpful to get your retaliation in first. Underlying all this is that we need to know what advice the Minister and Ofcom are getting about the enforcement powers and so on.
I am slightly more sceptical about the amendments from the noble Lord, Lord Curry. I am all in favour of the need for speed in enforcement, particularly having argued for it in competition cases, where getting ex-ante powers is always a good idea—the faster one can move, the better. However, restricting the discretion of Ofcom in those circumstances seems to me a bit over the top. Many of us have expressed our confidence in Ofcom as we have gone through the Bill. We may come back to this in future; none of us thinks the Bill will necessarily be the perfect instrument, and it may prove that we do not have a sufficiently muscular regulator. I entirely respect the noble Lord’s track record and experience in regulation, but Ofcom has so far given us confidence that it will be a muscular regulator.
I turn now to the third part of the group. I was interested in the context in which my noble friend placed enforcement; it is really important and was supported by the noble Baroness, Lady Morgan. Interesting questions have been asked about the full extent of the Government’s ambitions in this respect: are VPNs going to be subject to these kinds of notices? I would hope so; if VPNs really are the gateway to some of the unacceptable harms that we are trying to prevent, we should know about that. We should be very cognisant of the kind of culture possibly being adopted by some of the social media and regulated services, and we should tailor our response accordingly. I will be interested to hear what the Government have to say on that.
Online Safety Bill Debate
(1 year, 6 months ago)
Lords Chamber
My Lords, I too want to support this group of amendments, particularly Amendment 234, and will make just a couple of brief points.
First, one of the important qualities of the online safety regime is transparency, and this really speaks to that point. It is beyond clear that we are going to need all hands on deck, and again, this speaks to that need. I passionately agree with the noble Baroness, Lady Fox, on this issue and ask: when does an independent researcher stop being independent? I have met quite a lot of researchers on my journey who suddenly find ways of contributing to the digital world other than their independent research. However, the route described here offers all the opportunities to put those balancing pieces in place.
Secondly, I am very much aware of the fear among academics in our universities. I know that a number of them wrote to the Secretary of State last week saying that they were concerned that they would be left behind their colleagues in Europe. We do not want to put up barriers for academics in the UK. We want the UK to be at the forefront of governance of the digital world; this amendment speaks to that, and I see no reason for the Government to reject it.
Finally, I want to emphasise the importance of research. Revealing Reality did research for 5Rights called Pathways, in which it built avatars for real children and revealed the recommendation loops in action. We could see how children were being offered self-harm, suicide, extreme diets and livestream porn within moments of them arriving online. Frances Haugen has already been mentioned. She categorically proved what we have been asserting for years, namely that Instagram impacts negatively on teenage girls. As we put this regime in place, it is not adequate to rely on organisations that are willing to work in the grey areas of legality to get their research or on whistleblowers—on individual acts of courage—to make the world aware.
One of the conversations I remember happened nearly five years ago, when the then Secretary of State asked me what the most important thing about the Bill was. I said, “To bring a radical idea of transparency to the sector”. This amendment goes some way to doing just that.
My Lords, I, too, support Amendments 233 and 234, and Amendment 233A, from the noble Lord, Lord Allan. As the noble Baroness, Lady Kidron, said, it has been made clear in the past 10 days of Committee that there is a role for every part of society to play to make sure that we see the benefits of the digital world but also mitigate the potential harms. The role that researchers and academics can play in helping us understand how the digital world operates is critical—and that is going to get ever more so as we enter a world of large language models and AI. Access to data in order to understand how digital systems and processes work will become even more important—next week, not just in 10 years’ time.
My noble friend Lord Bethell quite rightly pointed out the parallels with other regulators, such as the MHRA and the Bank of England. A number of people are now comparing the way in which the MHRA and other medical regulators regulate the development of drugs with how we ought to think about the emergence of regulation for AI. This is a very good read-across: we need to set the rules of the road for researchers and ensure, as the noble Baroness, Lady Kidron, said—nailing it, as usual—that we have the most transparent system possible, enabling people to conduct their research in the light, not in the grey zone.
It is guidance rather than direction, but it will be done openly and transparently. Users will be able to see the guidance which Ofcom has issued, to see whether companies have responded to it as they see fit and, through the rest of the framework of the Bill, be empowered to make their decisions about their experiences online. This being done openly and transparently, and informed by Ofcom’s research, will mean that everyone is better informed.
We are sympathetic to the amendment. It is complex, and this has been a useful debate—
I wonder whether the Minister has an answer to the academic community, who now see their European colleagues getting ahead through being able to access data through other legislation in other parts of the world. Also, we have a lot of faith in Ofcom, but it seems a mistake to let it be the only arbiter of what needs to be seen.
We are very aware that we are not the only jurisdiction looking at the important issues the Bill addresses. The Government and, I am sure, academic researchers will observe the implementation of the European Union’s Digital Services Act with interest, including the provisions about researchers’ access. We will carefully consider any implications of our own online safety regime. As noble Lords know, the Secretary of State will be required to undertake a review of the framework between two and five years after the Bill comes into force. We expect that to include an assessment of how the Bill’s existing transparency provisions facilitate researcher access.
My Lords, I congratulate the noble Baroness on having elucidated this arcane set of amendments. Unfortunately, though, it makes me deeply suspicious when I see what the amendments seem to do. I am not entirely clear about whether we are returning to some kind of merits-based appeal. If so, since the main litigators are going to be the social media companies, it will operate for their benefit to reopen every single thing that they possibly can on the basis of the original evidence that was taken into account by Ofcom, as opposed to doing it on a JR basis. It makes me feel quite uncomfortable if it is for their benefit, because I suspect it is not going to be for the ordinary user who has been disadvantaged by a social media company. I hope our brand spanking new independent complaints system—which the Minister will no doubt assure us is well on the way—will deal with that, but this strikes me as going a little too far.
My Lords, I enter the fray with some trepidation. In a briefing, Carnegie, which we all love and respect, and which has been fantastic in the background in Committee days, shared some concerns. As I interpret its concerns, when Ofcom was created in 2003 its decisions could be appealed on their merits, as the noble Lord has just suggested, to the Competition Appeal Tribunal, and I believe that this was seen as a balancing measure against an untested regime. What followed was that the broad basis on which appeal was allowed led to Ofcom defending 10 appeals per year, which really frustrated its ability as a regulator to take timely decisions. It turned out that the appeals against Ofcom made up more than 80% of the workload of the Competition Appeal Tribunal, whose work was supposed to cover a whole gamut of matters. When there was a consultation in the fringes of the DEA, it was decided to restrict appeal to judicial review and appeal on process. I just want to make sure that we are not opening up a huge and unnecessary delaying tactic.
I thank all those who have spoken, and I very much appreciate the spirit in which the amendments were tabled. They propose changes to the standard of appeal, the standing to appeal and the appeals process itself. The Government are concerned that enabling a review of the full merits of cases, as proposed by Amendments 243 and 245, could prove burdensome for the courts and the regulator, since a full-merits approach, as we have been hearing, has been used by regulated services in other regulatory regimes to delay intervention, undermining the effectiveness of the enforcement process. With deep-pocketed services in scope, allowing for a full-merits review could incentivise speculative appeals, both undermining the integrity of the system and slowing the regulatory process.
While the Government are fully committed to making sure that the regulator is properly held to account, we feel that there is not a compelling case for replacing the decisions of an expert and well-resourced regulator with those of a tribunal. Ofcom will be better placed to undertake the complex analysis, including technical analysis, that informs regulatory decisions.
Amendment 245 would also limit standing and leave to appeal only to providers and those entities determined eligible to make super-complaints under Clause 150. This would significantly narrow the eligibility requirements for appeals. For appeals against Ofcom notices, we assess that the broader, well-established standard in civil law of sufficient interest is more appropriate. Super-complaints fulfil a very different function from appeals. Unlike appeals, which will allow regulated services to challenge decisions of the regulator, super-complaints will allow organisations to advocate for users, including vulnerable groups and children, to ensure that systemic issues affecting UK users are brought to Ofcom’s attention. Given the entirely distinct purposes of these functions, it would be inappropriate to impose the eligibility requirements for super-complaints on the appeals system.
I am also concerned about the further proposal in Amendment 245 to allow the tribunal to replace Ofcom’s decision with its own. Currently, the Upper Tribunal is able to dismiss an appeal or quash Ofcom’s decision. Quashed decisions must be remitted to Ofcom for reconsideration, and the tribunal may give directions that it considers appropriate. Amendment 245 proposes instead allowing the Upper Tribunal to
“impose or revoke, or vary the amount of, a penalty … give such directions or take such other steps as OFCOM could itself have given or taken, or … make any other decision which OFCOM could itself have made”.
The concern is that this risks undermining Ofcom’s independence and discretion in applying its powers and issuing sanctions, and challenges the regulator’s credibility and authority. It may also further incentivise well-resourced providers to appeal opportunistically, with a view to securing a more favourable outcome at a tribunal.
On that basis, I fear that the amendments tabled by the noble Lord would compromise the fundamental features of the current appeals provisions, without any significant benefits, and risk introducing a range of inadvertent consequences. We are confident that the Upper Tribunal’s judicial review process, currently set out in the Bill, provides a proportionate, effective means of appeal that avoids unnecessary expense and delays, while ensuring that the regulator’s decisions can be thoroughly scrutinised. It is for these reasons that I hope the noble Baroness will withdraw the amendment.
The fact that I labelled it as being AI-generated helped your Lordships to understand, and the transparency eases the debate. I beg to move.
My Lords, I thank the noble Lord, Lord Knight, for laying out the amendment and recognise that there was a very thoughtful debate on the subject of machine-generated content on Amendment 125 in my name on a previous day of Committee.
I appreciate that the concept of labelling or watermarking machine-generated material is central to recent EU legislation, but I am equally aware that there is more than one school of thought on the efficacy of that approach among AI experts. On the one hand, as the noble Lord, Lord Knight, beautifully set out—with the help of his artificial friend—there are those who believe that visibly marking the division between real and altered material gives the public a cue to look more carefully at what they are seeing, and that labelling might provide an opportunity for both creators and digital companies to give greater weight to “human-created material”. For example, it could be that the new BBC Verify brand is given greater validity by the public, or that Google’s search results promote it above material labelled as machine-generated as a more authentic source. There are others who feel that the scale of machine-generated material will be so vast that labelling will be impossible, or that labelling will downgrade the value of very important machine-generated material in the public imagination, when in the very near future it is likely that most human activity will be a blend of generated material and human interaction.
I spent the first part of this week locked in a room with others at the Institute for Ethics in AI in Oxford debating some of these issues. While this is a very live discussion, one thing is clear: if we are to learn from history, we must act now before all is certain, and we should act with pragmatism and a level of humility. It may be that either or both sets of experts are correct.
Industry has clearly indicated that there is an AI arms race, and many companies are launching services that they do not understand the implications of. This is not my view but one told to me by a company leader, who said that the speed of distribution was so great that the testing was confined to whether deploying large language models crashed the platforms; there was no testing for safety.
The noble Lord, Lord Stevenson, says in his explanatory statement that this is a probing amendment. I therefore ask the Minister whether we might meet before Report and look once again at the gaps that might be covered by some combination of Amendment 125 and the amendment in front of us, to make certain that the Bill adequately addresses the concerns raised by the enforcement community and reflects the advice of those who best understand the latest iterations of the digital world.
The Communications Act 2003 made a horrible mistake in not incorporating digital within it; let us not do the same here. Adding explicit safety duties to AI and machine learning would not slow down innovation but would ensure that innovation is not short-sighted and dangerous for humanity. It is a small amendment for what may turn out to be an unimaginably important purpose.
My Lords, it is a pleasure to follow the noble Baroness, Lady Kidron. I will try to keep my remarks brief.
It is extremely helpful that we have the opportunity to talk about this labelling question. I see it more as a kind of aperitif for our later discussion of AI regulation writ large. Given that it is literally aperitif hour, I shall just offer a small snifter as to why I think there may be some challenges around labelling—again, perhaps that is not a surprise to the noble Baroness.
When we make rules, as a general matter we tend to assume that people are going to read them and respond in a rationalist, conformist way. In reality, particularly in the internet space, we often see that there is a mixed environment and there will be three groups. There are the people who will look at the rules and respond in that rational way to them; a large group of people will just ignore them—they will simply be unaware and not at all focused on the rules; and another group will look for opportunities to subvert them and use them to their own advantage. I want to comment particularly on that last group by reference to cutlery and call centres, two historic examples of where rules have been subverted.
On the cutlery example, I am a Sheffielder, and “Made in Sheffield” used to mean that you had made the entire knife in Sheffield. Then we had this long period when we went from knives being made in Sheffield to bringing them to Sheffield and silver-plating them, to eventually just sharpening them and putting them in boxes. That is relevant in the context of AI. Increasingly, if there is an advantage to be gained by appearing to be human, people will look at what kind of finishing you need, so: “The content may have been generated by AI but the button to post it was pushed by a human, therefore we do not think it is AI because we looked at it and posted it”. On the speech of the noble Lord, Lord Knight, does the fact that my noble friend intervened on him and the noble Lord had to use some of his own words now mean that his speech in Hansard would not have to be labelled “AI-generated” because we have now departed from it? Therefore, there is that question of individuals who will want something to appear human-made even if it was largely AI-generated, and whether they will find the “Made in Sheffield” way of bypassing it.
Interestingly, we may see the phenomenon flipping the other way, and this is where my call centres come in. If people go to a popular search engine and type in “SpinVox”, they will see the story of a tech company that promised to transcribe voicemails into written text. This was a wonderful use of technology, and it was valued on the basis that it had developed that fantastic technology. However, it turned out—or at least there were claims, which I can repeat here under privilege—that it was using call centres in low-cost, low-wage environments to type those messages out. Therefore, again, we may see, curiously, some people seeing an advantage to presenting content as AI-generated when it is actually made by humans. That is just to flag that up—as I say, it is a much bigger debate that we are going to have. It is really important that we are having it, and labelling has a role to play. However, as we think about it, I urge that we remember those communities of people who will look at whatever rules we come up with and say, “Aha! Where can I get advantage?”, either by claiming that something is human when it is generated by AI or claiming that it is generated by AI if it suits them when it was actually produced by humans.
My Lords, we already had a long debate on this subject earlier in Committee. In the interim, many noble Lords associated with these amendments have had conversations with the Government, which I hope will bear some fruit before Report. Today, I want to reiterate a few points that I hope are clarifying to the Committee and the department. In the interests of everyone’s evening plans, the noble Lord, Lord Bethell, and the noble Baroness, Lady Harding, wish to associate themselves with these remarks so that they represent us in our entirety.
For many years, we thought age verification was a gold standard, primarily because it involved a specific government-issued piece of information such as a passport. By the same token, we thought age estimation was a lesser beast, given that it is an estimate by its very nature and that the sector primarily relied on self-declarations with very few checks and balances. In recent years, many approaches to age checking have flourished. Some companies provide age assurance tokens based on facial recognition; others use multiple signals of behaviour, friendship group, parental controls and how you move your body in gameplay; and, only yesterday, I saw the very impressive on-device privacy-preserving age-verification system that Apple rolled out in the US two weeks ago. All of these approaches, used individually and cumulatively, have a place in the age-checking ecosystem, and all will become more seamless over time. But we must ensure that, when they are used, they are adequate for the task they are performing and are quality controlled so that they do not share information about a child, are secure and are effective.
That is why, at the heart of the package of measures put forward in my name and that of the noble Lords, Lord Stevenson and Lord Bethell, and the right reverend Prelate the Bishop of Oxford, are two concepts. First, the method of measuring age should be tech neutral so that all roads can be used. Secondly, there must be a robust mechanism for measuring effectiveness, so that only effective systems can be used in high-risk situations, particularly those involving primary priority harms such as self-harm and pornography, and such measurement will be determined by Ofcom, not industry.
From my work over the last decade and from recent discussions with industry, I am certain that any regime of age assurance must be measurable and hold to certain principles. We cannot create a situation where children’s data is loosely held and liberally shared; we cannot have a system that discriminates against, or does not have automatic appeal mechanisms for, children of colour or those who are 17 or 19, for whom the likelihood of error is greatest. Systems should aim to be interoperable and private, and should not leave traces as children go from one service to another.
Each of the principles of our age-verification package set out in the schedule is of crucial importance. I hope that the Government will see the sense in that because, without them, this age checking will not be trusted. Equally, I urge the Committee to embrace the duality of age verification and estimation that the Government have put forward because, if a child uses an older sibling’s form of verification and a company understands through the child’s behaviour that they are indeed a child, we do not want a perverse situation in which the verification is considered of a higher order and the company cannot take action based on estimation; ditto, if estimation in gameplay is more accurate than tokens that verify whether someone is over or under 18, it may well be that estimation gives greater assurance that the company will treat the child according to their age.
I hope and believe that, in his response, the Minister will confirm that definitions of age assurance and age estimation will be on the face of the Bill. I also urge him to make a generous promise to accept the full gamut of our concerns about age checking and bring forward amendments in his name on Report that reflect them in full. I beg to move.
My Lords, I associate these Benches with the introduction by the noble Baroness, Lady Kidron, support her amendments and, likewise, hope that they form part of the package that is trundling on its way towards us.
I am very grateful to the noble Baroness for her amendment, which is a useful opportunity for us to state publicly and share with the Committee the progress we have been making in our helpful discussions on these issues in relation to these amendments. I am very grateful to her and to my noble friends Lord Bethell and Lady Harding for speaking as one on this, as is well illustrated in this short debate this evening.
As the noble Baroness knows, discussions continue on the precise wording of these definitions. I share her optimism that we will be able to reach agreement on a suitable way forward, and I look forward to working with her, my noble friends and others as we do so.
The Bill already includes a definition of age assurance in Clause 207, which is
“measures designed to estimate or verify the age or age-range of users of a service”.
As we look at these issues, we want to avoid using words such as “checking”, which suggest that providers need to take a proactive approach to checking age, as that may inadvertently preclude the use of technologies which determine age through other means, such as profiling. It is also important that any definition of age assurance does not restrict the current and future use of innovative and accurate technologies. I agree that it is important that there should be robust definitions for terms which are not currently defined in the Bill, such as age verification, and I recommit to the discussions we continue to have on what terms need to be defined and the best way to define them.
This has been a very helpful short debate with which to end our deliberations in Committee. I am very grateful to noble Lords for all the points that have been raised over the past 10 days, and I am very glad to be ending in this collaborative spirit. There is much for us still to do, and even more for the Office of the Parliamentary Counsel to do, before we return on Report, and I am grateful to it and to the officials working on the Bill. I urge the noble Baroness to withdraw her amendment.
Online Safety Bill Debate
(1 year, 5 months ago)
Lords Chamber
My Lords, I will speak to Amendment 1, to which I was happy to add my name alongside that of the Minister. I too thank the noble Lord, Lord Stevenson, for tabling the original amendment, and my noble and learned friend Lord Neuberger for providing his very helpful opinion on the matter.
I am especially pleased to see that ensuring that services are safe by design and offer a higher standard of protection for children is foundational to the Bill. I want to say a little word about the specificity, as I support the noble Baroness, Lady Merron, in trying to get to the core issue here. Those of your Lordships who travel to Westminster by Tube may have seen TikTok posters saying that
“we’re committed to the safety of teens on TikTok. That’s why we provide an age appropriate experience for teens under 16. Accounts are set to private by default, and their videos don’t appear in public feeds or search results. Direct messaging is also disabled”.
It might appear to the casual reader that TikTok has suddenly decided unilaterally to be more responsible, but each of those things is a direct response to the age-appropriate design code passed in this House in 2018. So regulation does work and, on this first day on Report, I want to say that I am very grateful to the Government for the amendments that they have tabled, and “Please do continue to listen to these very detailed matters”.
With that, I welcome the amendment. Can the Minister confirm that having safety by design in this clause means that all subsequent provisions must be interpreted through that lens, and that it will inform all the decisions taken on Report, the decisions of Ofcom, and the Secretary of State’s approach to setting and enforcing standards?
My Lords, I too thank my noble friend the Minister for tabling Amendment 1, to which I add my support.
Very briefly, I want to highlight one word in it, to add to what the noble Baroness, Lady Kidron, has just said. The word is “activity”. It is extremely important that in Clause 1 we are setting out that the purpose is to
“require providers of services regulated by this Act to identify, mitigate and manage”
not just illegal or harmful content but “activity”.
I very much hope that, as we go through the few days on Report, we will come back to this and make sure that in the detailed amendments that have been tabled we genuinely live up to the objective set out in this new clause.
My Lords, I will speak briefly to Amendments 5C and 7A in this group. I welcome the Government’s moves to criminalise cyberflashing. It is something that many have campaigned for in both Houses and outside for many years. I will not repeat the issues so nobly introduced by the noble Baroness, Lady Burt, and I say yet again that I suspect that the noble Baroness, Lady Featherstone, is watching, frustrated that she is still not able to take part in these proceedings.
It is worth making the point that, if actions are deemed to be serious enough to require criminalisation and for people potentially to be prosecuted for them, I very much hope that my noble friend the Minister will be able to say in his remarks that this whole area of the law will be kept under review. There is no doubt that women and girls’ faith in the criminal justice system, both law enforcement and the Crown Prosecution Service, is already very low. If we trumpet the fact that this offence has been introduced, and then there are no prosecutions because the hurdles have not been met, that is even worse than not introducing the offence in the first place. So I hope very much that this will be kept under review, and no doubt there will be opportunities to return to it in the future.
I do not want to get into the broader debate that we have just heard, because we could be here for a very long time, but I would just say to the noble Baronesses, Lady Kennedy and Lady Fox, that we will debate this in future days on Report and there will be specific protection and mention of women and girls on the face of the Bill—assuming, of course, that Amendment 152 is approved by this House. The guidance might not use the words that have been talked about, but the point is that that is the place to have the debate—led by the regulator with appropriate public consultation—about the gendered nature of abuse that the noble Baroness, Lady Kennedy, has so eloquently set out. I hope that will also be a big step forward in these matters.
I look forward to hearing from the Minister about how this area of law will be kept under review.
My Lords, I understand that, as this is a new stage of the Bill, I have to declare my interests: I am the chair of 5Rights Foundation, a charity that works around technology and children; I am a fellow at the computer science department at Oxford University; I run the Digital Futures Commission, in conjunction with the 5Rights Foundation and the London School of Economics; I am a commissioner on the Broadband Commission; I am an adviser for the AI ethics institute; and I am involved in Born in Bradford and the Lancet commission, and I work with a broad number of civil society organisations.
My comments will be rather shorter. I want to make a detailed comment about Amendment 5B, which I strongly support and which is in the name of the noble Lord, Lord Allan. It refers to,
“a genuine medical, scientific or educational purpose, … the purposes of peer support”
I would urge him to put “genuine peer support”. That is very important because there is a lot of dog whistling that goes on in this area. So if the noble Lord—
My working assumption would be that that would be contestable. If somebody claimed the peer support defence and it was not genuine, that would lead to them becoming liable. So I entirely agree with the noble Baroness. It is a very helpful suggestion.
I also want to support the noble Baroness, Lady Kennedy. The level of abuse to women online and the gendered nature of it has been minimised; the perpetrators have clearly felt immune to the consequences of law enforcement. What worries me a little in this discussion is the idea or conflation that anything said to a woman is an act of violence. I believe that the noble Baroness was being very specific about the sorts of language that could be caught under her suggestions. I understand from what she said that she has been having conversations with the Minister. I very much hope that something is done in this area, and that it is explored more fully, as the noble Baroness, Lady Morgan, said, in the guidance. However, I just want to make the point that online abuse is also gamified: people make arrangements to abuse people in groups in particular ways that are not direct. If they threaten violence, that is quite different to a pile-in saying that you are a marvellous human being.
My Lords, I too must declare my interests on the register—I think that is the quickest way of doing it to save time. We still have time, and I very much hope that the Minister will listen to this debate and consider it. Although we are considering clauses that, by and large, come at the end of the Bill, there is still time procedurally—if the Minister so decides—to come forward with an amendment later on Report or at Third Reading.
We have heard some very convincing arguments today. My noble friend explained that the Minister did not like the DPP solution. I have looked back again at the Law Commission report, and I cannot for the life of me see the distinction between what was proposed for the offence in its report and what is proposed by the Government. There is barely a cigarette paper, if we are still allowed to use that analogy, between them, but the DPP is recommended—perhaps not on a personal basis, although I do not know quite what distinction the Law Commission makes there—and certainly the Minister did not like that. My noble friend has come back with some specifics, and I very much hope that the Minister will put on the record that, in those circumstances, there would not be a prosecution. As we heard in Committee, 130 different organisations had strong concerns, and I hope that the Minister will respond to those concerns.
As regards my other noble friend’s amendment, again creatively she has come back with a proposal for including reckless behaviour. The big problem here is that many people believe that, unless you include “reckless” or “consent”, the “for a laugh” defence operates. As the Minister knows, quite expert advice has been had on this subject. I hope the Minister continues his discussions. I very much support my noble friend in this respect. I hope he will respond to her in respect of timing and monitoring—the noble Baroness, Lady Morgan, mentioned the need for the issue to be kept under review—even if at the end of the day he does not respond positively with an amendment.
Everybody believes that we need a change of culture—even the noble Baroness, Lady Fox, clearly recognises that—but the big difference is whether or not we believe that these particular amendments should be made. We very much welcome what the Law Commission proposed and what the Government have put into effect, but the question at the end of the day is whether we truly are making illegal online what is illegal offline. That has always been the Government’s test. We must be mindful of that in trying to equate online behaviour with offline behaviour. I do not believe that we are there yet, however much moral leadership we are exhorted to display. I very much take the point of the noble Baroness, Lady Morgan, about the violence against women and girls amendment that the Government are coming forward with. I hope that will have a cultural change impact as well.
As regards the amendments of the noble Baroness, Lady Kennedy, I very much take the point she made, both at Committee and on Report. She was very specific, as the noble Baroness, Lady Kidron, said, and was very clear about the impact, which as men we severely underestimate if we do not listen to what she said. I was slightly surprised that the noble Baroness, Lady Fox, really underestimates the impact of that kind of abuse—particularly that kind of indirect abuse.
I was interested in what the Minister had to say in Committee:
“In relation to the noble Baroness’s Amendment 268, the intentional encouragement or assistance of a criminal offence is already captured under Sections 44 to 46 of the Serious Crime Act 2007”.—[Official Report, 22/6/23; col. 424.]
Is that still the Government's position? Has that been explained to the noble Baroness, Lady Kennedy, who I would have thought was pretty expert in the 2007 Act? If she does not agree with the Minister, that is a matter of some concern.
Finally, I agree that we need to consider the points raised at the outset by the noble and learned Lord, Lord Garnier, and I very much hope that the Government will keep that under review.
I also welcome these amendments and want to pay tribute to Maria Miller in the other place for her work on this issue. It has been extraordinary. I too was going to raise the issue of the definition of “photograph”, so perhaps the Minister could confirm that it does extend to those other contexts or, even better, put that in the Bill.
My main point is about children. We do not want to criminalise children, but this is pervasive among under-18s. I do want to make the distinction between those under-18s who intentionally harm another under-18 and have to be responsible for what they have done within the meaning of the law as the Minister set it out, and those who are under the incredible pressure—I do not mean coercion, because that is another out-clause—of the oversharing that is inherent in the design of many of these services. That is an issue I am sure we are going to come back to later today. I would love to hear the Minister say something from the Dispatch Box about the Government’s intention: that it is preventive first, and that there is a balance between education and punishment for under-18s who find themselves unavoidably in this situation.
Very briefly, before I speak to these amendments, I want to welcome them. Having spoken to and introduced some of the provisions on threats to share intimate images under the Domestic Abuse Act 2021, I think it is really welcome that everything has been brought together in one place. Again, I pay tribute to the work of Dame Maria Miller and many others outside who have raised these issues. I also want to pay tribute to the Ministry of Justice Minister Edward Argar, who has worked with my noble friend the Minister on this.
I have one specific question. The Minister did mention this in his remarks, but could he be absolutely clear that these amendments do not specifically mention the lifetime anonymity of complainants or the special measures in relation to giving evidence that apply to witnesses? That came up in the last group of amendments as well. Because they are not actually in this drafting, it would be helpful if he could put on record the relationship with the provisions in the Sexual Offences Act 2003. I know that would be appreciated by campaigners.
I believe I misspoke when I asked my question. I referred to under-18s. Of course, if they are under 18 then it is child sexual abuse. I meant someone under the age of 18 with an adult image. I put that there for the record.
If the noble Baroness misspoke, I understood what she intended. I knew what she was getting at.
With that, I hope noble Lords will be content not to press their amendments and that they will support the government amendments.
Online Safety Bill Debate
(1 year, 5 months ago)
Lords Chamber
My Lords, I will speak to my Amendments 281 to 281B. I thank the noble Baronesses, Lady Harding and Lady Kidron, and the noble Lord, Lord Knight, for adding their names to them. I will deal first with Amendments 281 and 281B, then move to 281A.
On Amendments 281 and 281B, the Minister will recall that in Committee we had a discussion around how functionality is defined in the Bill and that a great deal of the child risk assessments and safety duties must have regard to functionality, as defined in Clause 208. However, as it is currently written, this clause appears to separate out functionalities of user-to-user services and search services. These two amendments are designed to adjust that slightly, to future-proof the Bill.
Why is this necessary? First, it reflects that it is likely that in the future, many of the functionalities that we currently see on user-to-user services will become present on search services and possibly vice versa. Therefore, we need to try to take account of how the world is likely to move. Secondly, this is already happening, and it poses a risk to children. Some research done by the 5Rights Foundation has found that “predictive search”, counted in the Bill as a search service functionality, is present on social media websites, leading one child user using a search bar to be presented in nanoseconds with prompts associated with eating disorders. In Committee, the Minister noted that the functionalities listed in this clause are non-exhaustive. At the very least, it would be helpful to clarify this in the Bill language.
Amendment 281A would add specific functionalities which we know are addictive or harmful to children and put them in the Bill. We have a great deal of research and evidence which demonstrates how persuasive certain design strategies are with children. These are features which are solely designed to keep users on the platform, at any cost, as much as possible and for as long as possible. The more that children are on the platform, the more harm they are likely to suffer. Given that the purpose of this Bill is for services to be safe by design, as set out usefully in Amendment 1, please can we make sure that where we know—and we do know—that risk exists, we are doing our utmost to tackle it?
The features that are listed in this amendment are known as “dark patterns”—and they are known as “dark patterns” for a very good reason. These persuasive and pervasive design features are deliberately baked into the design of digital services and products to capture and hold, in this case, children’s attention, and to create habitual, even compulsive behaviours. The damage this does to children is proven and palpable. For example, one of the features mentioned is infinite scroll, which is now ubiquitous on most major social media platforms. The inventor of infinite scroll, a certain Aza Raskin, who probably thought it was a brilliant idea at the time, has said publicly that he now deeply regrets ever introducing it, because of the effect it is having on children.
One of the young people who spoke to the researchers at 5Rights said of the struggle they have daily with the infinite scroll feature:
“Scrolling forever gives me a sick feeling in my stomach. I’m so aware of how little control I have and the feeling of needing to be online is overwhelming and consuming”.
Features designed to keep users online at any cost—which may be fine for adults but is not fine for children—are taking a real toll. Managing the public and frequent interactions online that these features encourage creates the most enormous pressures for young people, and with that come anxiety, low self-esteem and mental health challenges. This is only increasing and, unless we are very specific about these features, it is going to continue.
We have the evidence. We know what poses harm and risk to children. Please can we make sure that this is reflected accurately in the Bill?
My Lords, I rise briefly to support many of the amendments in this group. I will start with Amendments 281, 281A and 281B in the name of my noble friend Lord Russell, to which I have added my name. The noble Lord set out the case very well. I will not reiterate what he said, but it is simply the case that the features and functionalities of regulated companies should not be separated by search and user-to-user but should apply across any regulated company that has that feature. There is no need to worry about a company that does not have one of the features on the list, but it is much more dangerous for a feature to be absent from the list than it is to have a single list and hold companies responsible for the features they actually have.
Only this morning, Meta released Threads as its challenger to Twitter. In the last month, Snapchat added generative AI to its offering. Instagram now does video, and TikTok does shopping. All these companies are moving into a place where they would like to be the one that does everything. That is their commercial endgame, and that is where the Bill should set its sights.
Separating out functionality and, as the noble Lord, Lord Russell, said, failing to add what we already know, puts the Bill in danger of looking very old before the ink is dry. I believe it unnecessarily curtails Ofcom in being able to approach the companies for what they are doing, rather than for what the Bill thought they might be doing at this point. So, if the Minister is not in a position to agree to the amendment, I urge him at least to take it away and have a look at it, because it is a technical rather than an ideological matter. It would be wonderful to fix it.
To follow on from that, we are talking about the obligation to bring exemptions to Parliament. Well, we are in Parliament and we are bringing exemptions. The noble Lord is recommending that we bring very specific exemptions while those that the noble Lord, Lord Moylan, and I have been recommending may be rather broad—but I thought we were bringing exemptions to Parliament. I am not being facetious. The point I am making is, “Why can’t we do it now?” We are here now, doing this. We are saying, as Parliament, “Look at these exemptions”. Can the Minister not look at them now instead of saying that we will look at them some other time?
I may as well intervene now as well, so that the Minister can get a good run at this. I too am concerned at the answer that has been given. I can see the headline now, “Online Safety Bill Age-Gates Wikipedia”. I cannot see how it does not, by virtue of some of the material that can be found on Wikipedia. We are trying to say that there are some services that are inherently in a child’s best interests—or that are in their best interests according to their evolving capacity, if we had been allowed to put children’s rights into the Bill. I am concerned that that is the outcome of the answer to the noble Lord, Lord Allan.
I thank the Minister for that. In conclusion, I hope he will reflect on those issues and come back, maybe at the end of the next group. I remind the House that in February the APPG on Commercial Sexual Exploitation, in its inquiry on pornography, recommended that the regulation of pornography should be consistent across all online platforms and between the online and offline spheres. I hope we can incorporate the voices I have already mentioned in the NGO sphere in order to assist the Government and both Houses in ensuring that we regulate the online platforms and that children are protected from any harms that may arise.
My Lords, I shall speak briefly to Amendment 174 in my name and then more broadly to this group—I note that the Minister got his defence in early.
On the question of misinformation and disinformation, I recognise what he said and I suppose that, in my delight at hearing the words “misinformation and disinformation”, I misunderstood to some degree what he was offering at the Dispatch Box, but I make the point that this poses an enormous risk to children. As an example, children are the fastest-growing group of far-right believers/activists online, and there are many areas in which we are going to see an exponential growth in misinformation and disinformation as large language models become the norm. So I ask him, in a tentative manner, to look at that.
On the other issue, I have to push back at the Minister’s explanation. Content classification around sexual content is a well-established norm. The BBFC does it and has done it for a very long time. There is an absolute understanding that what is suitable for a U, a PG, a 12 or a 12A are different things, and that as children’s capacities evolve, as they get older, there are things that are more suitable for older children, including, indeed, stronger portrayals of sexual behaviour as the age category rises. So I cannot accept that this opens a new can of worms: this is something that we have been doing for many, many years.
I think it is a bit wrongheaded to imagine that if we “solve” the porn problem, we have solved the problem—because there is still sexualisation and the commercialisation of sex. Now, if you say something about feet to a child, they start to giggle uproariously because, in internet language, you get paid for taking pictures of feet and giving them to strange people. There are such detailed and different areas that companies should be looking at. This amendment in my name and the names of the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford, should be taken very seriously. It is not new ground, so I would ask the Minister to reconsider it.
More broadly, the Minister will have noticed that I liberally added my name to the amendments he has brought forward to meet some of the issues we raised in Committee, but I have not added my name to the schedule of harms. I want to be nuanced about this and say that I am grateful to the Government for putting the harms in the Bill, I am grateful that the content harms have been discussed in this Chamber and not left for secondary legislation, and I am grateful for all the conversations around this. However, harm cannot be defined only as content, and the last grouping got to the core of the issue in the House. Even when the Minister was setting out this amendment, he acknowledged that the increase in harm to users may be systemic and by design. In his explanation, he used the word “harm”; in the Bill, it always manifests as “harmful content”.
While the systemic risk of increasing the presence of harmful content is consistently within the Bill, which is excellent, the concept that the design of service may in and of itself be harmful is absent. In failing to do that, the Government, and therefore the Bill, have missed the bull’s-eye. The bull’s-eye is what is particular about this method of communication that creates harm—and what is particular are the features, functionalities and design. I draw noble Lords back to the debate about Wikipedia. It is not that we all love Wikipedia adoringly; it is that it does not pursue a system of design for commercial purposes that entraps people within its grasp. Those are the harms we are trying to get at. I am grateful for the conversations I have had, and I look forward to some more. I have laid down some other amendments for Monday and beyond that would, I hope, deal with this—but until that time, I am afraid this is an incomplete picture.
My Lords, I have a comment about Amendment 174 in the name of the noble Baroness, Lady Kidron. I have no objection to the insertion of subsection (9B), but I am concerned about (9A), which deals with misinformation and disinformation. It is far too broad and political, and if we start at this late stage to try to run off into these essentially political categories, we are going to capsize the Bill altogether. So I took some heart from the fact that my noble friend on the Front Bench appeared disinclined to accept at least that limb of the amendment.
I did want to ask briefly some more detailed questions about Amendment 172, and new subsection (2) in particular. This arises from the danger of having clauses added at late stages of the Bill that have not had the benefit of proper discussion and scrutiny in Committee. I think we will all recognise that the characteristics listed in new subsection (2) map on to the Equality Act, which appears to be their source. I note in passing that it refers in that regard to gender reassignment. I would also note that most of the platforms, in their terms and conditions, refer not to gender reassignment but to various other things, such as gender identity, which are really very different, or at least different in detail. I would be interested to ask my noble friend how effectively he expects the words used in English statute to be applied by what are, essentially, foreign platforms operating for an audience in the United Kingdom—I will come back to this in a further amendment later.
My Lords, this has been a useful debate. As the noble Baroness, Lady Kidron, says, because I spoke first to move the government amendments, in effect I got my response in first to her Amendment 174, the only non-government amendment in the group. That is useful because it allows us to have a deeper debate on it.
The noble Baroness asked about the way that organisations such as the British Board of Film Classification already make assessments of sexualised content. However, the Bill’s requirement on service providers and the process that the BBFC takes to classify content are not really comparable. Services will have far less time and much more content to consider than the BBFC does, so they will not be able to take the same approach. The BBFC is able to take an extended time to consider maybe just one scene, one image or one conversation, and therefore can apply nuance to its assessments. That is not possible at the scale at which services will have to apply the child safety duties in the Bill. We therefore think there is a real risk that they would apply those duties excessively and adversely affect children’s rights online.
I know the noble Baroness and other noble Lords are rightly concerned with protecting rights to free expression and access to information online for children and for adults. It is important that we strike the right balance, which is what we have tried to do with the government amendments in this group.
To be clear, the point that I made about the BBFC was not to suggest a similar arrangement but to challenge the idea that we cannot categorise material of a sexualised nature. Building on the point made by the noble Lord, Lord Allan, perhaps we could think about it in terms of the amber light rather than the red light—in other words, something to think about.
I certainly will think about it, but the difficulty is the scale of the material and the speed with which we want these assessments to be made and that light to be lit, in order to make sure that people are properly protected.
My noble friend Lord Moylan asked about differing international terminology. In order for companies to operate in the United Kingdom, they must have an understanding of the United Kingdom, including the English-language terms used in our legislation. He made a point about the Equality Act 2010. While the Bill uses the same language, it does not extend the Equality Act to this part of the Bill. In particular, it does not create a new offence.
The noble Baroness, Lady Fox, also mentioned the Equality Act when she asked about the phraseology relating to gender reassignment. We included this wording to ensure that the language used in the Bill matches Section 7(1) of the Equality Act 2010 and that gender reassignment has the same meaning in the Bill as it does in that legislation. As has been said by other noble Lords—
With respect, it does not say “directed at children”. Of course, I am safe in expressing that abuse in this forum, but if I were to do it, it came to the attention of children and it were abusive—because I do wish to be abusive about that practice—would I have created “priority harmful content”, about which action would have to be taken?
I will leap to the Minister’s defence on this occasion. I remind noble colleagues that this is not about individual pieces of content; there would have to be a consistent flow of such information being proffered to children before Ofcom would ask for a change.
My Lords, these words have obviously appeared in the Bill in one of those unverified sections; I have clicked the wrong button, so I cannot see them. Where does it say in Amendment 172 that it has to be a consistent flow?
Online Safety Bill Debate
(1 year, 5 months ago)
Lords Chamber
My Lords, I reiterate what the noble Lord, Lord Bethell, has said and thank him for our various discussions between Committee and Report, particularly on this set of amendments to do with age verification. I also welcome the Government’s responsiveness to the concerns raised in Committee. I welcome these amendments, which are a step forward.
In Committee, I was arguing that there should be a level playing field for regulating any online platform with pornographic content, whether it falls under Part 3 or Part 5 of the Bill. I welcome the Government’s significant changes to Clauses 11 and 72 to ensure that robust age verification or estimation must be used and that standards are consistent across the Bill.
I have a few minor concerns that I wish to highlight. I am thoughtful about whether enough is required of search services in preventing young people from accessing pornography in Clause 25. I recognise that the Government believe they have satisfied that need. I fear they may have done enough only in the short term, and there is a real concern that this clause is not sufficiently future-proofed. Of course, only time will tell. Maybe the Minister could advise us further in that particular regard.
In Committee, I also argued that the duties in respect of pornography in Parts 3 and 5 must come into effect at the same time. I welcome the government commitment to placing a timeframe for the codes of practice and guidance on the face of the Bill through amendments including Amendment 230. I hope that the Minister will reassure us today that it is the Government’s intention that the duties in Clauses 11 and 72 will come into effect at the same time. Subsection (3) of the new clause proposed in Amendment 271 specifically states that the duties could come into effect at different times, which leaves a loophole for pornography to be regulated differently, even if only for a short time, between Part 3 and Part 5 services. This would be extremely regrettable.
I would also like to reiterate what I said last Thursday, in case the Minister missed my contribution when he intervened on me. I say once again that I commend the Minister for the announcement of the review of the regulation, legislation and enforcement of pornography offences, which I think was this time last week. I once again ask the Minister: will he set out a timetable for publishing the terms of reference and details of how this review will take place? If he cannot set out that timetable today, will he write to your Lordships setting out the timetable before the Recess, and ensure a copy is placed in the Library?
Finally, all of us across the House have benefited from the expertise of expert organisations as we have considered this Bill. I repeat my request to the Minister that he consider establishing an external reference group to support the review, consisting of those NGOs with particular and dedicated expertise. Such groups would have much to add to the process—they have much learning and advice, and there is much assistance there to the Government in that regard.
Once again, I thank the Minister for listening and responding. I look forward to seeing the protections for children set out in these amendments implemented. I shall watch implementation very closely, and I trust and hope that the regulator will take robust action once the codes of practice and guidance are published. Children above all will benefit from a safer internet.
My Lords, I welcome the government amendments in this group, which set out the important role that age assurance will play in the online safety regime. I particularly welcome Amendment 210, which states that companies must employ systems that are “highly effective” at correctly determining whether a particular user is a child, in order to prevent access to pornography. I also welcome Amendment 124, which sets out in a code of practice the principles that must be followed when implementing age assurance. Those principles ensure that standards and protections align with the ICO’s age appropriate design code and include, among other things, that age assurance systems should be easy to use, proportionate to the risk and easy to understand, including to those with protected characteristics, as well as aiming to be interoperable. The code is a first step from current practice—in which age verification is opaque, used to further profile children and related adults, and highly ineffective—to a world in which children are offered age-appropriate services by design and default.
I pay tribute again to the noble Lord, Lord Bethell, and the noble Baroness, Lady Benjamin, and I associate myself with the broad set of thanks that the noble Lord, Lord Bethell, gave in his opening speech. I also thank colleagues across your Lordships’ House and the other place for supporting this cause with such clarity of purpose. On this matter, I believe that the UK is world-beating, and it will be a testament to all those involved to see the UK’s age verification and estimation laws built on a foundation of transparency and trust so that those impacted feel confident in using them—and we ensure their role in delivering the online world that children and young people deserve.
I have a number of specific questions about government Amendment 38 and Amendment 39. I would be grateful if the Minister were able to answer them from the Dispatch Box and in doing so give a clear sign of the Government’s intent. I will also speak briefly to Amendments 125 and 217 in my name and those of the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford, as well as Amendment 184 in the names of the noble Baroness, Lady Fox, and the noble Lord, Lord Moylan. All three amendments address privacy.
Government Amendment 38, to which I have added my name, offers exemptions in new subsections (3A) and (3B) that mean that a regulated company need not use age verification or estimation to prevent access to primary priority content if it already prevents such access by means of its terms of service. First, I ask the Minister to confirm that these exemptions apply only if a service effectively upholds its terms of service on a routine basis, and that failure to do so would trigger enforcement action and/or an instruction from Ofcom to apply age assurance.
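To make the structure of that exemption concrete, its conditional logic as summarised above might be sketched as follows. This is purely an illustrative reading, not the statutory drafting; the function and parameter names are invented for the example.

```python
# Schematic reading of the exemption in new subsections (3A) and (3B):
# a service escapes the age-assurance duty for primary priority content
# (PPC) only while its terms of service both prohibit PPC and are
# routinely enforced. Illustrative pseudologic, not statutory text.

def must_use_age_assurance(prohibits_ppc_in_terms: bool,
                           enforces_terms_routinely: bool) -> bool:
    """Return True if the service must deploy age verification or
    estimation to prevent children accessing PPC."""
    exempt = prohibits_ppc_in_terms and enforces_terms_routinely
    return not exempt

# A service that bans PPC on paper but does not enforce the ban
# loses the exemption and falls back under the duty:
assert must_use_age_assurance(True, False) is True
assert must_use_age_assurance(True, True) is False
```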
My Lords, I shall follow on directly from some of the comments of the noble Baroness, Lady Kidron, around privacy. I shall not repeat the arguments around children and pornography but touch on something else, which is the impact of these amendments on the vast majority of internet users, the 85%-plus who are 18 or older. Of course, when we introduce age assurance measures, they will affect everyone: we should not kid ourselves that it is only about children, because everyone will have to pass through these gateways.
I shall speak particularly to Amendments 184 and 217 on privacy. I am sure that most adults will support extra safety measures for children, but they also want to be able to access a wide range of online services with the least possible friction and the lowest risk to their own personal data. We can explore how this might work in practice by looking at something that we might all do in this Chamber. Looking round, I believe that we are all at least 18 years old, and we might find ourselves idly passing the time creating an account on a new user-to-user or search service that has been recommended. We should consider this group of amendments by how that might play out. In future, the services will have to check that we are in the United Kingdom—there is a range of ways in which they can do that. Having confirmed that, they will need to understand whether we are 18-plus or a child user so that they can tailor their service appropriately.
I hope we all agree that the services should not be asking us for passports or driving licences, for example, as that would be entirely contrary to the thrust of privacy regulations and would be a huge gateway to fraud and other problems. The most efficient way would be for them to ask us for some sort of digital certificate—a certificate held on our devices showing that we have proven to a trusted third party that we are 18-plus. The certificate does not need to contain any personal data; it simply confirms that we are of age. That is very similar to the way in which secure websites work: they send a digital certificate to your browser, your browser verifies that certificate with a trusted third party—a certificate authority—and then you can open an encrypted connection. Here we are reversing the flow: the service will ask the user for a certificate and then verify it before granting access. A user may in future have a setting on their device where they confirm whether they are happy for their 18-plus certificate to be given to anybody or whether they would like to be asked every time—there will be a whole new set of privacy controls.
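For illustration, a minimal sketch of the exchange being described might look like this. It is only a sketch: a symmetric HMAC stands in for the asymmetric signatures a real certificate authority would use, and every name in it is invented for the example rather than drawn from any actual age-assurance scheme.

```python
import hashlib
import hmac
import json

# Illustrative only: a real scheme would use asymmetric signatures from
# a certificate authority, not a secret shared by issuer and service.
ISSUER_KEY = b"demo-issuer-secret"

def issue_age_certificate(issuer: str) -> dict:
    """The trusted third party issues a certificate asserting 18-plus.
    It carries no personal data: only the claim and a signature."""
    claim = {"issuer": issuer, "over_18": True}
    payload = json.dumps(claim, sort_keys=True).encode()
    signature = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "signature": signature}

def service_verifies(certificate: dict) -> bool:
    """The service checks the certificate before granting adult access,
    mirroring the reversed TLS-style flow described above."""
    payload = json.dumps(certificate["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, certificate["signature"])

# The device stores the certificate once and presents it to each new
# service that asks; no passport or driving licence changes hands.
cert = issue_age_certificate("example-age-authority")
assert service_verifies(cert)
```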
Building the infrastructure for this is non-trivial. Many things could go wrong, but at least the kind of model I am describing has some hope of achieving widespread adoption. It is very good for adult users, as they can continue to have a frictionless experience so long as they are happy for their device to send a certificate to new services. It is good for the market in internet services if new services can bring users on easily. It is good for privacy, by avoiding lots of services each collecting personal data, since most people access a multiplicity of services. Perhaps most importantly for the Bill’s objectives, it is good for children if services can separate out the vast majority of their users who are 18-plus and then focus their real efforts on tailoring the services for the minority of users who are children, for whom the Bill introduces a whole set of new obligations.
We should not underestimate the scale of the challenge in practice; it will work only if major internet companies are willing to play the game and get into the market of offering 18-plus certificates. Companies such as Google, Meta, Amazon, Apple and Microsoft—the ones we normally love to hate—will potentially provide the solution, as well as not-for-profits. There will be foundations for those who object to the big internet companies, but it is those big internet companies which will have the reach; they each have millions of users in the United Kingdom. This is not to fly the flag for those companies; it is simply a question of efficiency. I suspect that everyone in the Chamber uses a combination of services from those big providers. We already share with them the personal data necessary for age assurance, and there would be no additional sharing of data. If they were willing to provide a certificate, they could do so at the kind of scale necessary for the 50 million or so adult internet users in the United Kingdom to be able to get one easily and then pass it to services when they choose to access them.
There may be some discomfort with big tech playing this role, but I cannot see the kind of aggressive targets that we are setting in the amendments working unless we take advantage of those existing platforms and use them to make this work. Amendment 230 tells us that we have about 18 months, which is very soon in terms of trying to build something. We should be clear that if we are to deliver this package it will depend on persuading some of those big names in tech to create age certification schemes for UK users.
For this to have widespread adoption and a competitive market, we need it to be free of direct financial costs to individual users and to the services choosing to age-verify, as we have asked them to do. We need to think very carefully about that, as it raises a whole series of competition questions that I am sure Ofcom and the Competition and Markets Authority will have to address, not least because we will be asking companies to provide, free of charge, age certification that will be used by their existing and future competitors to meet their compliance requirements.
There may be some listening who think that we can rely on small age-assurance start-ups. Some of them have a really important role to play and we should be proud of our homegrown industry, but we should be realistic that they will reach scale only if they work with and through the large service providers. Many of them are already seeking those kinds of relationship.
As a test case, we might think of an application such as Signal, a messaging app that prides itself on being privacy-first. It does not want to collect any additional information from its users, which is perfectly reasonable, given where it is coming from. It will be really interesting to see how comfortable such a service will be with working with certification schemes, under which it can prove that users are over 18 by taking advantage of the data held by other services which collect significant amounts of data and have a very good idea of how old we are.
I have not focused on under-18s but, once this system is in place, application providers will be thinking very carefully about the pros and cons of allowing under-18s on at all. I know that the noble Baroness, Lady Kidron, is also concerned about this. There will be services that will think very carefully, if they find that the vast majority of their users are 18-plus, about the extent to which they want to put time and effort into tailoring them for users under 18. We do not intend that outcome from the Bill, but we need realistically to consider it.
Just to be clear, I say that the purpose of my question to the Minister was to get at the fact that, for low-risk situations, there can be age assurance that is a lot less effective or intrusive, for that very reason.
I agree; that is very helpful. I think Amendments 74, 93 and 99 also talk about the exclusion, as the noble Baroness raised, of services from the child safety duties if they can show that they are only 18-plus. It will be quite material and critical to know at what level they can demonstrate that.
I have avoided talking about pornography services directly, but there are interesting questions around what will happen if this model develops, as it likely will. If big tech is now starting to provide age certification for the kinds of mainstream services we may all want to access, they may be much less comfortable providing that same certification to pornography providers, for reputational reasons. A mainstream provider would not want to enter that market. Ofcom will need to take a view on this. We have talked about interoperability in the framework we have created, but it is a big question for Ofcom whether it wants to steer all age certification providers also to provide 18-plus certification for pornography providers or, effectively, to allow two markets to develop—one for mainstream certification and one for certification for pornography.
I have taken a few minutes because this is a very high-risk area for the Bill. There are material risks in willing into existence a model that depends on technical infrastructure that has not yet been built. The noble Lord, Lord Bethell, referred to prior experience; one of the reasons why we have not delivered age assurance before is that the infrastructure was not there. We now want it built, so must recognise that it is quite a high-risk endeavour. That does not mean it is not worth attempting, but we must recognise the risks and work on them.
If the implementation is poor, it will frustrate adult users, which may bring the Bill into disrepute. We need to recognise that as a genuine risk. There are people out there already saying that the Bill means that every internet service in the world will ask you for your passport. If that is not the case, we need to stress that we do not expect that to happen. There are also potentially significant impacts on the market for online services available to both adults and children in the UK, depending on the design of this system.
The purpose of thinking about some of these risks today is not to create a doom-laden scenario and say that it will not work. It is entirely the opposite—to say that, if we are to move ahead into a world in which children are protected from harmful content, for which very good reasons have been articulated and a huge amount of work has gone ahead, and in which services can tailor and gear access to the age of the child, we have to be able to take the 18-plus out of that, put it into a separate box and do so in a really easy, straightforward manner. If not, the 18-plus will end up dragging down what we want to do for the underage.
I hope that explanation helps in the context of these amendments. We will need to test them against it as implementation happens over the next few months.
My Lords, I rise to speak to all the amendments in this group. It is a cause of great regret that, despite many private meetings with officials, government lawyers and Ministers, we have not yet come to an agreement that would explicitly include in the Bill harm that does not derive from content. I will be listening very carefully to the Minister, if he should change his mind during the debate.
The amendments in this group fall into three categories. First, there is a series of amendments in my name and those of the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford: Amendments 35, 36, 37A and 85. I had hoped the Government would accept them as consequential but, in meetings last week, they would not accept that harm to children can arise from the functionality and design of services and not just from the content. Each of these amendments simply makes it clear that harm can arise in the absence of content: nothing more, nothing less. If the Minister agrees that harm may derive from the design of products and services, can he please explain, when he responds, why these amendments are not acceptable? Simply put, it is imperative that the features, functionalities or behaviours that are harmful to children, including those enabled or created by the design or operation of the service, are in scope of the Bill. This would make it utterly clear that a regulated company has a duty to design its service in a manner that does not harm children.
The Government have primary priority harmful content, priority content and non-designated harmful content—the latter a category yet to be defined—but not the harm that emerges from how the regulated company designs its service. For example, there are the many hundreds of small reward loops that make up a doomscroll or make a game addictive; commercial decisions, such as the one Pokémon famously made for a time, of ending every game in a McDonald’s car park; or, more sinister still, the content-neutral friend recommendations that introduce a child to other children like them while pushing children into siloed groups. For example, they deliberately push 13 year-old boys towards Andrew Tate—not for any content reason, but simply on the basis that 13 year-old boys are like each other and one of them has already been on that site.
The impact of a content-neutral friend recommendation has rocked our schools as female teachers and girls struggle with the attitudes and actions of young boys, and has torn through families, who no longer recognise their sons and brothers. To push hundreds of thousands of children towards Andrew Tate for no reason other than to benefit commercially from the network effect is a travesty for children and it undermines parents.
The focus on content is old-fashioned and looks backwards. The Bill is drafted as if it has particular situations and companies in mind but does not think about how fast the business moves. When we started the Bill, none of us thought about the impact of TikTok; last week, we saw a new service, Threads, go from zero to 70 million users in a single day. It is an act of stunning hubris to be so certain of the form of harm. To be unprepared to admit that some harm is simply design means that, despite repeated denials, this is just a content Bill. The promise of systems and processes being at the heart of the Bill has been broken.
The second set of amendments in this group are in the name of my noble friend Lord Russell. Amendments 46 and 90 further reveal the attitude of the Government, in that they are protecting the companies rather than putting them four-square in the middle of their regime. The Government specifically exempt the manner of dissemination from the safety duties. My noble friend Lord Russell’s amendment would leave that out and ensure that the manner of dissemination, which is fundamental to the harm that children experience, is included. Similarly, Amendment 240 would take out “presented by content” so that harm that is the result of the design decisions is included in the Bill.
The third set are government Amendments 281C and 281D, and Amendment 281F, in my name. For the avoidance of doubt, I am totally supportive of government Amendments 281C to 281E, which acknowledge cumulative harms; for example, those that Molly Russell experienced as she was sent more and more undermining and harmful content. In so far as they are a response to my entreaties, and those of other noble Lords, that we ensure that cumulative harmful content is the focus of our concerns, I am grateful to the Government for tabling them. However, I note that the Government have conceded the role of cumulative harm only for content. Amendments 281D and 281E once again treat content as the only harm to children.
The noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford have added their names to Amendment 281F, and I believe I am right in saying that if there were not a limit to four names, there were a great many Peers who would have added their names also. For the benefit of the House, I will quote directly from the amendment:
“When in relation to children, references to harm include the potential impact of the design and operation of a regulated service separately and additionally from harms arising from content, including the following considerations … the potential cumulative impact of exposure to harm or a combination of harms … the potential for harm to result from features, functionalities or behaviours enabled or created by the design and operation of services … the potential for some features and functionalities within a service to be higher risk than other aspects of the service … that a service may, when used in conjunction with other services, facilitate harm to a child on a different service … the potential for design strategies that exploit a child’s developmental vulnerabilities to create harm, including validation metrics and compulsive reward loops … the potential for real time services, features and functionalities such as geolocation, livestream broadcasts or events, augmented and virtual environments to put children at immediate risk … the potential for content neutral systems that curate or generate environments, content feeds or contacts to create harm to children … that new and emerging harms may arise from artificial intelligence, machine generated and immersive environments”.
Before I continue, I ask noble Lords to consider which of those things they would not like for their children, grandchildren or, indeed, other people’s children. I have accepted that the Government will not add the schedule of harms as I first laid it: the four Cs of content, conduct, contact and commercial harms. I have also accepted that the same schedule, written in the less comfortable language of primary priority, priority and non-designated harms, has also been rejected. However, the list that I just set out, and the amendment to the duties that reflect those risks, would finally put the design of the system at the heart of the Bill. I am afraid that, in spite of all our conversations, I cannot accept the Government’s argument that all harm comes from content.
Even if we are wrong today—which we are most definitely not—in a world of AI, immersive tech and augmented reality, is it not dangerous and, indeed, foolish, to exclude harm that might come from a source other than content? I imagine that the Minister will make the argument that the features are covered in the risk assessment duties and that, unlike content, features may be good or bad so they cannot be characterised as harmful. To that I say: if the risk assessment is the only process that matters, why do the Government feel it necessary to define the child safety duties and the interpretation of harm? The truth is, they have meaning. In setting out the duty of a company to a child, why would the Government not put the company’s design decisions right at the centre of that duty?
As for the second part of the argument, a geolocation feature may of course be great for a map service but less great if it shows the real-time location of a child to a predator, and livestreaming from a school concert is very different from livestreaming from your bedroom. Just as the noble Lord, Lord Allan, explained on the first day on Report, there are things that are red lines and things that are amber; in other words, they have to be age-appropriate. This amendment does not seek—nor would it mean—that individual features or functionalities would be prevented, banned or stopped. It would mean that a company had a duty to make sure that their features and functionalities were age-appropriate and did not harm children—full stop. There would be no reducing this to content.
Finally, I want to repeat what I have said before. Sitting in the court at Molly Russell’s inquest, I watched the Meta representative contest content that included blood cascading down the legs of a young woman, messages that said, “You are worthless”, and snippets of film of people jumping off buildings. She said that none of those things met the bar of harmful content according to Meta’s terms and conditions.
Like others, I believe that the Online Safety Bill could usher in a new duty of care towards children, but it is a category error not to see harm in the round. Views on content can always differ but the outcome on a child is definitive. It is harm, not harmful content, that the Bill should measure. If the Minister does not have the power to accede, I will, with great regret, be testing the opinion of the House. I beg to move.
My Lords, as so often in the course of the Bill, I associate myself wholeheartedly with the comments that the noble Baroness, Lady Kidron, just made. I, too, thank my noble friend the Minister and the Secretary of State for listening to our debates in Committee on the need to be explicit about the impact of cumulative harmful content. So I support Amendments 281C, 281D and 281E, and I thank them for tabling them.
My Lords, I thank everybody who has spoken for these amendments. I also thank the Minister for our many discussions and apologise to the House for the amount of texts that I sent while we were trying to get stand-alone harms into the Bill—unfortunately, we could not; we were told that it was a red line.
It is with some regret that I ask the House to walk through the Lobbies. Before I do so, I acknowledge that the Government have met me on very many issues, for which I am deeply grateful. There are no concessions on this Bill, only making it better. From my perspective, there is no desire to push anybody anywhere, only to protect children and give citizens the correct relationship with the digital world.
I ask those who were not here when I said this before: please think about your children and grandchildren and other people’s children and grandchildren before you vote against these amendments. They are not only heartfelt, as the Minister said, but have been drafted with reference to many experts and people in the business, who, in their best practice, meet some of these things already. We do not want the Bill, by concentrating on content, to be a drag on what we are pushing forward. We want it to be aspirational and to push the industry into another culture and another place. At a personal level, I am very sorry to the Minister, for whom I have a great deal of respect, but I would like to test the opinion of the House.
My Lords, as somebody who is only five feet and two inches, I have felt that size does not matter for pretty much all my life and have long wanted to say that in a speech. This group of amendments is really about how size does not matter; risk does. I will briefly build on the speech just given by the noble Lord, Lord Allan, very eloquently as usual, to describe why risk matters more than size.
First, there are laws for which size does matter—small companies do not need to comply with certain systems and processes—but not those concerned with safety. I have in my mind’s eye the small village fête, where we expect a risk assessment if we are to let children ride on rides. That was not the case 100 years ago, but is today because we recognise those dangers. One of the reasons why we stumbled into thinking that size should matter in this Bill is that we are not being honest about the scale of the risk for our children. If the risk is large enough, we should not be worrying about size; we should be worrying about that risk. That is the first reason why we have to focus on risk and not size.
The second reason follows from what I have just said: the principles of the physical world should apply to the online world. That is one of the core tenets of this Bill. It means that, if you recognise the real and present risks of the digital world, you have to say that it does not matter whether only a small number of people are affected. If it is a small business, it still has an obligation not to put people in harm’s way.
Thirdly, small becomes big very quickly—unfortunately, that has not been true for me, but it is true in the digital world as Threads has just shown us. Fourthly, we also know that in the digital world re-engineering something once it has got very big is really difficult. There is also a practical reason why you want engineers to think about the risks before they launch services rather than after the event.
We keep being told, rightly, that this is a Bill about systems and processes. It is a Bill where we want not just the outcomes that the noble Lord, Lord Allan, has referred to in terms of services in the UK genuinely being safer; we are trying to effect a culture change. I would argue one of the most important culture changes is that any bright, young tech entrepreneur has to start by thinking about the risks and therefore the safety procedures they need to put in place as they build their tech business from the ground up and not once they have reached some artificial size threshold.
My Lords, I have to admit that it was incompetence rather than lack of will that meant I did not add my name to Amendment 39 in the name of the noble Lord, Lord Bethell, and I would very much like the Government to accept his argument.
In the meantime, I wonder whether the Minister would be prepared to make it utterly clear that proportionality does not mean a little bit of porn to a large group of children or a lot of porn to a small group of children; rather, it means that high-risk situations require effective measures and low-risk situations should be proportionate to that. On that theme, I say to the noble Lord, Lord Allan, whose points I broadly agree with, that while we would all wish to see companies brought into the fold rather than being out of the fold, it rather depends on their risk.
This brings me neatly to Amendments 43 and 87 from the noble Lord, Lord Russell, to which I managed to add my name. They make a very similar point to Amendment 39 but across safety duties. Amendment 242 in my name, to which the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford have added their names, makes the same point—yet again—in relation to Ofcom’s powers.
All these things are pointing in the same direction as Amendment 245 in the name of the noble Baroness, Lady Morgan, which I keep on trumpeting from these Benches and which offers an elegant solution. I urge the Minister to consider Amendment 245 before day four of Report because, if the Government were to accept it, it would focus company resources, focus Ofcom resources and, as we discussed on the first day of Report, allow companies which do not fit the risk profile of the regime—companies unable to comply with duties that do not fit their model, yet left vulnerable to enforcement—to be treated in an appropriate way.
Collectively, the ambition is to make sure that we are treating things in proportion to the risk and that proportionate does not start meaning something else.
My Lords, I agree with the noble Baroness, Lady Kidron, that all these amendments are very much heading in the same direction, and from these Benches I am extremely sympathetic to all of them. It may well be that this is very strongly linked to the categorisation debate, as the noble Baroness, Lady Kidron, said.
The amendment from the noble Lord, Lord Bethell, matters even more when we are talking about pornography in the sense that child safety duties are based on risks. I cannot for the life of me see why we should try to contradict that by adding in capacity and size and so on.
My noble friend made a characteristically thoughtful speech about the need for Ofcom to regulate in the right way and make decisions about risk and the capacity challenges of new entrants and so on. I was very taken by what the noble Baroness, Lady Harding, had to say. This is akin to health and safety and, quite frankly, it is a cultural issue for developers. What after all is safety by design if it is not advance risk assessment of the kinds of algorithm that you are developing for your platform? It is a really important factor.
My Lords, I rise briefly to note that, in the exchange between the noble Lords, Lord Allan and Lord Moylan, there was this idea about where you can complain. The independent complaints mechanism would be as advantageous to people who are concerned about freedom of speech as it would be for any other reason. I join and add my voice to other noble Lords who expressed their support for the noble Baroness, Lady Fox, on Amendment 162 about the Public Order Act.
My Lords, we are dangerously on the same page this evening. I absolutely agree with the noble Baroness, Lady Kidron, about demonstrating the need for an independent complaints mechanism. The noble Baroness, Lady Stowell, captured quite a lot of the need to keep the freedom of expression aspect under close review, as we go through the Bill. The noble Baroness, Lady Fox, and the noble Lord, Lord Moylan, have raised an important and useful debate, and there are some crucial issues here. My noble friend captured it when he talked about the justifiable limitations and the context in which limitations are made. Some of the points made about the Public Order Act offences are extremely valuable.
I turn to one thing that surprised me. It was interesting that the noble Lord, Lord Moylan, quoted the Equality and Human Rights Commission, which said it had reservations about the protection of freedom of expression in the Bill. As we go through the Bill, it is easy to keep our eyes on the ground and not to look too closely at the overall impact. In its briefing, which is pretty comprehensive, paragraph 2.14 says:
“In a few cases, it may be clear that the content breaches the law. However, in most cases decisions about illegality will be complex and far from clear. Guidance from Ofcom could never sufficiently capture the full range or complexity of these offences to support service providers comprehensively in such judgements, which are quasi-judicial”.
I am rather more optimistic than that, but we need further assurance on how that will operate. Ofcom’s life would probably be easier if we did not have the Public Order Act offences in Schedule 7.
I am interested to hear what the Minister says. I am sure that there are pressures on him, from his own Benches, to look again at these issues to see whether more can be done. The EHRC says:
“Our recommendation is to create a duty to protect freedom of expression to provide an effective counterbalance to the duties”.
The noble Lord, Lord Moylan, cited this. There are many references to freedom of expression in the Bill, but not in the Ofcom duties. So this could be a late contender to settle the horses, so to speak.
This is a difficult Bill; we all know that so much nuance is involved. We really hope that there is not too much difficulty in interpretation when it is put into practice through the codes. That kind of clarity is what we are trying to achieve, and, if the Minister can help to deliver that, he will deserve a monument.
Online Safety Bill Debate
(1 year, 5 months ago)
Lords Chamber
My Lords, as I set out in Committee, the Government are bringing forward a package of amendments to address the challenges that bereaved parents and coroners have faced when seeking to access data after the death of a child.
These amendments have been developed after consultation with those who, so sadly, have first-hand experience of these challenges. I thank in particular the families of Breck Bednar, Sophie Parkinson, Molly Russell, Olly Stephens and Frankie Thomas for raising awareness of the challenges they have faced when seeking access to information following the heartbreaking cases involving their children. I am also grateful to the noble Baroness, Lady Kidron, for championing this issue in Parliament and more widely. I am very happy to say that she is supporting the government amendments in this group.
The loss of any life is heartbreaking, but especially so when it involves a child. These amendments will create a more straightforward and humane process for accessing data and will help to ensure that parents and coroners receive the answers they need in cases where a child’s death may be related to online harms. We know that coroners have faced challenges in accessing relevant data from online service providers, including information about a specific child’s online activity, where that might be relevant to an investigation or inquest. It is important that coroners can access such information.
As such, I turn first to Amendments 246, 247, 249, 250, 282, 283 and 287, which give Ofcom an express power to require information from regulated services about a deceased child’s online activity following a request from a coroner. This includes the content the child had viewed or with which he or she had engaged, how the content came to be encountered by the child, the role that algorithms and other functionalities played, and the method of interaction. It also covers any content that the child generated, uploaded or shared on the service.
Crucially, this power is backed up by Ofcom’s existing enforcement powers so that, where a company refuses to provide information requested by Ofcom, it may be subject to enforcement action, including senior management liability. To ensure that there are no barriers to Ofcom sharing information with coroners, first, Amendment 254 enables Ofcom to share information with a coroner without the prior consent of a business to disclose such information. This will ensure that Ofcom is free to provide information it collects under its existing online safety functions to coroners, as well as information requested specifically on behalf of a coroner, where that might be useful in determining whether social media played a part in a child’s death.
Secondly, coroners must have access to online safety expertise, given the technical and fast-moving nature of the industry. As such, Amendment 273 gives Ofcom a power to produce a report dealing with matters relevant to an investigation or inquest, following a request from a coroner. This may include, for example, information about a company’s systems and processes, including how algorithms have promoted specific content to a child. To this end, the Chief Coroner’s office will consider issuing non-statutory guidance and training for coroners about social media as appropriate, subject to the prioritisation of resources. We are confident that this well-established framework provides an effective means to provide coroners with training on online safety issues.
It is also important that we address the lack of transparency from large social media services about their approach to data disclosure. Currently, there is no common approach to this issue, with some services offering memorialisation or contact-nomination processes, while others seemingly lack any formal policy. To tackle this, a number of amendments in this group will require the largest services—category 1, 2A and 2B services—to set out policies relating to the disclosure of data regarding the online activities of a deceased child in a clear, accessible and sufficiently detailed format in their terms of service. These companies will also be required to provide a written response to data requests in a timely manner and must provide a dedicated helpline, or similar means, for parents to communicate with the company, in order to streamline the process. This will address the painful radio silence experienced by many bereaved parents. The companies must also offer options so that parents can complain when they consider that a platform is not meeting its obligations. These must be easy to access, easy to use and transparent.
The package of amendments will apply not only to coroners in England and Wales but to coroners in Northern Ireland and to equivalent investigations in Scotland, where similar sad events have occurred.
The Government will also address other barriers which are beyond the scope of this Bill. For example, we will explore measures to introduce data rights for bereaved parents who wish to request information about their deceased children through the Data Protection and Digital Information Bill. We are also working, as I said in Committee, with our American counterparts to clarify and, where necessary, address unintended barriers to information sharing created by the United States Stored Communications Act. I beg to move.
My Lords, I thank the Minister and indeed the Secretary of State for bringing forward these amendments in the fulsome manner that they have. I appreciate it, and I know that Bereaved Families for Online Safety appreciates it too. The Government committed to bringing forward these amendments on the last day in Committee, so they have been pre-emptively welcomed and discussed at some length. One need only read through Hansard of 22 June to understand the strength of feeling about the pain that has been caused to families and the urgent need to prevent others experiencing the horror faced by families already dealing with the loss of their child.
I will speak briefly on three matters only. First, I must once again thank bereaved families and colleagues in this House and in the other place for their tireless work in pressing this issue. This is one of those issues that does not allow for celebration. As I walked from the Chamber on 22 June, I asked one of the parents how they felt. They said: “It is too late for me”. It was not said in bitterness but in acknowledgement of their profound hurt and the failure of companies voluntarily to do what is obvious, moral and humane. I ask the Government to see the sense in the other amendments that noble Lords brought forward on Report to make children safer, and make the same, pragmatic, thoughtful solution to those as they have done on this group of amendments. It makes a huge difference.
Secondly, I need to highlight just one gap; I have written to the Secretary of State and the Minister on this. I find it disappointing that the Government did not find a way to require senior management to attend an inquest to give evidence. Given that the Government have agreed that senior managers should be subject to criminal liability under some circumstances, I do not understand their objections to summoning them to co-operate with legal proceedings. If a company submits information in response to Ofcom and at the coroner’s request the company’s senior management is invited to attend the inquest, it makes sense that someone should be required to appear to answer and follow up those questions. Again, on behalf of the bereaved families and specifically their legal representatives, who are very clear on the importance of this part of the regime, I ask the Government to reconsider this point and ask the Minister to undertake to speak to the department and the MoJ, if necessary, to make sure that, if senior managers are asked to attend court, they are mandated to do so.
Thirdly, I will touch on the additional commitments the Minister made beyond the Bill, the first of which is the upcoming Data Protection and Digital Information Bill. I am glad to report that some of the officials working on the Bill have already reached out, so I am grateful to the Minister that this is in train, but I expect it to include guidance for companies that will, at a minimum, cover data preservation orders and guidance about the privacy of other users in cases where a child has died. I think that privacy for other users is central to this being a good outcome for everybody, and I hope we are able to include that.
I am pleased to hear about the undertaking with the US regarding potential barriers, and I believe—and I would love to hear from the Minister—that the objective is to make a bilateral agreement that would allow data to be shared between the two countries in the case of a child’s death. It is a very specific requirement, not a wide-ranging one. I believe that, if we can do it on a bilateral basis, it will be easier than a broad attempt to change the US Stored Communications Act.
I turn finally to training for coroners. I was delighted that the Chief Coroner made a commitment to consider issuing non-legislative guidance and training on social media for coroners and the offer of consultation with experts, including Ofcom, the ICO and bereaved families and their representatives, but this commitment was made subject to funding. I ask the Minister to agree to discuss routes to funding from the levy via Ofcom’s digital literacy duty. I have proposed an amendment to the government amendment that would make that happen, but I would welcome the opportunity to discuss it with the Minister. Coroners must feel confident in their understanding of the digital world, and I am concerned that giving this new route to regulated companies via Ofcom without giving them training on how to use it may create a spectre of failure or further frustration and distress for bereaved families. I know there is not a person in the House who would want that to be the outcome of these welcome government amendments.
My Lords, I also welcome this group of amendments. I remember a debate led by the noble Baroness, Lady Kidron, some time ago in the Moses Room, where we discussed this, and I said at the time I thought it would get fixed in the Online Safety Bill. I said that in a spirit of hope, not knowing any of the detail, and it is really satisfying to see the detail here today. As she said, it is testimony to the families, many of whom got in touch with me at that time, who have persisted in working to find a solution for other families—as the noble Baroness said, it is too late for them, but it will make a real difference to other families—and it is so impressive that, at a time of extreme grief and justifiable anger, people have been able to channel that into seeking these improvements.
The key in the amendments, which will make that difference, is that there will be a legal order to which the platforms know they have to respond. The mechanism that has been selected—the information notice—is excellent because it will become well known to every one of the 25,000 or so platforms that operate in the United Kingdom. When they get an information notice from Ofcom, that is not something that they will have discretion over; they will need to comply with it. That will make a huge difference.
My Lords, I apologise for speaking once more today. I shall introduce Amendments 100 and 101 on the child user condition. They are very technical in nature and simply align the definition of “significant” in the Bill with the ICO’s age-appropriate design code to ensure regulatory alignment and to ensure the protection of the greatest number of children.
The Minister has stated on the record that the child user condition is the same as the test in the age-appropriate design code; however, under Clause 30(3) of the Bill, a service is “likely to be accessed” by children if
“(a) there is a significant number of children who are users of the service or of that part of it, or (b) the service, or that part of it, is of a kind likely to attract a significant number of users who are children”.
At Clause 30(4),
“the reference to a ‘significant’ number includes a reference to a number which is significant in proportion to the total number of United Kingdom users of a service or … part of a service”.
That is a key issue: “in proportion”. Because, by contrast, the ICO’s age-appropriate design code states that a service is “likely to be accessed” if
“children form a substantive and identifiable user group”.
That is quite a different threshold.
In addition, the ICO’s draft guidance on “likely to be accessed” sets out a list of factors that should be taken into consideration when making this assessment. These factors are far more extensive than Clause 30(4) and specifically state:
“‘Significant’ in this context does not mean that a large number of children must be using the service or that children form a substantial proportion of your users. It means that there are more than a de minimis or insignificant number of children using the service”.
In other words, it is possibly quite a small group, or a stand-alone group, that is not in proportion to the users. I will stop here to make the point that sometimes users are in their millions or tens of millions, so a small proportion could be many hundreds of thousands of children—just to be really clear that this matters and I am not quite dancing on the head of a pin here.
Amendment 101 mirrors the ICO’s draft guidance on age assurance on this point. If the intention of the Government is that these two things align, I really struggle to see why this would not be just a technical amendment to which they can simply say yes, and we can move on.
I finish by reminding the House that the legal opinion of my noble and learned friend Lord Neuberger, the former President of the Supreme Court, which I shared with the Government, highlights the importance of regulatory alignment, clarity and consistency, particularly in new areas of law where a concept such as “likely to be accessed” appears in more than one Act.
My noble and learned friend states:
“As the Minister rightly says, simplicity and clarity are desirable in a statute, and it serves both simplicity and clarity if the same expression is used in the two statutes, and it is made clear that the same meaning is intended … The currently drafted reference in the Bill to ‘a significant number of children’ appears to me to be something of a recipe for uncertainty, especially when compared with the drafting of section 123 of the DPA”.
With that, I beg to move.
My Lords, Amendments 100 and 101 seek further to define the meaning of “significant” in the children’s access assessment, with the intention of aligning this with the meaning of “significant” in the Information Commissioner’s draft guidance on the age-appropriate design code.
I am grateful to the noble Baroness, Lady Kidron, for the way in which she has set out the amendments and the swiftness with which we have considered them. The test in the access assessment in the Bill is already aligned with the test in the code, which determines whether a service is likely to be accessed by children, in order to ensure consistency for all providers. The Information Commissioner’s Office has liaised with Ofcom on its new guidance on the “likely to be accessed” test for the code, with the intention of aligning the two regulatory regimes while reflecting that they seek to do different things. In turn, the Bill will require Ofcom to consult the ICO on its guidance to providers, which will further support alignment between the tests. So, while we agree about the importance of alignment, we think that it is already catered for.
With regard to Amendment 100, Clause 30(4)(a) already states that
“the reference to a ‘significant’ number includes a reference to a number which is significant in proportion to the total number of United Kingdom users of a service”.
There is, therefore, already provision in the Bill for this being a significant number in and of itself.
On Amendment 101, the meaning of “significant” must already be more than insignificant by its very definition. The amendment also seeks to define “significant” with reference to the number of children using a service rather than seeking to define what is a significant number.
I hope that that provides some reassurance to the noble Baroness, Lady Kidron, and that she will be content to withdraw the amendment.
I am not sure that, at this late hour, I completely understood what the Minister said. On the basis that we are seeking to align, I will withdraw my amendment, but can we check that we are indeed aligned, as my speech came directly from a note from officials that showed a difference? On that basis, I am happy to withdraw.
Online Safety Bill Debate
(1 year, 5 months ago)
Lords Chamber
My Lords, a lot of positive and interesting things have been said that I am sympathetic to, but this group of amendments raises concerns about a democratic deficit: if too much of the Bill is either delegated to the Secretary of State or open to interference by the Secretary of State in Ofcom’s work, who decides what those priorities are? I will ask for a couple of points of clarification.
I am glad to see that the term “public policy” has been replaced, because what did that mean? Everything. But I am not convinced that saying that the Secretary of State can decide not just on national security but on public safety and public health is reassuring in the present circumstances. The noble Lord, Lord Allan, has just pointed out what it feels like to be leaned on. We had a very recent international example of Governments leaning on big tech companies in relation to Covid policies, lockdowns and so on to remove material that was seen to contradict official public health advice—advice that often turned out not to be accurate at all. There should at least have been a lot more debate about what were political responses to a terrible virus. Noble Lords will know that censorship became a matter of course during that time, and Governments interfering in or leaning on big tech directly was problematic. I am not reassured that the Government hold to themselves the ability to lean on Ofcom around those issues.
It is also worth remembering that the Secretary of State already has a huge amount of power to designate, as we have discussed previously. They can designate what constitute priority illegal offences and priority content harmful to children, and that can all change beyond what we have discussed here. We have already seen a constant expansion of what those harms can be, and having those decisions made through secondary legislation alone, without accountability to Parliament or to public scrutiny, really worries me. It is likely to give a green light to every identity group and special-interest NGO to demand additions to the list of priority harms and so on. That is likely to make the Secretary of State’s job of responding to “something must be done” moral panics all the more difficult. If that is going to happen, we should have parliamentary scrutiny of it; it cannot just be allowed to happen elsewhere.
It is ironic that the Secretary of State, being elected, is more democratic than an unelected regulator. I just feel that there is a danger in so much smoke and mirrors. When the Minister very kindly agreed to see the noble Lord, Lord Moylan, and me, I asked in a rather exasperated way why Ofcom could not make freedom of expression a priority, with codes of practice so that it would have to check on freedom of speech. The Minister said, “It’s not up to me to tell Ofcom what to do”, and I thought, “The whole Bill is telling Ofcom what to do”. That did not seem to make any sense.
I had another exchange with the present Secretary of State—again, noble Lords will not be surprised to hear that it was not a sophisticated intervention on my part—in which I said, “Why can’t the Government force the big tech companies to put freedom of expression in their terms and conditions or terms of service?” The Minister said, “They are private companies; we’re not interfering in what they do”. So you just end up thinking, “The whole Bill is telling companies that they’re going to be compelled to act in relation to harm and safety, but not on freedom of expression”. What that means is that you feel all the time as though the Government are saying that they are outsourcing this to third parties, which means that you cannot hold anyone to account.
The civil liberties campaigner Guy Herbert compared this to what is happening with the banks at the moment: they are being blamed by the Government and held to account for their treatment of politically exposed persons and for terms and conditions that overconcentrate on values such as EDI and ESG, which may be leading to citizens of this country having their bank accounts closed. The Government say that they will tell the regulator that it has to act, and that the banks cannot behave in this way, but this all came from legislation—it is not as though the regulator was doing it off its own bat. Maybe it overinterpreted the legislation, and the banks then overinterpreted the regulator and overremoved.
The obvious analogy for me is that there is a danger here that we will not be able to hold anyone to account for overremoval of legitimate democratic discussion from the online world, because everyone is pointing the finger at everyone else. At the very least, the amendments are trying to say that any changes beyond what we have discussed so far on this Bill must come before Parliament. That is very important for any kind of democratic credibility to be attached to this legislation.
My Lords, I too express my admiration to the noble Baroness, Lady Stowell, for her work on this group with the Minister and support the amendments in her name. To pick up on what the noble Baroness, Lady Harding, said about infinite ping-pong, it can be used not only to avoid making a decision but as a form of power and of default decision-making—if you cannot get the information back, you are where you are. That is a particularly important point and I add my voice to those who have supported it.
I have a slight concern that I want to raise in public, so that I have said it once, and get some reassurance from the Minister. New subsection (B1)(d) in Amendment 134 concerns the Secretary of State directing Ofcom to change codes that may affect
“relations with the government of a country outside the United Kingdom”.
Many of the companies that will be regulated sit in America, which has been very forceful about protecting its sector. Without expanding on this too much, when it was suggested that senior managers would face some sort of liability in international fora, various parts of the American Government and state apparatus certainly made their feelings clearly known.
I am sure that the channels between our Government and the US are much more straightforward than any that I have witnessed, but it is absolutely definite that more than one Member of your Lordships’ House was approached about the senior management provisions and told, “This is a worry to us”. I believe that where we have landed is very good, but I would like the Minister to say what the limits of that power are and to acknowledge that it could get into a bit of a muddle with government relations and the economic outcomes that we were talking about, whose removal from the list we were celebrating. That was the thing that slightly worried me in the government amendments, which, in all other ways, I welcome.
My Lords, this has been a consistent theme ever since the Joint Committee’s report. It was reported on by the Delegated Powers and Regulatory Reform Committee, and the Digital and Communications Committee, chaired by the noble Baroness, Lady Stowell, has rightly taken up the issue. Seeing some movement from the Minister, particularly on Clause 29 and specifically in terms of Amendments 134 to 137, is very welcome and consistent with some of the concerns that have been raised by noble Lords.
There are still questions to answer about Amendment 138, which my noble friend has raised. I have also signed the amendments to Clause 38 because I think the timetabling is extremely welcome. However, like other noble Lords, I believe we need to have Amendments 139, 140, 144 and 145 in place, as proposed by the noble Baroness, Lady Stowell of Beeston. The phrase “infinite ping-pong” makes us all sink in gloom in current circumstances—it is a very powerful phrase. I think the Minister really does have to come back with something better; I hope he will give us that assurance, and that his discussions with the noble Baroness, Lady Stowell, will bear further fruit.
I may not agree with the noble Lord, Lord Moylan, about the Clause 39 issues, but I am glad he raised issues relating to Clause 159. It is notable that, of all the recommendations by the Delegated Powers and Regulatory Reform Committee, the Government accepted four out of five but did not accept the one related to what is now Clause 159. I have deliberately de-grouped the questions of whether Clauses 158 and 159 should stand part of the Bill, so I am going to pose a few questions now, and I hope that, when we get to the second group, which contains my clause stand part proposition, the Minister will be able to tell me effortlessly what he is going to do. This will prevent me from putting down further amendments on those clauses, because it seems to me that the Government are being extraordinarily inconsistent in how they are dealing with Clauses 158 and 159 compared with how they have amended Clause 39.
For instance, Clause 158 allows the Secretary of State to issue a direction to Ofcom where the Secretary of State has reasonable grounds for believing that there is a threat to public health and safety or to national security; they can direct Ofcom to set objectives in how it uses its media-literacy powers under Section 11 of the Communications Act for a specific period to address the threat, and make Ofcom issue a public-statement notice. That is rather extraordinary. I will not go into great detail at this stage, and I hope the Minister can avoid me having to make a long speech further down the track, but the Government should not be in a position to direct a media regulator on a matter of content. For instance, the Secretary of State has no powers over Ofcom on the content of broadcast regulation—indeed, they have only limited powers of direction over radio spectrum and wires—and there is no provision for parliamentary involvement, although I accept that the Secretary of State must publish reasons for the direction. There is also the general question of whether the threshold is high enough to justify this kind of interference. So Clause 158 is not good news at all. It raises a number of questions which I hope the Minister will start to answer today, and maybe we can avoid a great debate further down the track.
My Lords, I rise briefly to support the noble Baroness, Lady Morgan, to welcome the government amendment and to say that this is a moment of delight for many girls—of all varieties. I echo the noble Baroness, Lady Fox, on the issue of having a broad consultation, which is a good idea. While our focus during the passage of this Bill was necessarily on preventing harm, I hope this guidance will be part of the rather more aspirational and exciting part of the digital world that allows young people to participate in social and civic life in ways that do not tolerate abuse and harm on the basis of their gender. In Committee, I said that we have a duty not to allow digital tech to be regressive for girls. I hope that this is a first step.
My Lords, on behalf of my party, all the groups mentioned by the noble Baroness, Lady Morgan, and potentially millions of women and girls in this country, I briefly express my appreciation for this government amendment. In Committee, many of us argued that a gender-neutral Bill would not achieve strong enough protection for women and girls as it would fail to recognise the gendered nature of online abuse. The Minister listened, as he has on many occasions during the passage of the Bill. We still have differences on some issues—cyberflashing, for instance—but in this instance I am delighted that he is amending the Bill, and I welcome it.
Why will Ofcom be required to produce guidance and not a code, as in the amendment originally tabled by the noble Baroness, Lady Morgan? Is there a difference, or is it a case of a rose by any other name? Is there a timescale by which Ofcom should produce this guidance? Are there any plans to review Ofcom’s guidance once produced, just to see how well it is working?
We all want the same thing: for women and girls to be free to express themselves online and not to be harassed, abused and threatened as they are today.
My Lords, I am most grateful to the noble Lord, Lord Clement-Jones, for tabling the amendment. If I had been quicker, I would have added my name to it, because he may—I use the word “may” advisedly, because I am not sure—have identified quite a serious gap in terms of future-proofing. As far as I understand it, in a somewhat naive way, the amendment probes whether there is a gap between provider-generated content and user-generated content and whether provider-generated content could lead to a whole lot of ghastly stuff on the metaverse without any way of tackling it because it is deemed to have fallen outside the scope of the Bill.
I am grateful to Carnegie UK for having tried to talk me through this—it is pretty complicated. As a specific example, I understand that a “Decentraland” avatar pops up on gaming sites, and it is useful because it warns you about the dangers of gambling and what it can lead to. But then there is the problem about the backdrop to this avatar: at the moment, it seems to be against gambling, but you can see how those who have an interest in gambling would be quite happy to have the avatar look pretty hideous but have a backdrop of a really enticing casino with lots of lights and people streaming in, or whatever. I am not sure where that would fit, because it seems that this type of content would be provider-generated. When it comes to the metaverse and these new ways of interacting with 3D immersion, I am not clear that we have adequately caught within the Bill some of these potentially dangerous applications. So I hope that the Minister will be able to clarify it for us today and, if not, possibly to write between now and the next time that we debate this, because I have an amendment on future-proofing, but it is in a subsequent group.
My Lords, I am interested to hear what the Minister says, but could he also explain to the House the difference in status of this sort of material in Part 5 versus Part 3? I believe that the Government brought in a lot of amendments that sorted it out and that many of us hoped were for the entire Bill, although we discovered, somewhat to our surprise, that they were only in Part 5. I would be interested if the Minister could expand on that.
My Lords, I am grateful to the noble Lord, Lord Clement-Jones, for raising this; it is important. Clause 49(3)(a)(i) mentions content
“generated directly on the service by a user”,
which, to me, implies that it would include the actions of another user in the metaverse. Sub-paragraph (ii) mentions content
“uploaded to or shared on the service by a user”,
which covers bots or other quasi-autonomous virtual characters in the metaverse. As we heard, a question remains about whether any characters or objects provided by the service itself are covered.
A scenario—in my imagination anyway—would be walking into an empty virtual bar at the start of a metaverse service. This would be unlikely to be engaging: the attractions of indulging in a lonely, morose drink at that virtual bar are limited. The provider may therefore reasonably configure the algorithm to generate characters and objects that are engaging until enough users then populate the service to make it interesting.
Of course, there is the much more straightforward question of gaming platforms. On Monday, I mentioned “Grand Theft Auto”, a game with an advisory age of 17—players of that age are still children—which is routinely accessed by younger children. Shockingly, an article that I read claimed that it can evolve into a pornographic experience, where the player becomes the character from a first-person angle and receives services from virtual sex workers as part of the game design. So my question to the Minister is: does the Bill protect the user from these virtual characters interacting with users in virtual worlds?
We have talked before about bots controlled by service providers, and the noble Lord, Lord Knight, has asked questions on this. The Bill is designed to make online service providers responsible for the safety of their users in light of the harmful activities that their platforms might facilitate. Providers of a user-to-user service will need to adhere to their duties of care, which apply to all user-generated content present on their service. The Bill does not, however, regulate content published by user-to-user providers themselves. That is because providers are themselves liable for the content they publish on the service. The one exception to this—as the noble Baroness, Lady Kidron, alluded to in her contribution—is pornography, which poses a particular risk to children and is regulated by Part 5 of the Bill.
I am pleased to reassure the noble Lord, Lord Clement-Jones, that the Bill—
I thank the noble Lord for giving way. The Minister just said that providers will be responsible for the content they publish themselves. I would love to understand what mechanism makes a provider responsible for its own content.
I will write to noble Lords with further information and will make sure that I have picked up correctly the questions that they have asked.
On Amendment 152A, which the noble Lord, Lord Clement-Jones, has tabled, I am pleased to assure him that the Bill already achieves the intention of the amendment, which seeks to add characters and objects that might interact with users in the virtual world to the Bill’s definition of user-generated content. Let me be clear again: the Bill already captures any service that facilitates online user-to-user interaction, including in the metaverse or other augmented reality or immersive online worlds.
The Bill broadly defines “content” as
“anything communicated by means of an internet service”,
so it already captures the various ways in which users may encounter content. Clause 211 makes clear that “encounter” in relation to content for the purposes of the Bill means to,
“read, view, hear or otherwise experience”
content. That definition extends to the virtual worlds which noble Lords have envisaged in their contributions. It is broad enough to encompass any way of encountering content, whether that be audio-visually or through online avatars or objects.
In addition, under the Bill’s definition of “functionality”,
“any feature that enables interactions of any description between users of the service”
will be captured. That could include interaction between avatars or interaction by means of an object in a virtual world. All in-scope services must therefore consider a range of functionalities as part of their risk assessment and must put in place any necessary measures to mitigate and manage any risks that they identify.
I hope that that provides some assurance to the noble Lord that the concerns that he has raised are covered, but I shall happily write on his further questions before we reach the amendment that the noble Baroness, Lady Finlay, rightly flagged in her contribution.
My Lords, I strongly support Amendment 180, tabled by the noble Baroness, Lady Merron. I will also explain why I put forward Amendment 180A. I pay tribute to the noble Baroness, Lady Hayman, who pursued this issue with considerable force through her Question in the House.
There is clearly an omission in the Bill. One of its primary aims is to protect children from harmful online content, and animal cruelty content causes harm to the animals involved and, critically, to the people who view it, especially children. In Committee, in the Question and today, we have referred to the polling commissioned by the RSPCA, which found that 23% of 10 to 18 year-olds had seen animal cruelty on social media sites. I am sure that the numbers have increased since that survey in 2018. A study published in 2017 found—if evidence were needed—that:
“There is emerging evidence that childhood exposure to maltreatment of companion animals is associated with psychopathology in childhood and adulthood.”
The noble Baroness made an extremely good case, and I do not think that I need to add to it. When the Bill went through the Commons, assurances were given by the former Minister Damian Collins, who acknowledged that the inclusion of animal cruelty content in the Bill deserved further consideration as the Bill progressed through its parliamentary stages. We need to keep up that pressure, and we will very much support the noble Baroness if she asks for the opinion of the House.
Turning to my Amendment 180A, like the noble Baroness, I pay tribute to the Social Media Animal Cruelty Coalition, which is a very large coalition of organisations. We face a global extinction crisis which the UK Government themselves have pledged to reverse. Algorithmic amplification tools and social media recommendation engines have driven an explosive growth in online wildlife trafficking. A National Geographic article from 2020 quoted US wildlife officials describing the dizzying scale of the wildlife trade on social media. The UK’s national wildlife crime units say that cyber-enabled wildlife crime has become their priority focus, since virtually all wildlife cases they now investigate have a cyber component to them, usually involving social media or e-commerce platforms. In a few clicks it is easy to find pages, groups and postings selling wildlife products made from endangered species, such as elephant ivory, rhino horn, pangolin scales and marine turtle shells, as well as big cats, reptiles, birds, primates and insects for the exotic pet trade. This vast, unregulated trade in live animals and their parts is not only illegal but exacerbates the risk of another animal/human spillover event such as the ones that caused Ebola, HIV and the Covid-19 pandemic.
In addition to accepting the animal welfare amendment tabled by the noble Baroness, which I hope they do, the Government should also add offences under the Control of Trade in Endangered Species Regulations 2018 to Schedule 7 to the Bill. This would definitely help limit the role of social media platforms in enabling wildlife trafficking, helping to uphold the UK’s commitments to tackling global wildlife crime.
My Lords, I rise very briefly to support the noble Baroness, Lady Merron, and to make only one point. As someone who has the misfortune of seeing a great deal of upsetting material of all kinds, I have to admit that it sears an image on your mind. I have had the misfortune to see the interaction of animal and human cruelty in the same sequences, again and again. In making the point that there is a harm to humans in witnessing and normalising this kind of material, I offer my support to the noble Baroness.
My Lords, Amendments 180 and 180A seek to require the Secretary of State to conduct a review of existing legislation and how it relates to certain animal welfare offences and, contingent on this review, to make them priority offences under the regulatory framework.
I am grateful for this debate on the important issue of protecting against animal cruelty online; all of us in this House agree on the importance of doing so. As the House has discussed previously, this Government are committed to strong animal welfare standards and protections. In this spirit, the Government recognise the psychological harm that animal cruelty content can cause to children online. That is why we tabled an amendment that lists content that depicts real or realistic serious violence or injury against an animal, including by fictional creatures, as priority content that is harmful to children. This was debated on the first day of Report.
In addition, all services will need proactively to tackle illegal animal cruelty content where this amounts to an existing offence such as extreme pornography. User-to-user services will be required swiftly to remove other illegal content that targets an individual victim once made aware of its presence.
The noble Baroness asked about timing. We feel it is important to understand how the protections against harm to animals already captured in the Bill will function before committing to the specific remedy proposed in the amendments.
As discussed in Committee, the Bill’s focus is rightly on ensuring that humans, in particular children, are protected online, which is why we have not listed animal offences in Schedule 7. As many have observed, this Bill cannot fix every problem associated with the internet. While we recognise the psychological harm that can be caused to adults by seeing this type of content, listing animal offences in Schedule 7 is likely to dilute providers’ resources away from protecting humans online, which is the Bill’s main purpose.
However, I understand the importance of taking action on animal mistreatment when committed online, and I am sympathetic to the intention of these amendments. As discussed with the noble Baroness, Defra is confident that prosecutions for acts of animal torture committed online in the UK can successfully be brought under the Animal Welfare Act 2006 and its devolved equivalents. These Acts do not, however, cover acts of cruelty that take place outside the UK. I know from the discussion we have had in this House that there are real concerns that the Animal Welfare Act 2006 cannot tackle cross-border content, so I wish to make a further commitment today.
The Government have already committed to consider further how the criminal law can best protect individuals from harmful communications, alongside other communications offences, as part of changes made in the other place. To that end, we commit to include the harm caused by animal mistreatment communications as part of this assessment. This will then provide a basis for the Secretary of State to consider whether this offence should be added to Schedule 7 to the Bill via the powers in Clause 198. This work will commence shortly, and I am confident that this, in combination with listing animal cruelty content as a priority harm to children, will safeguard users from this type of content online.
For the reasons set out, I hope the noble Baroness and the noble Lord will consider not pressing their amendments.
My Lords, these seem very sensible amendments. I am curious about why they have arrived only at this stage, given that this was a known problem and that the Bill has been drafted over a long period. I am genuinely curious as to why this issue has been raised only now.
On the substance of the amendments, it seems entirely sensible that, given that we are now going to have 20,000 to 25,000 regulated entities in scope, some of which will never have encountered child sexual exploitation or abuse material or understood that they have a legal duty in relation to it, it will be helpful for them to have a clear set of regulations that tell them how to treat their material.
Child sexual exploitation or abuse material is toxic in both a moral and a legal sense. It needs to be treated almost literally as toxic material inside a company, and sometimes that is not well understood. People feel that they can forward material to someone else, not understanding that in doing so they will break the law. I have had experiences where well-meaning people acting in a vigilante capacity sent material to me, and at that point you have to report them to the police. There are no ifs or buts: they have committed an offence in doing so. If you work inside a company, your computer has to be quarantined, taken away and cleaned, just as it would be for any other toxic material, because we framed the law, quite correctly, to say that we do not want to offer people the defence of saying “I was forwarding this material because I’m a good guy”. Forwarding the material is a strict liability offence, so to have regulations that explain, particularly to organisations that have never dealt with this material, exactly how they have to deal with it in order to be legally compliant will be extremely helpful.
One thing I want to flag is that there are going to be some really fundamental cross-border issues that have to be addressed. In many instances of child sexual exploitation or abuse material, the material has been shared between people in different jurisdictions. The provider may not be in a UK jurisdiction, and we have got to avoid any conflicts of laws. I am sure the Government are thinking about this, but in drafting those regulations, what we cannot do, for example, is order a provider to retain data in a way that would be illegal in the jurisdiction from which it originates or in which it has its headquarters. The same would apply vice versa. We would not expect a foreign Government to order a UK company to act in a way that was against UK law in dealing with child sexual exploitation or abuse material. This all has to be worked out. I hope the Government are conscious of that.
I think the public interest is best served if the United Kingdom, the United States and the European Union, in particular, adopt common standards around this. I do not think there is anything between us in terms of how we would want to approach child sexual exploitation or abuse material, so the extent to which we end up having common legal standards will be extraordinarily helpful.
As a general matter, to have regulations that help companies with their compliance is going to be very helpful. I am curious as to how we have got there with the amendment only at this very late stage.
My Lords, I rise to make a slightly lesser point, but I also welcome these amendments. I want to ask the Minister where the consultation piece of this will lie and to check that all the people who have been in this space for many years will be consulted.
My Lords, as ever, my noble friend Lord Allan and the noble Baroness, Lady Kidron, have made helpful, practical and operational points that I hope the Minister will be able to answer. In fact, the first half of my noble friend’s speech was really a speech that the Minister himself could have given in welcoming the amendment, which we do on these Benches.
Online Safety Bill Debate
(1 year, 5 months ago)
Lords Chamber
My Lords, I shall speak to my Amendment 275A in this group. It would place a duty on Ofcom to report annually on areas where our legal codes need clarification and revision to remain up to date as new technologies emerge—including technologies that we have not even thought of yet.
Government Amendments 206 and 209 illustrate the need for such an amendment and how it would operate, as they clarify that references to pornographic content in the Bill include content created by a bot. However, emerging technologies will need constant scrutiny.
As the noble Lord, Lord Clement-Jones, asked, what about provider content, which forms the background to the user interaction and may include many harms? For example, would a game backdrop that includes anti-Semitic slurs, a concentration camp, a sex shop or a Ku Klux Klan rally be caught by the Bill?
The Minister confirmed that “content” refers to anything communicated by means of an internet service and that an encounter includes any content that individuals read, view, hear or otherwise experience, making providers liable for the content that they publish. Is that liability under civil, regulatory or criminal law?
As Schedule 1 goes to some lengths to exempt some service-to-provider content, can the Minister, for the record, provide chapter and verse, as requested by the noble Lord, Lord Clement-Jones, on provider liability and, in particular, confirm whether such content would be dealt with by the Part 3 duties under the online safety regime, or whether users would have to rely on existing law and pursue claims at their own expense through the courts, or the police would have to carry the burden of further enforcement?
Last week, the Minister confirmed that “functionality” captures any feature enabling interactions of any description between service users, but are avatars or objects created by the provider of a service, not by an individual user, in scope and therefore subject to risk assessments and their mitigation requirements? If so, will these functionalities also be added to user empowerment tools, enabling users to opt out of exposure to them, or will they be caught only by child safety duties? Are environments provided by a service provider, such as a backdrop to immersive environments, in scope through the definition of “functionality”, “content” or both? When this is provider content and not user-generated content, will this still hold true?
All this points to a deeper issue. Internet services have become more complex and vivid, with extremely realistic avatars and objects indistinguishable from people and objects in the real world. This amendment avoids focusing on negatives associated with AI and new technologies but tries to ensure that the online world is as safe as the offline world should be. It is worth noting that Interpol is already investigating how to deal with criminals in the metaverse and anticipating crimes against children, data theft, money laundering, fraud and counterfeit, ransomware, phishing, sexual assault and harassment, among other things. Many of these behaviours operate in grey areas of the law where it is not clear whether legal definitions extend to the metaverse.
Ofcom has an enormous task ahead, but it is best placed to consider the law’s relationship to new technological developments and to inform Parliament. Updating our laws through the mechanisms proposed in Amendment 275A will provide clarity to the courts, judges, police and prosecution service. I urge the Minister to provide as full an answer as possible to the many questions I have posed. I am grateful to him for all the work he has been doing. If he cannot accept my amendment as worded, will he provide an assurance that he will return to this with a government amendment at Third Reading?
My Lords, I will speak to Amendment 191A in my name. I also support Amendment 186A in the name of the noble Lord, Lord Moylan, Amendment 253 in the name of the noble Lord, Lord Clement-Jones, and Amendment 275A in the name of my noble friend Lady Finlay. I hope that my words will provide a certain level of reassurance to the noble Lord, Lord Moylan.
In Committee and on Report, the question was raised as to how to support the coronial system with information, education and professional development to keep pace with the impact of the fast-changing digital world. I very much welcome the Chief Coroner’s commitment to professional development for coroners but, as the Minister said, this is subject to funding. While it is right that the duty falls to the Chief Coroner to honour the independence and expert knowledge associated with his roles, this amendment seeks to support his duties with written guidance from Ofcom, which has no such funding issue since its work will be supported by a levy on regulated companies—a levy that I argue could usefully and desirably contribute to the new duties that benefit coroners and bereaved parents.
The role of a coroner is fundamental. They must know what preliminary questions to ask and how to triage the possibility that a child’s digital life is relevant. They must know that Ofcom is there as a resource and ally and how to activate its powers and support. They must know what to ask Ofcom for, how to analyse information they receive and what follow-up questions might be needed. Importantly, they must feel confident in making a determination and describing the way in which the use of a regulated service has contributed to a child’s death, in the case that that is indeed their finding. They must be able to identify learnings that might prevent similar tragedies happening in the future. Moreover, much of the research and information that Ofcom will gather in the course of its other duties could be usefully directed at coroners. All Amendment 191A would do is add to the list of reports that Ofcom has to produce with these issues in mind. In doing so, it would do the Chief Coroner the service of contributing to his own needs and plans for professional development.
I turn to Amendment 186A in the name of the noble Lord, Lord Moylan, who makes a very significant point in bringing it forward. Enormous effort goes into creating an aura of exceptionality for the tech sector, allowing it to avoid laws and regulations that routinely apply to other sectors. These are businesses that benefit from our laws, such as intellectual copyright or international tax law. However, they have negotiated a privileged position in which they have privatised the benefits of our attention and data while outsourcing most of the costs of their service to the public purse or, indeed, their users.
Terms and conditions are a way in which a company enters into a clear agreement with its users, who then “pay” for access with their attention and their data: two of the most valuable commodities in today’s digital society. I am very sympathetic to the noble Lord’s wish to move people, both adults and children, away from the series of euphemisms that the sector employs—such as “users”, “community members”, “creators” or “participants”—and to acknowledge their status as consumers who have rights and, in particular, the right to expect the product they use to be safe and for providers to be held accountable if it is not. I join the noble Lord in noting that there are now six weeks before Third Reading. This is a very valuable suggestion that is worthy of government attention.
Amendment 253 in the name of the noble Lord, Lord Clement-Jones, puts forward a very strong recommendation of the pre-legislative committee. We were a bit bewildered and surprised that it was not taken up at the time, so I will be interested to hear what argument the Minister makes to exclude it, if indeed he does so. I say to him that I have already experienced the frustration of being bumped from one regulator to another. Although my time as an individual, or the organisational time of a charity, is minor in the picture we are discussing, it is costly in time and resources; my real concern is the time, resources and potential effectiveness of the regulatory regime itself. However well oiled and well funded the regulatory regime of the Online Safety Bill is, I do not think it will be as well oiled and well funded as those that it seeks to regulate.
I make it clear that I accept the arguments for not wanting to create a super-regulator or to slow down or confuse existing regulators, each of which has its own responsibilities, but I feel that the noble Lord, Lord Clement-Jones, has taken more of a belt-and-braces approach than a wholesale realignment of regulators. He simply seeks to make it explicit that regulators can, should and do have a legal basis on which to work singularly or together when it suits them. As I indicated earlier, I cannot quite understand why that would not be desirable.
Finally, in what is truly a miscellaneous group, I will refer to the amendment in the name of my noble friend Lady Finlay. I support the intent of this amendment and sincerely hope that the Minister will be able to reassure us that this is already in the Bill and will be done by Ofcom under one duty or another. I hope that he will be able to point to something that includes this. I thank my noble friend for raising it, as it harks back to an amendment in Committee in my name that sought to establish that content deemed harmful in one format would be deemed harmful in all formats—whether synthetic, such as AI, the metaverse or augmented reality. As my noble friend alluded to, it also speaks to the debate we had last week in relation to the amendment from the noble Lord, Lord Clement-Jones, about provider content in the metaverse.
My Lords, let me add to this miscellany by speaking to the government amendments that stand in my name as part of this group. The first is Amendment 288A, which we mentioned on the first group of amendments on Report because it relates to the new introductory clause, Clause 1, and responds to the points raised by the noble Lord, Lord Stevenson of Balmacara. I am very happy to say again that the Government recognise that people with multiple and combined characteristics suffer disproportionately online and are often at greater risk of harm. This amendment therefore adds a provision in the new interpretation clause, Clause 1, to put beyond doubt that all the references to people with “a certain characteristic” throughout the Bill include people with a combination of characteristics. We had a good debate about the Interpretation Act 1978, which sets that out, but we are happy to set it out clearly here.
In his Amendment 186A, my noble friend Lord Moylan seeks to clarify a broader issue relating to consumer rights and online platforms. He got some general support—certainly gratitude—for raising this issue, although there was a bit of a Committee-style airing of it and a mixture of views on whether this is the right way or the right place. The amendment seeks to make it clear that certain protections for consumers in the Consumer Rights Act 2015 apply when people use online services and do not pay for them but rather give up their personal data in exchange. The Government are aware that the application of the law in that area is not always clear in relation to free digital services and, like many noble Lords, express our gratitude to my noble friend for highlighting the issue through his amendment.
We do not think that the Bill is the right vehicle for attempting to provide clarification on this point, however. We share some of the cautions that the noble Lord, Lord Allan of Hallam, raised and agree with my noble friend Lady Harding of Winscombe that this is part of a broader question about consumer rights online beyond the services with which the Bill is principally concerned. It could be preferable that the principle that my noble friend Lord Moylan seeks to establish through his amendment should apply more widely than merely to category 1 services regulated under the Bill. I assure him that the Bill will create a number of duties on providers which will benefit users and clarify that they have existing rights of action in the courts. We discussed these new protections in depth in Committee and earlier on Report. He drew attention to Clause 65(1), which puts a requirement on all services, not just category 1 services, to include clear and accessible provisions in their terms of service informing users about their right to bring a claim for breach of contract. Therefore, while we are grateful, we agree with noble Lords who suggested that this is a debate for another day and another Bill.
Amendment 191A from the noble Baroness, Lady Kidron, would require Ofcom to issue guidance for coroners and procurators fiscal to aid them in submitting requests to Ofcom to exercise its power to obtain information from providers about the use of a service by a deceased child. While I am sympathetic to her intention, I do not think that her amendment is the right answer. It would be inappropriate for an agency of the Executive to issue guidance to a branch of the judiciary. As I explained in Committee, it is for the Chief Coroner to provide detailed guidance to coroners. This is written to assist coroners with the law and their legal duties and to provide commentary and advice on policy and practice.
The amendment tabled by the noble Baroness cuts across the role of the Chief Coroner and risks compromising the judicial independence of the coroner, as set out in the Constitutional Reform Act 2005. As she is aware, the Chief Coroner has agreed to consider issuing guidance to coroners on social media and to consider the issues covered in the Bill. He has also agreed to explore whether coroners would benefit from additional training, with the offer of consultation with experts including Ofcom and the Information Commissioner’s Office. I suggest that the better approach would be for Ofcom and the Information Commissioner’s Office to support the Chief Coroner in his consideration of these issues where he would find that helpful.
I agree with the noble Lord, Lord Allan, that coroners must have access to online safety expertise given the technical and fast-moving nature of this sector. As we have discussed previously, Amendment 273 gives Ofcom a power to produce a report dealing with matters relevant to an investigation or inquest following a request from a coroner which will provide that expertise. I hope that this reassures the noble Baroness.
I understand the report on a specific death, which is very welcome and part of the regime as we all see it. The very long list of things that the coroner may not know that they do not know, as I set out in the amendment, is the issue which I and other noble Lords are concerned about. If the Government could find a way to make that possible, I would be very grateful.
We are keen to ensure that coroners have access to the information and expertise that they need, while respecting the independence of the judicial process and the role of the Chief Coroner; it is for coroners to decide what they do not know and would like to know more about. It is a point that I have discussed a lot with the noble Baroness and with my noble friend Lady Newlove in her former role as Victims’ Commissioner. I am very happy to continue doing so, because it is important that there is access to that expertise.
The noble Lord, Lord Stevenson, spoke to the amendments tabled by the noble Baroness, Lady Merron, about supposedly gendered language in relation to Clauses 141 and 157. As I made clear in Committee, I appreciate the intention—as does Lady Deben—of making clear that a person of either sex can perform the role of chairman, just as they can perform the role of ombudsman. We have discussed in Committee the semantic point there. The Government have used “chairman” here to be consistent with terminology in the Office of Communications Act 2002. I appreciate that this predates the Written Ministerial Statement which the noble Lord cited, but that itself made clear that the Government at the time recognised that in practice, parliamentary counsel would need to adopt a flexible approach to this change—for example, in at least some of the cases where existing legislation originally drafted in the former style is being amended.
The noble Lord may be aware of a further Written Ministerial Statement, made on 23 May last year, following our debates on gendered language on another Bill, when the then Lord President of the Council and Leader of the House of Commons said that the Office of the Parliamentary Counsel would update its drafting guidance in light of that. That guidance is still forthcoming. However, importantly, the term here will have no bearing on Ofcom’s decision-making on who would chair the advisory committees. It must establish that this could indeed be a person of either sex.
Amendment 253 seeks to enable co-operation, particularly via information-sharing, between Ofcom and other regulators within the UK. I reassure noble Lords that Section 393 of the Communications Act 2003 already includes provisions for sharing information between Ofcom and other regulators in the UK.
As has been noted, Ofcom already co-operates effectively with other domestic regulators. That has been strengthened by the establishment of the Digital Regulation Co-operation Forum. By promoting greater coherence, the forum helps to resolve potential tensions, offering clarity for people and the industry. It ensures collaborative work across areas of common interest to address complex problems. Its outputs have already delivered real and wide-ranging impacts, including landmark policy statements clarifying the interactions between digital regulatory regimes, research into cross-cutting issues, and horizon-scanning activities on new regulatory challenges. We will continue to assess how best to support collaboration between digital regulators and to ensure that their approaches are joined up. We therefore do not think that Amendment 253 is necessary.
My Lords, before I talk to the amendments I had intended to address, I will make a very narrow point in support of the noble Baroness, Lady Fraser. About 10 years ago, when I started doing work on children, I approached Ofcom and asked why all its research goes up to age 24, when childhood finishes at 18 and the UNCRC says that a child needs special consideration. Ofcom said, “Terribly sorry, but this is our inheritance from a marketing background”. The Communications and Digital Committee later wrote formally to Ofcom and asked if it could do its research up to 18 and then from 18 to 24, but it appeared to be absolutely impossible. I regret that I do not know what the current situation is, and I hope that, with the noble Lord, Lord Grade, in place, it may change overnight. My point is that the detailed description that the noble Baroness gave the House of why it is important to stipulate this is proven by that tale.
I also associate myself with the remarks of the noble Lord, Lord Allan, who terrified me some 50 minutes ago. I look forward to hearing what will be said.
I in fact rose to speak to government Amendments 196 and 199, and the group of amendments on access to data for researchers. I welcome the government amendments to which I added my name. I really am delighted every time the Government inch forward on the transparency of systemic and design matters. The focus of the Bill should always be on the key factor that separates digital media from other forms of media: the power to determine, manipulate and orchestrate what a user does next, what they see, how they behave and what they think. That is very different and is unique to the technology we are talking about.
It will not surprise the Minister to hear that I would have liked this amendment to cover the design of systems and processes, and features and functionalities that are not related to content. Rather than labouring this point, on this occasion I will just draw the Minister’s attention to an article published over the weekend by Professor Henrietta Bowden-Jones, the UK’s foremost expert on gambling and gaming addiction. She equates the systems and processes that prime behaviours on social media with the more extreme behaviours that she sees, in ever younger children, in her addiction clinics. Professor Bowden-Jones is the spokesperson on behavioural addictions for the Royal College of Psychiatrists, and we ignore her experience of the loops of reward and compulsion that manipulate behaviour, particularly the behaviour of children, at our peril.
I commend the noble Lord, Lord Bethell, for continuing to press the research issue and coming back, even in the light of the government amendment, with a little more. Access to good data about the operation of social media is vital in holding regulated companies to account, tracking the extent of harms, building an understanding of them and, importantly, building knowledge about how they might be sensibly and effectively addressed.
The benefit of having a period of time between the last day of Report on Wednesday and Third Reading is that it gives the Minister, the Bill team and parliamentary counsel time to reflect on the kind of power that could be devised. The wording could be drafted, perhaps in a general way, and I would have thought that six weeks would be quite adequate for that. After all, this is not a power that is going to be used immediately; it is a general power that could be brought into effect by regulation. Surely it is not beyond the wit of those involved to devise something suitable.
Sit down or stand up—I cannot remember.
I wonder whether the department has looked at the DSA—the EU’s Digital Services Act—and other situations where this is being worked out. I recognise that it takes a period of time, but it is not without precedent that a pathway should be described.
We do not think that six weeks is enough time for the evidence base to develop sufficiently, our assessment being that to endow the Secretary of State with that power at this point is premature.
Amendment 262AA would require Ofcom to consider whether it is appropriate to require providers to take steps to comply with Ofcom’s researcher access guidance when including a requirement to take steps in a confirmation decision. This would be inappropriate because the researcher access provisions are not enforceable requirements; as such, compliance with them should not be subject to enforcement by the regulator. Furthermore, enforcement action may relate to a wide variety of very important issues, and the steps needed should be sufficient to address a failure to comply with an enforceable requirement. Singling out compliance with researcher access guidance alone risks implying that this will be adequate to address core failures.
Amendment 272AB would require Ofcom to give consideration to whether greater access to data could be achieved through legal requirements or incentives for regulated services. I reassure noble Lords that the scope of Ofcom’s report will already cover how greater access to data could be achieved, including through enforceable requirements on providers.
Amendment 272E would require Ofcom to take a provider’s compliance with Ofcom’s guidance on researcher access to data into account when assessing risks from regulated services and determining whether to take enforcement action and what enforcement action to take. However, we do not believe that this is a relevant factor for consideration of these issues. I hope noble Lords will agree that whether or not a company has enabled researcher access to its data should not be a mitigating factor against Ofcom requiring companies to deal with terrorism or child sexual exploitation or abuse content, for example.
On my noble friend Lord Bethell’s remaining Amendments 272BA, 273A and 273B, the first of these would require Ofcom to publish its report on researchers’ access to information within six months. While six months would not be deliverable given other priorities and the complexity of this issue, the government amendment to which I have spoken would reduce the timelines from two years to 18 months. That recognises the importance of the issue while ensuring that Ofcom can deliver the key priorities in establishing the core parts of the regulatory framework; for example, the illegal content and child safety duties.
My Lords, I will speak to the government Amendments 274B and 274C. I truly welcome a more detailed approach to Ofcom’s duties in relation to media literacy. However, as is my theme today, I raise two frustrations. First, having spent weeks telling us that it is impossible to include harms that go beyond content, and having opposed amendments on that point, the Government have included in their media literacy strategy a duty to help users understand the harmful ways in which regulated services may be used, in addition to understanding the nature and impact of harmful content. This appears to suggest that it is the users who are guilty of misusing products and services, rather than putting any emphasis on the design or processes that determine how a service is most often used.
I believe that all of us, including children, are participants in creating an online culture, and that educating and empowering users of services is essential. However, it should not be a substitute for designing a service that is safe by design and default. To make my point absolutely clear, I recount the findings of researchers who undertook workshops in 28 countries with more than 1,000 children. The researchers were at first surprised to find that, whether in Kigali, São Paulo or Berlin, children overwhelmingly identified the same problems online—harmful content, addiction, lack of privacy and so on. The children’s circumstances were vastly different—country and town, Africa and the global north, et cetera—but on further analysis the researchers realised that the children had such similar experiences because they were using the same products. The products were more determining of the outcome than anything to do with religion, education, status, age, family or even country. The only other factor that loomed large—which, I admit, the Government have recognised—was gender. Those were the two most crucial findings. It is an abdication of adult responsibility to place the onus on children to keep themselves safe. The amendment and the Bill, as I keep mentioning, should focus on the role of design, not on how a child uses a service.
My second point is of a similar nature. I am very concerned that a lot of digital literacy—for adults as well as children, though my particular concern is in schools—is provided by the tech companies themselves. Their responsibility, and their role in the systems and processes that shape what children encounter through reward loops, algorithms and so on, is therefore once again very low down the agenda. Is it possible, at this late stage, to consider that Ofcom might have a responsibility to consider system design as part of its literacy review?
My Lords, this has been a very interesting short debate. Like other noble Lords, I am very pleased that the Government have proposed the new clauses in Amendments 274B and 274C. The noble Baroness, Lady Bull, described absolutely the importance of media literacy, particularly for disabled and vulnerable people. It is important also not to fall into the trap described by the noble Baroness, Lady Kidron, of saying, “You are a child or a vulnerable person. You must acquire media literacy—it’s your obligation; it’s not the obligation of the platforms to design their services appropriately”. I take that point, but it does not mean that media literacy is not extraordinarily important.
However, sadly, I do not believe that the Government’s new media literacy amendments are as broad as the original draft Bill. The draft Bill contained a completely new and upgraded set of duties right across the board, replacing Section 11 of the Communications Act and, in a sense, fit for the modern age. The Government have instead created a much narrower media literacy duty, which relates only to regulated services. That is not optimal. We need something broader, which places a wider duty for the future on Ofcom.
It is also deficient in two respects. The noble Lord, Lord Knight, will speak to his amendments, but it struck me immediately, on looking at the proposed new clause, that it was missing all the debate about functionalities and design that the noble Baroness, Lady Kidron, raised the other day. We must ensure that media literacy encompasses an understanding of the underlying functionalities and systems of the platforms we are talking about.
I know that your Lordships will be very excited to hear that I am going to refer again to the Joint Committee. I know that the Minister has read us from cover to cover, but at paragraph 381 of our report on the draft Bill we said, and it is still evergreen:
“If the Government wishes to improve the UK’s media literacy to reduce online harms, there must be provisions in the Bill to ensure media literacy initiatives are of a high standard. The Bill should empower Ofcom to set minimum standards for media literacy initiatives that both guide providers and ensure the information they are disseminating aligns with the goal of reducing online harm”.
I had a very close look at the clause and could not see that Ofcom is entitled to set minimum standards. The media literacy provisions are sadly deficient in that respect.
My Lords, it is all quite exciting now, is it not? I can say “hear, hear!” a lot; everyone is talking about freedom of expression. I cannot tell noble Lords how relieved and pleased I was both to hear the speeches and to see Amendment 228 from the noble Lord, Lord Allan of Hallam, and the noble Viscount, Lord Colville of Culross, who both explained well why this is so important. I am so glad that, even late in our discussions on Report, it has returned as an important issue.
We have already discussed how, in many cases, especially when it comes to what is seen as illegal speech, decisions about illegality are very complicated. They are complicated in the law courts and offline, even with the full power of lawyers and the criminal justice system trying to make those decisions. Leaving the decision to people who work in a social media company and who, through no fault of their own, are not qualified to make it—in a climate of quite onerous obligations, and with phrases such as “reasonable grounds to infer”—will lead to lawful expression being over-moderated. Ultimately, online platforms will act with an abundance of caution, which will lead to a lot of important speech—perfectly lawful, if not always worthy, speech; the public’s speech and their ability to speak freely—being removed. That is not a trivial side issue; it will discredit the Bill, if it has not done so already.
Whenever noble Lords make contributions about why a wide range of amendments and changes are needed—particularly in relation to protecting children, harm and so on—they constantly tell us that the Bill should send an uncompromising message. The difficulty I have is with the danger that the Bill will send an uncompromising message that freedom of expression is not important. I urge the Minister to look carefully at the amendment, because the message should be that, while the Bill is trying to tackle online harm and to protect children in particular—which I have never argued against—huge swathes of it might inadvertently silence people and deprive them of the right to information that they should be able to have.
My Amendment 229—I am not sure why it is in this group, but that is nothing new in the way the groupings have worked—is about lawful speech and what content is filtered by users. I have already argued for the replacement of the old legal but harmful duty, and the new duty of user empowerment is welcome: at face value, it puts users in the driving seat and allows adults to judge for themselves what they want and do not want to see. But—and it is a large but—that will work only if users and providers agree about when content should be filtered and what content is filtered.
As with all decisions on speech, as I have just mentioned—particularly in a heightened climate of confusion and sensitivity regarding identity politics and the cancel-culture issues with which we are all familiar—there are some problems with the way things stand in the Bill. I hope I am using the term “reasonable grounds to infer” in a better way than it is used in relation to illegality. My amendment specifies that companies need to have reasonable grounds to infer that content is abusive or incites hatred when filtering out content with those user empowerment tools. Where a user chooses to filter out hateful content based on race, on being a woman or whatever, the filter should catch only content that genuinely falls under those headings. There is a risk that, without this amendment, the technologies or individuals working for companies could be heavy-handed in filtering out legitimate content.
I shall give a couple of examples. Say that someone chooses to filter out abusive content targeting the protected characteristic of race. I imagine that they would have a reasonable expectation that the filter would target aggressive, unpleasant content that demeans a person because of their race. But does the provider agree? Will it interpret my filtering choice as a user in the most restrictive way possible, in a bid to protect my safety, or will it treat my sensibilities as having a low threshold for what might be considered abuse?
The race issue illustrates where we get into difficulties. Will the filterers take their cue from the document compiled by the Diocese of St Edmundsbury and Ipswich, which the anti-racist campaigning group Don’t Divide Us has just revealed and which is being used in 87 schools? Under its heading of racism, “passive racism” includes agreeing that “there are two sides to every story”, denying white privilege, or starting a sentence with “Not all white people”. “Veiled racism” in this document—which, as I say, is being used in schools by the Church of England—includes a “Euro-centric curriculum” and “cultural appropriation”. “Racist discrimination” includes “anti-immigration policies”, which, as I pointed out before, would suggest that some people would call the Government’s own Bill racist.
The reason why I mention this is that you might think, “I am going to have racism filtered out”, but with too much caution you will also have filtered out perfectly legitimate discussions of immigration and cultural appropriation. You will be protected; but if, for example, the filterer follows certain universities that have deemed the novels of Walter Scott, the plays of William Shakespeare or Enid Blyton’s writing racist, you can see that we have some real problems. When universities have found misogynistic bullying and sexual abuse in “The Great Gatsby” and Ovid’s “Metamorphoses”, I just want to make sure that we do not end up with oversensitivity on the part of the filterers. Perhaps the filtering will be done by algorithm, machine learning and artificial intelligence, but the EHRC has noted that algorithms simply cannot cope with the context, cultural difference and complexity of language within the billions of items of content produced every day.
Amendment 229 ensures that there is a common standard—a standard of objective reasonableness. It is not perfect at all; I understand that reasonableness is itself open to interpretation. However, it is an attempt to make the Government’s concept of user empowerment feasible, by at least aspiring to a basic shared understanding between users and providers as to what will be filtered and what will not, and a check against providers’ filter mechanisms removing controversial or unpopular content in the name of protecting users. As I indicated in terms of sending a message, if the Government could indicate to the companies that, rather than taking a risk-averse attitude, they had to bear in mind freedom of expression and not be oversensitive or overcautious, we might begin to get some balance. Otherwise, an awful lot of lawful material that is not even harmful will be removed.
My Lords, I support Amendment 228; I spoke on this issue to the longer amendment in Committee. To decide whether something is illegal at high volume and high speed, without the entire apparatus of the justice system, in which a great deal of care is taken to decide whether something is illegal, is very worrying. It strikes me as amusing, because someone commented earlier that they like a “must” instead of a “maybe”. In this case, where the Bill says that a provider should treat the content as content of the kind in question accordingly, I caution that something a little softer is needed—not a cliff edge that ends up in horrors around illegality, where someone who has acted in self-defence is accused of a crime of violence, as happens to many women, and so on and so forth. I do not want to labour the point; I just urge a gentle landing rather than, as it is written, a cliff edge.
My Lords, this has been a very interesting debate. Beyond peradventure, my noble friend Lord Allan, the noble Viscount, Lord Colville, and the noble Baroness, Lady Fox, have demonstrated powerfully the perils of this clause. “Lawyers’ caution” is one of my noble friend’s messages to take away, as are the complexities of making these judgments. It was interesting when he mentioned the sharing of certain forms of content for awareness’s sake, and the judgments that platforms must take. His phrase “If in doubt, take it out” is pretty chilling in free speech terms—I think it will come back to haunt us. As the noble Baroness, Lady Fox, said, this clause delivers the wrong message. It is important to have some element of discretion here and not, as the noble Baroness, Lady Kidron, said, a cliff edge. We need a gentler landing, and I very much hope that the Minister will land more gently.
My Lords, I pay tribute to the noble Baroness, Lady Harding, for her role in bringing this issue forward. I too welcome the government amendments. It is important to underline that adding the potential role of app stores to the Bill is neither an opportunity for other companies to fail to comply and wait for the gatekeepers to do the job nor a one-stop shop in itself. It is worth reminding ourselves that digital journeys rarely start and finish in one place. In spite of the incredible war for our attention, in which products and services attempt to keep us rapt on a single platform, it is quite important for everyone in the ecosystem to play their part.
I have two minor points. First, I was not entirely sure why the government amendment requires the Secretary of State, rather than Ofcom, to consult. Can the Minister reassure me that, whoever undertakes the consultation, it will include children and children’s organisations as well as tech companies? Secondly, like the noble Baroness, Lady Harding, I was a little surprised that the amendment does not define an app store but relies on “the ordinary meaning of” the term, which seems to leave room for change. If there is a good reason for that—I am sure there is—it must be made clear that app stores cannot suddenly rebrand as something else, and that the gatekeeper function will be kept absolutely front and centre.
Notwithstanding those comments, and associating myself with the idea that nothing should wait until 2025-26, I am very grateful to the Government for bringing this forward.
My Lords, I will make a brief contribution because I was the misery guts when this was proposed first time round. I congratulate the noble Baroness, Lady Harding, not just on working with colleagues to come up with a really good solution but on seeking me out. If I heard someone be as miserable as I was, I might try to avoid them. She did not; she came and asked me, “Why are you miserable? What is the problem here?”, and took steps to address it. Through her work with the Government, their amendments address my main concerns.
My first concern, as we discussed in Committee, was that we would be asking large companies to regulate their competitors, because the app stores are run by large tech companies. She certainly understood that concern. The second was that I felt we had not necessarily yet clearly defined the problem. There are lots of problems. Before you can come up with a solution, you need a real consensus on what problem you are trying to address. The government amendment will very much help in saying, “Let’s get really crunchy about the actual problem that we need app stores to address”.
Finally, I am a glass-half-full kind of guy as well as a misery guts—there is a contradiction there—and so I genuinely think that these large tech businesses will start to change their behaviour and address some of the concerns, such as getting age ratings correct, just by virtue of our having this regulatory framework in place. Even if today the app stores are technically outside, the fact that the sector is inside and that this amendment tells them that they are on notice will, I think and hope, have a hugely positive effect and we will get the benefits much more quickly than the timescale envisaged in the Bill. That feels like a true backstop. I sincerely hope that the people in those companies, who I am sure will be glued to our debate, will be thinking that they need to get their act together much more quickly. It is better for them to do it themselves than wait for someone to do it to them.
Online Safety Bill Debate
(1 year, 3 months ago)
Lords Chamber

My Lords, I want to thank the Minister and other noble colleagues for such kind words. I really appreciate it.
I want to say very little. It has been an absolute privilege to work with people across both Houses on this. It is not every day that one keeps the faith in the system, but this has been a great pleasure. In these few moments that I am standing, I want to pay tribute to the bereaved parents, the children’s coalition, the NSPCC, my colleagues at 5Rights, Barnardo’s, and the other people out there who listen and care passionately that we get this right. I am not going to go through what we got right and wrong, but I think we got more right than we got wrong, and I invite the Minister to sit with me on Monday in the Gallery to make sure that those last little bits go right—because I will be there. I also remind the House that we have some work in the data Bill vis-à-vis the bereaved parents.
In all the thanks—and I really feel that I have had tremendous support on my area of this Bill—I pay particular tribute to the noble Baroness, Lady Benjamin. She was there before many people were, and she suffered cruelly in the legislative system. Our big job now is to support Ofcom, hold it to account and help it in its task, which is Herculean. I really thank everyone who has supported me through this.
My Lords, I am sure that your Lordships would not want the Bill to pass without hearing some squeak of protest and dissent from those of us who have spent so many days and weeks arguing for the interests of privacy and free speech, to which the Bill remains a very serious and major threat.
Before I come to those remarks, I associate myself with what other noble Lords have said about what a privilege it has been, for me personally and for many of us, to participate over so many days and weeks in what has been the House of Lords at its deliberative best. I almost wrote down that we have conducted ourselves like an academic seminar, but when you think about what most academic seminars are like—with endless PowerPoint slides and people shuttling around, and no spontaneity whatever—we exceeded that by far. The conversational tone that we had in the discussions, and the way in which people who did not agree were able to engage—indeed, friendships were made—meant that the whole thing was done with a great deal of respect, even for those of us who were in the small minority. At this point, I should perhaps say on behalf of the noble Baroness, Lady Fox of Buckley, who participated fully in all stages of the Bill, that she deeply regrets that she cannot be in her place today.
I am not going to single out anybody except for one person. I made the rather frivolous proposal in Committee that all our debates should begin with the noble Lord, Lord Allan of Hallam; we learned so much from every contribution he made that he really should have kicked them all off. We would all have been a great deal more intelligent about what we were saying, and understood it better, had we heard what he had to say. I certainly have learned a great deal from him, and that was very good.
I will raise two issues only that remain outstanding and are not assuaged by the very odd remarks made by my noble friend as he moved the Third Reading. The first concerns encryption. The fact of the matter is that everybody knows you cannot do what Ofcom is empowered by the Bill to do without breaching end-to-end encryption. It is as simple as that. My noble friend may say that that is not the Government’s intention and that platforms cannot be forced to do it if the technology is not there—none of that is in the Bill, by the way—but saying so at the Dispatch Box does not address the fact that end-to-end encryption will be breached if Ofcom finds a way of doing what the Bill empowers it to do. So why have we empowered it to do that? How do we envisage that Ofcom will reconcile circumstances where platforms say that they have used their best endeavours to do something and Ofcom simply does not believe that they have? It might, of course, end up in the courts. But the crucial point is that that decision, which affects so many people—and so many people nowadays regard privacy in their communications as a right—might be made by Ofcom or by the courts, but it will not be made in this Parliament. We have given it away to an unaccountable process, and democracy has been taken out of it. In my view, that is a great shame.
I come back to my second issue—I will not be very long. I constantly ask about Wikipedia. Is Wikipedia in scope of the Bill? If it is, is it going to have to do prior checking of what is posted? That would destroy its business model and make many minority language sites—I instanced Welsh—totally unviable. My noble friend said at the Dispatch Box that, in his opinion, Wikipedia was not going to be in scope of the Bill. But when I asked why we could not put that in the Bill, he said it was not for him to decide whether it was in scope and that the Government had set up this wonderful structure whereby Ofcom will tell us whether it is—almost without appeal, and again without any real democratic scrutiny. Oh yes, and we might have a Select Committee, which might write a very good, highly regarded report, which might be debated some time within the ensuing 12 months on the Floor of your Lordships’ House. However, we will have no say in that matter; we have given it away.
I said at an earlier stage of the Bill that, for privacy and censorship, this represents the closest thing to a move back to the Lord Chamberlain and Lady Chatterley’s Lover that you could imagine but applied to the internet. That is bad, but what is almost worse is this bizarre governance structure where decisions of crucial political sensitivity are being outsourced to an unaccountable regulator. I am very sad to say that I think that, at first contact with reality, a large part of this is going to collapse, and with it a lot of good will be lost.