Online Safety Bill Debate
Lord Parkinson of Whitley Bay (Conservative, Life peer) — Department for Digital, Culture, Media & Sport
(1 year, 4 months ago)
Lords Chamber

My Lords, the Government are committed to protecting children against accessing pornography online. As technology evolves, it is important that the regulatory framework introduced by the Bill keeps pace with emerging risks to children and exposure to pornography in new forms, such as generative artificial intelligence.
Part 5 of the Bill has been designed to be future-proof, and we assess that it would already capture AI-generated pornography. Our Amendments 206 and 209 will put beyond doubt that content is “provider pornographic content” where it is published or displayed on a Part 5 service by means of an automated tool or algorithm, such as a generative AI bot, made available on the service by a provider. Amendments 285 and 293 make clear that the definition of an automated tool includes a bot. Amendment 276 clarifies the definition of a provider of a Part 5 service, to make clear that a person who controls an AI bot that generates pornography can be regarded as the provider of a service.
Overall, our amendments provide important certainty for users, providers and Ofcom on the services and content in scope of the Part 5 duties. This will ensure that the new, robust duties for Part 5 providers to use age verification or age estimation to prevent children accessing provider pornographic content will also extend to AI-generated pornography. I beg to move.
My Lords, the noble Baroness, Lady Kidron, has unfortunately been briefly detained. If you are surprised to see me standing up, it is because I am picking up for her. I start by welcoming these amendments. I am grateful for the reaction to the thought-provoking debate that we had in Committee. I would like to ask a couple of questions just to probe the impact around the edges.
Amendment 27 looks as if it implies that purely content-generating machine-learning or AI bots could be excluded from the scope of the Bill, rather than included, which is the opposite of what we were hoping to achieve. That may be us failing to understand the detail of this large body of different amendments, but I would welcome my noble friend the Minister’s response to make sure that in Amendment 27 we are not excluding harm that could be generated by some form of AI or machine-learning instrument.
Maybe I can give my noble friend the Minister an example of what we are worried about. This is a recent scenario that noble Lords may have seen in the news, of a 15 year-old who asked, “How do I have sex with a 30 year-old?”. The answer was given in forensic detail, with no reference to the fact that it would in fact be statutory rape. Would the regulated service, or the owner of the regulated service that generated that answer, be included or excluded as a result of Amendment 27? That may be my misunderstanding.
This group is on AI-generated pornography. My friend, the noble Baroness, Lady Kidron, and I are both very concerned that it is not just about pornography, and that we should make sure that AI is included in the Bill. Specifically, many of us with teenage children will now be learning how to navigate the Snap AI bot. Would harm generated by that bot be captured in these amendments, or is it only content that is entirely pornographic? I hope that my noble friend the Minister can clarify both those points, then we will be able to support all these amendments.
My Lords, this has been a short but important debate and I am grateful to noble Lords for their broad support for the amendments here and for their questions. These amendments will ensure that services on which providers control a generative tool, such as a generative AI bot, are in scope of Part 5 of the Bill. This will ensure that children are protected from any AI-generated pornographic content published or displayed by provider-controlled generative bots. These changes will not affect the status of any non-pornographic AI-generated content, or AI-generated content shared by users.
We are making a minor change to definitions in Part 3 to ensure that comments or reviews on content generated by a provider-controlled artificial intelligence source are not regulated as user-generated content. This is consistent with how the Bill treats comments and reviews on other provider content. These amendments do not have any broader impact on the treatment of bots by Part 3 of the Bill’s regime beyond the issue of comments and reviews. The basis on which a bot will be treated as a user, for example, remains unchanged.
I am grateful to the noble Lord, Lord Clement-Jones, for degrouping his Amendment 152A so that I can come back more fully on it in a later group and I am grateful for the way he spoke about it in advance. I am grateful too for my noble friend Lady Harding’s question. These amendments will ensure that providers which control a generative tool on a service, such as a generative AI bot, are in scope of Part 5 of the Bill. A text-only generative AI bot would not be in scope of Part 5. It is important that we focus on areas which pose the greatest risk of harm to children. There is an exemption in Part 5 for text-based provider pornographic content because of the limited risks posed by published pornographic content. This is consistent with the approach of Part 3 of the Digital Economy Act 2017 and its provisions to protect children from commercial online pornography, which did not include text-based content in scope.
The right reverend Prelate the Bishop of Oxford is right to ask whether we think this is enough. These changes certainly help. The way that the Bill is written in a technology-neutral way will help us to future-proof it but, as we have heard throughout the passage of the Bill, we all know that this area of work will need constant examination and scrutiny. That is why the Bill is subject to the post-Royal Assent review and scrutiny that it is, and why we are grateful for the anticipation noble Lords and Members of Parliament in the other place have already given to ensuring that it delivers on what we want to see. I believe these amendments, which put beyond doubt important provisions relating to generative AI, are a helpful addition and I beg to move.
My Lords, much like the noble Lord, Lord Clement-Jones, I started off being quite certain I knew what to say about these amendments. I even had some notes—unusual for me, I know—but I had to throw them away, which I always do with my notes, because the arguments have been persuasive. That is exactly why we are here in Parliament discussing things: to try to reach common solutions to difficult problems.
We started with a challenge to the Minister to answer questions about scope, exemptions and discretion in relation to a named service—Wikipedia. However, as the debate went on, we came across the uncomfortable feeling that, having got so far into the Bill and agreed a lot of amendments today improving it, we are still coming up against quite stubborn issues that do not fit neatly into the categorisation and structures that we have. We do not seem to have the right tools to answer the difficult questions before us today, let alone the myriad questions that will come up as the technology advances and new services come in. Why have we not already got solutions to the problems raised by Amendments 281, 281A and 281B?
There is also the rather difficult idea we have from the noble Lord, Lord Russell, of dark patterns, which we need to filter into our thinking. Why does that not fit into what we have got? Why is it that we are still worried about Wikipedia, a service for public good, which clearly has risks in it and is sometimes capable of making terrible mistakes but is definitely a good thing that should not be threatened by having to conform with a structure and a system which we think is capable of dealing with some of the biggest and most egregious companies that are pushing stuff at us in the way that we have talked about?
I have a series of questions which I do not have the answers to. I am looking forward to the Minister riding to my aid on a white charger of enormous proportions and great skill which will take us out without having to fall over any fences.
If I may, I suggest to the Minister a couple of things. First, we are stuck on the word “content”. We will come back to that in the future, as we still have an outstanding problem about exactly where the Bill sets it. Time and again in discussions with the Bill team and with Ministers we have been led back to the question of where the content problem lies and where the harms relate to that, but this little debate has shown beyond doubt that harm can occur independent of and separate from content. We must have a solution to that, and I hope it will be quick.
Secondly, when approaching anybody or anything or any business or any charity that is being considered in scope for this Bill, we will not get there if we are looking only at the question of its size and its reach. We have to look at the risks it causes, and we have to drill down hard into what risks we are trying to deal with using our armoury as we approach these companies, because that is what matters to the children, vulnerable people and adults who would suffer otherwise, and not the question of whether or not these companies are big or small. I think there are solutions to that and we will get there, but, when he comes to respond, the Minister needs to demonstrate to us that he is still willing to listen and think again about one or two issues. I look forward to further discussions with him.
I am grateful to noble Lords for their contributions during this debate. I am sympathetic to arguments that we must avoid imposing disproportionate burdens on regulated services, and particularly that the Bill should not inhibit services from providing valuable information which is of benefit to the public. However, I want to be clear that that is why the Bill has been designed in the way that it has. It has a broad scope in order to capture a range of services, but it has exemptions and categorisations built into it. The alternative would be a narrow scope, which would be more likely inadvertently to exempt risky sites or to displace harm on to services which we would find are out of scope of the Bill. I will disappoint noble Lords by saying that I cannot accept their amendments in this group but will seek to address the concerns that they have raised through them.
The noble Lord, Lord Allan, asked me helpfully at the outset three questions, to which the answers are yes, no and maybe. Yes, Wikipedia and OpenStreetMap will be in scope of the Bill because they allow users to interact online; no, we do not believe that they would fall under any of the current exemptions in the Bill; and the maybe is that Ofcom does not have the discretion to exempt services but the Secretary of State can create additional exemptions for further categories of services if she sees fit.
I must also say maybe to my noble friend Lord Moylan on his point about Wikipedia—and with good reason. Wikipedia, as I have just explained, is in scope of the Bill and is not subject to any of its exemptions. I cannot say how it will be categorised, because that is based on an assessment made by the independent regulator, but I reassure my noble friend that it is not the regulator but the Secretary of State who will set the categorisation thresholds through secondary legislation; that is to say, a member of the democratically elected Government, accountable to Parliament, through legislation laid before that Parliament. It will then be for Ofcom to designate services based on whether or not they meet those thresholds.
It would be wrong—indeed, nigh on impossible—for me to second-guess that designation process from the Dispatch Box. In many cases it is inherently a complex and nuanced matter since, as my noble friend Lady Harding said, many services change over time. We want to keep the Bill’s provisions flexible as services change what they do and new services are invented.
I would just like to finish my thought on Wikipedia. Noble Lords are right to mention it and to highlight the great work that it does. My honourable friend the Minister for Technology and the Digital Economy, Paul Scully, met Wikipedia yesterday to discuss its concerns about the Bill. He explained that the requirements for platforms in this legislation will be proportionate to the risk of harm, and that as such we do not expect the requirements for Wikipedia to be unduly burdensome.
I am computing the various pieces of information that have just been given, and I hope the Minister can clarify whether I have understood them correctly. These services will be in scope as user-to-user services and do not have an exemption, as he said. The Secretary of State will write a piece of secondary legislation that will say, “This will make you a category 1 service”—or a category 2 or 2B service—but, within that, there could be text that has the effect that Wikipedia is in none of those categories. So it and services like it could be entirely exempt from the framework by virtue of that secondary legislation. Is that a correct interpretation of what he said?
The Secretary of State could create further exemptions but would have to bring those before Parliament for it to scrutinise. That is why there is a “maybe” in answer to his third question in relation to any service. It is important for the legislation to be future-proofed that the Secretary of State has the power to bring further categorisations before Parliament for it to discuss and scrutinise.
My Lords, I will keep pressing this point because it is quite important, particularly in the context of the point made by the noble Baroness, Lady Kidron, about categorisation, which we will debate later. There is a big difference when it comes to Schedule 11, which defines the categorisation scheme: whether in the normal run of business we might create an exemption in the categorisation secondary legislation, or whether it would be the Secretary of State coming back with one of those exceptional powers that the Minister knows we do not like. He could almost be making a case for why the Secretary of State has to have these exceptional powers. We would be much less comfortable with that than if the Schedule 11 categorisation piece effectively allowed another class to be created, rather than it being an exceptional Secretary of State power.
I do not think that it is, but it will be helpful to have a debate on categorisation later on Report, when we reach Amendment 245, to probe this further. It is not possible for me to say that a particular service will certainly be categorised one way or another, because that would give it carte blanche and we do not know how it may change in the future—estimable though I may think it is at present. That is the difficulty of setting the precise parameters that the noble Baroness, Lady Fox, sought in her contribution. We are setting broad parameters, with exemptions and categorisations, so that the burdens are not unduly heavy on services which do not cause us concern, and with the proviso for the Secretary of State to bring further exemptions before Parliament, as circumstances strike her as fit, for Parliament to continue the debate we are having now.
The noble Baroness, Lady Kidron, in her earlier speech, asked about the functionalities of user-to-user services. The definitions of user-to-user services are broad and flexible, to capture new and changing services. If a service has both user-to-user functionality and a search engine, it will be considered a combined service, with respective duties for the user-to-user services which form part of its service and search duties in relation to the search engine.
I reassure my noble friend Lady Harding of Winscombe that the Bill will not impose a disproportionate burden on services, nor will it impede the public’s access to valuable content. All duties on services are proportionate to the risk of harm and, crucially, to the capacity of companies. The Bill’s proportionate design means that low-risk services will have to put in place only measures which reflect the risk of harm to their users. Ofcom’s guidance and codes of practice will clearly set out how these services can comply with their duties. We expect that it will set out a range of measures and steps for different types of services.
Moreover, the Bill already provides for wholesale exemptions for low-risk services and for Ofcom to exempt in-scope services from requirements such as record-keeping. That will ensure that there are no undue burdens to such services. I am grateful for my noble friend’s recognition, echoed by my noble friend Lady Stowell of Beeston, that “non-profit” does not mean “not harmful” and that there can be non-commercial services which may pose harms to users. That is why it is important that there is discretion for proper assessment.
Amendment 30 seeks to allow Ofcom to withdraw the exemptions listed in Schedule 1 from the Bill. I am very grateful to my noble friend Lord Moylan for his time earlier this week to discuss his amendment and others. We have looked at it, as I promised we would, but I am afraid that we do not think that it would be appropriate for Ofcom to have this considerable power—my noble friend is already concerned that the regulator has too much.
The Bill recognises that it may be necessary to remove certain exemptions if there is an increased risk of harm from particular types of services. That is why the Bill gives the Secretary of State the power to remove particular exemptions, such as those related to services which have limited user-to-user functionality and those which offer one-to-one live aural communications. These types of services have been carefully selected as areas where future changes in user behaviour could necessitate the repeal or amendment of an exemption in Schedule 1. This power is intentionally limited to only these types of services, meaning that the Secretary of State will not be able to remove exemptions for comments on recognised news publishers’ sites. That is in recognition of the Government’s commitment to media freedom and public debate. It would not be right for Ofcom to have the power to repeal those exemptions.
Amendments 281 and 281B, in the name of the noble Lord, Lord Russell of Liverpool, are designed to ensure that the lists of features under the definition of “functionality” in the Bill apply to all regulated services. Amendment 281A aims to add additional examples of potentially addictive functionalities to the Bill’s existing list of features which constitute a “functionality”. I reassure him and other noble Lords that the list of functionalities in the Bill is non-exhaustive. There may be other functionalities which could cause harm to users and which services will need to consider as part of their risk assessment duties. For example, if a provider’s risk assessment identifies that there are functionalities which risk causing significant harm to an appreciable number of children on its service, the Bill will require the provider to put in place measures to mitigate and manage that risk.
He and other noble Lords spoke about the need for safety by design. I can reassure them this is already built into the framework of the Bill, which recognises how functionalities including many of the things mentioned today can increase the risk of harm to users and will encourage the safe design of platforms.
Amendments 281 and 281B have the effect that regulated services would need to consider the risk of harm of functionalities that are not relevant for their kind of service. For example, sharing content with other users is a functionality of user-to-user services, which is not as relevant for search services. The Bill already outlines specific features that both user-to-user and search services should consider, which are the most relevant functionalities for those types of service. Considering these functionalities would create an unnecessary burden for regulated services which would detract from where their efforts can best be focused. That is why I am afraid I cannot accept the amendments that have been tabled.
My Lords, surely it is the role of the regulators to look at functionalities of this kind. The Minister seemed to be saying that it would be an undue burden on the regulator. Is not that exactly what we are meant to be legislating about at this point?
Perhaps I was not as clear as I could or should have been. The regulator will set out in guidance the duties that fall on the businesses. We do not want the burden on the business to be unduly heavy, but there is an important role for Ofcom here. I will perhaps check—
But these functionalities are a part of their business model, are they not?
Hence Ofcom will make the assessments about categorisation based on that. Maybe I am missing the noble Lord’s point.
I think we may need further discussions on the amendment from the noble Lord, Lord Russell.
I will check what I said but I hope that I have set out why we have taken the approach that we have with the broad scope and the exemptions and categorisations that are contained in it. With that, I urge the noble Lord to withdraw his amendment.
My Lords, that was a very useful debate. I appreciate the Minister’s response and his “yes, no, maybe” succinctness, but I think he has left us all more worried than when the debate started. My noble friend Lord Clement-Jones tied it together nicely. What we want is for the regulator to be focused on the greatest areas of citizen risk. If there are risks that are missing, or things that we will be asking the regulator to do that are a complete waste of time because they are low risk, then we have a problem. We highlighted both those areas. The noble Lord, Lord Russell, rightly highlighted that we are not content with just “content” as the primary focus of the legislation; it is about a lot more than content. In my amendment and those by the noble Lord, Lord Moylan, we are extremely worried—and remain so—that the Bill creates a framework that will trap Wikipedia and services like it, without that being our primary intention. We certainly will come back to this in later groups; I will not seek to press the amendment now, because there is a lot we all need to digest. However, at the end of this process, we want to get to a point where the regulator is focused on things that are high risk to the citizen and not wasting time on services that are very low risk. With that, I beg leave to withdraw my amendment.
My Lords, the government amendments in this group relate to the categories of primary priority and priority content that is harmful to children.
Children must be protected from the most harmful online content and activity. As I set out in Committee, the Government have listened to concerns about designating primary priority and priority categories of content in secondary legislation and the need to protect children from harm as swiftly as possible. We have therefore tabled amendments to set out these categories in the Bill. I am grateful for the input from across your Lordships’ House in finalising the scope of these categories.
While it is important to be clear about the kinds of content that pose a risk of harm to children, I acknowledge what many noble Lords raised during our debates in Committee, which is that protecting children from online harm is not just about content. That is why the legislation takes a systems and processes approach to tackling the risk of harm. User-to-user and search service providers will have to undertake comprehensive, mandatory risk assessments of their services and consider how factors such as the design and operation of a service and its features and functionalities may increase the risk of harm to children. Providers must then put in place measures to manage and mitigate these risks, as well as systems and processes to prevent and protect children from encountering the categories of harmful content.
We have also listened to concerns about cumulative harm. In response to this, the Government have tabled amendments to Clause 209 to make it explicit that cumulative harm is addressed. This includes cumulative harm that results from algorithms bombarding a user with content, or where combinations of functionality cumulatively drive up the risk of harm. These amendments will be considered in more detail under a later group of amendments, but they are important context for this discussion.
I turn to the government amendments, starting with Amendment 171, which designates four categories of primary priority content. First, pornographic content has been defined in the same way as in Part 5—to give consistent and comprehensive protection for children, regardless of the type of service on which the pornographic content appears. The other three categories capture content which encourages, promotes or provides instructions for suicide, self-harm or eating disorders. This will cover, for example, glamorising or detailing methods for carrying out these dangerous activities. Designating these as primary priority content will ensure that the most stringent child safety duties apply.
Government Amendment 172 designates six categories of priority content. Providers will be required to protect children from encountering a wide range of harmful violent content, which includes depictions of serious acts of violence or graphic injury against a person or animal, and the encouragement and promotion of serious violence, such as content glamorising violent acts. Providers will also be required to protect children from encountering abusive and hateful content, such as legal forms of racism and homophobia, and bullying content, which sadly many children experience online.
The Government have heard concerns from the noble Baronesses, Lady Kidron and Lady Finlay of Llandaff, about extremely dangerous activities being pushed to children as stunts, and content that can be harmful to the health of children, including inaccurate health advice and false narratives. As such, we are designating content that encourages dangerous stunts and challenges as a category of priority content, and content which encourages the ingestion or inhalation of, or exposure to, harmful substances, such as harmful abortion methods designed to be taken by a person without medical supervision.
Amendment 174, from the noble Baroness, Lady Kidron, seeks to add “mis- and disinformation” and “sexualised content” to the list of priority content. On the first of these, I reiterate what I said in Committee, which is that the Bill will protect children from harmful misinformation and disinformation where it intersects with named categories of primary priority or priority harmful content—for example, an online challenge which is promoted to children on the basis of misinformation or disinformation, or abusive content with a foundation in misinformation or disinformation. However, I did not commit to misinformation and disinformation forming its own stand-alone category of priority harmful content, which could be largely duplicative of the categories that we have already included in the Bill and risks capturing a broad range of legitimate content.
We have already addressed key concerns related to misinformation and disinformation content which presents the greatest risk to children by including content which encourages the ingestion or inhalation of, or exposure to, harmful substances to the list of priority categories. However, the term “mis- and disinformation”, as proposed by Amendment 174, in its breadth and subjectivity risks inadvertently capturing a wide range of content resulting in disproportionate, excessive censorship of the content children see online, including in areas of legitimate debate. The harm arising from misinformation or disinformation usually arises from the context or purpose of the content, rather than the mere fact that it is untrue. Our balanced approach ensures that children are protected from the most prevalent and concerning harms associated with misinformation and disinformation.
My Lords, we spent a lot of time in Committee raising concerns about how pornography and age verification were going to operate across all parts of the Bill. I have heard what the Minister has said in relation to this group, priority harms to children, which I believe is one of the most important groups under discussion in the Bill. I agree that children must be protected from the most harmful content online and offline.
I am grateful to the Government for having listened carefully to the arguments put forward by the House in this regard and commend the Minister for all the work he and his team have done since then. I also commend the noble Lord, Lord Bethell. He and I have been in some discussion between Committee and now in relation to these amendments.
In Committee, I argued for several changes to the Bill which span three groups of amendments. One of my concerns was that pornography should be named as a harm in the Bill. I welcome the Government’s Amendment 171, which names pornography as primary priority content. I also support Amendment 174 in the name of the noble Baroness, Lady Kidron. She is absolutely right that sexualised content can be harmful to children if not age appropriate and, in that regard, before she even speaks, I ask the Minister to reconsider his views on this amendment and to accept it.
Within this group are the amendments which move the definition of “pornographic content” from Part 5 to Clause 211. In that context, I welcome the Government’s announcement on Monday about a review of the regulation, legislation and enforcement of pornography offences.
In Committee, your Lordships were very clear that there needed to be a consistent approach across the Bill to the regulation of pornography. I am in agreement with the amendments tabled in Committee to ensure that consistency applies across all media. In this regard, I thank the noble Baroness, Lady Benjamin, for her persistence in raising this issue. I also thank my colleagues on the Opposition Front Bench, the noble Lord, Lord Stevenson, and the noble Baroness, Lady Merron.
I appreciate that the Government made this announcement only three days ago, but I hope the Minister will set out a timetable for publishing the terms of reference and details of how this review will take place. The review is too important to disappear into the long grass over the Summer Recess, never to be heard of again, so if he is unable to answer my question today, will he commit to writing to your Lordships with the timeframe before the House rises for the summer? Will he consider the active involvement of external groups in this review, as much expertise lies outside government in this area? In that regard, I commend CARE, CEASE and Barnardo’s for all their input into the debates on the Bill.
My Lords, I think the noble Baroness’s comments relate to the next group of amendments, on pornography. She might have skipped ahead, but I am grateful for the additional thinking time to respond to her questions. Perhaps she will save the rest of her remarks for that group.
I thank the Minister for that. In conclusion, I hope he will reflect on those issues and come back, maybe at the end of the next group. I remind the House that in February the APPG on Commercial Sexual Exploitation, in its inquiry on pornography, recommended that the regulation of pornography should be consistent across all online platforms and between the online and offline spheres. I hope we can incorporate the voices I have already mentioned in the NGO sphere in order to assist the Government and both Houses in ensuring that we regulate the online platforms and that children are protected from any harms that may arise.
My Lords, like the noble Baroness, Lady Harding, I want to make it very clear that I think the House as a whole welcomes the change of heart by the Government to ensure that we have in the Bill the two sides of the question of content that will be harmful to children. We should not walk away from that. We made a big thing of this in Committee. The Government listened and we have now got it. The fact that we do not like it—or do not like bits of it—is the price we pay for having achieved something which is, probably on balance, good.
The shock comes from trying to work out why it is written the way it is, and how difficult it is to see what it will mean in practice when companies working to Ofcom’s instructions will take this and make this happen in practice. That lies behind, I think I am right in saying, the need for the addition to Amendment 172 from the noble Baroness, Lady Kidron, which I have signed, along with the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford. Both of them have spoken well in support of it and I do not need to repeat those points.
Somehow, in getting the good of Amendments 171 and 172, we have lost some of the flexibility that we also want. The flexibility does exist, because the Government have retained powers to amend and change both primary priority content and priority content that is harmful to children. Therefore, subject to approval through the secondary legislation process, this House will continue to have a say in that—indeed, both Houses will.
Somehow, however, that does not get to quite where the concern comes from. The concern encompasses the good points made by the noble Lord, Lord Russell—I should have caught him in the gap and said that I had already mentioned the fact that we had been together at the meeting. He found some additional points to make which I hope will also be useful to future discussion. I am glad he has done that. He makes a very good point about cultural context and the work that needs to go on—which we have talked about in earlier debates—to make this live: in other words, to make the people responsible for delivering this through Ofcom, and those delivering it through companies, understand the wider context. In that sense, clearly we need the misinformation/disinformation side of that; it is part and parcel of the problems we have. But more important even than that is the need to consider the functionality issues. We have come back to that. This Bill is about risk. The process we will go through is about risk assessment: making sure that the risks are understood by those who deliver services, and that the penalties which follow a failure of the risk assessment process deliver the change we want to see in society.
However, it is not just about content. We keep saying that, but we do not see the changes around it. The best thing that could happen today would be for the Minister, in responding, to accept that these clauses are good—“Tick, we like them”—but not to finalise them until we have seen the other half of the picture: that is, what other risks the users of the services we have referred to and discussed are exposed to through the systemic design processes that are intended to take them in different directions. Only when we see the two together will we have a proper picture.
I may have got this wrong, but the only person who can tell us is the Minister because he is the only one who really understands what is going on in the Bill. Am I not right in saying—I am going to say I am right; he will say no, I am not, but I am, aren’t I?—that we will get to Clauses 208 and 209, or the clauses that used to be 208 and 209, one of which deals with harms from content and the other deals with functionality? We may need to look at the way in which those are framed in order to come back and understand better how these lie and how they interact with that. I may have got the numbers wrong—the Minister is looking a bit puzzled, so I probably have—but the sense is that this will probably not come up until day 4. While I do not want to hold back the Bill, we may need to look at some of the issues that are hidden in the interstices of this set of amendments in order to make sure that the totality is better for those who have to use it.
My Lords, this has been a useful debate. As the noble Baroness, Lady Kidron, says, because I spoke first to move the government amendments, in effect I got my response in first to her Amendment 174, the only non-government amendment in the group. That is useful because it allows us to have a deeper debate on it.
The noble Baroness asked about the way that organisations such as the British Board of Film Classification already make assessments of sexualised content. However, the Bill’s requirement on service providers and the process that the BBFC takes to classify content are not really comparable. Services will have far less time and much more content to consider than the BBFC does, so will not be able to take the same approach. The BBFC is able to take an extended time to consider maybe just one scene, one image or one conversation, and therefore can apply nuance to its assessments. That is not possible to do at the scale at which services will have to apply the child safety duties in the Bill. We therefore think there is a real risk that they would excessively apply those duties and adversely affect children’s rights online.
I know the noble Baroness and other noble Lords are rightly concerned with protecting rights to free expression and access to information online for children and for adults. It is important that we strike the right balance, which is what we have tried to do with the government amendments in this group.
To be clear, the point that I made about the BBFC was not to suggest a similar arrangement but to challenge the idea that we cannot categorise material of a sexualised nature. Building on the point made by the noble Lord, Lord Allan, perhaps we could think about it in terms of the amber light rather than the red light—in other words, something to think about.
I certainly will think about it, but the difficulty is the scale of the material and the speed with which we want these assessments to be made and that light to be lit, in order to make sure that people are properly protected.
My noble friend Lord Moylan asked about differing international terminology. In order for companies to operate in the United Kingdom they must have an understanding of the United Kingdom, including the English-language terms used in our legislation. He made a point about the Equality Act 2010. While the Bill uses the same language, it does not extend the Equality Act to this part of the Bill. In particular, it does not create a new offence.
The noble Baroness, Lady Fox, also mentioned the Equality Act when she asked about the phraseology relating to gender reassignment. We included this wording to ensure that the language used in the Bill matches Section 7(1) of the Equality Act 2010 and that gender reassignment has the same meaning in the Bill as it does in that legislation. As has been said by other noble Lords—
I clarify that what I said was aimed at protecting children. Somebody corrected me and asked, “Do you know that this says ‘abusive’?”—of course I do. What I suggested was that this is an area that is very contentious when we talk about introducing it to children. I am thinking about safeguarding children in this instance, not just copying and pasting a bit of an Act.
I take this opportunity to ask my noble friend the Minister a question; I want some clarity about this. Would an abusive comment about a particular religion—let us say a religion that practised cannibalism or a historical religion that sacrificed babies, as we know was the norm in Carthage—count as “priority harmful content”? I appreciate that we are mapping the language of the Equality Act, but are we creating a new offence of blasphemy in this Bill?
As was pointed out by others in the debate, the key provision in Amendment 172 is subsection (2) of the proposed new clause, which relates to:
“Content which is abusive and which targets any of the following characteristics”.
It must both be abusive and target the listed characteristics. It does not preclude legitimate debate about those things, but if it were abusive on the basis of those characteristics—rather akin to the debate we had in the previous group and the points raised by the noble Baroness, Lady Kennedy of The Shaws, about people making oblique threats, rather than targeting a particular person, by saying, “People of your characteristic should be abused in the following way”—it would be captured.
I will keep this short, because I know that everyone wants to get on. It would be said that it is abusive to misgender someone; in the context of what is going on in sixth forms and schools, I suggest that this is a problem. It has been suggested that showing pictures of the Prophet Muhammad in an RE lesson—these are real-life events that happen offline—is abusive. I am suggesting that it is not as simple as saying the word “abusive” a lot. In this area, there is a highly contentious and politicised arena that I want to end, but I think that this will exacerbate, not help, it.
My noble friend seemed to confirm what I said. If I wish to be abusive—in fact, I do wish to be abusive—about the Carthaginian religious practice of sacrificing babies to Moloch, and I were to do that in a way that came to the attention of children, would I be caught as having created “priority harmful content”? My noble friend appears to be saying yes.
Does my noble friend wish to do that and direct it at children?
With respect, it does not say “directed at children”. Of course, I am safe in expressing that abuse in this forum, but if I were to do it, it came to the attention of children and it were abusive—because I do wish to be abusive about that practice—would I have created “priority harmful content”, about which action would have to be taken?
May I attempt to assist the Minister? This is the “amber” point described by the noble Lord, Lord Allan: “priority content” is not the same as “primary priority content”. Priority content is our amber light. Even the most erudite and scholarly description of baby eating is not appropriate for five year-olds. We do not let it go into “Bod” or any of the other programmes we all grew up on. This is about an amber warning: that user-to-user services must have processes that enable them to assess the risk of priority content and primary priority content. It is not black and white, as my noble friend is suggesting; it is genuinely amber.
My Lords, we may be slipping back into a Committee-style conversation. My noble friend Lord Moylan rightly says that this is the first chance we have had to examine this provision, which is a concession wrung out of the Government in Committee. As the noble Lord, Lord Stevenson, says, sometimes that is the price your Lordships’ House pays for winning these concessions, but it is an important point to scrutinise in the way that my noble friend Lord Moylan and the noble Baroness, Lady Fox, have done.
I will try to reassure my noble friend and the noble Baroness. This relates to the definition of a characteristic with which we began our debates today. To be a characteristic it has to be possessed by a person; therefore, the content that is abusive and targets any of the characteristics has to be harmful to an individual to meet the definition of harm. Further, it has to be material that would come to the attention of children in the way that the noble Baronesses who kindly leapt to my defence and added some clarity have set out. So my noble friend would be able to continue to criticise the polytheistic religions of the past and their tendencies to his heart’s content, but there would be protections in place if what he was saying was causing harm to an individual—targeting them on the basis of their race, religion or any of those other characteristics—if that person was a child. That is what noble Lords wanted in Committee, and that is what the Government have brought forward.
My noble friend and others asked why mis- and disinformation were not named as their own category of priority harmful content to children. Countering mis- and disinformation where it intersects with the named categories of primary priority or priority harmful content, rather than as its own issue, will ensure that children are protected from the mis- and disinformation narratives that present the greatest risk of harm to them. We recognise that mis- and disinformation is a broad and cross-cutting issue, and we therefore think the most appropriate response is to address directly the most prevalent and concerning harms associated with it; for example, dangerous challenges and hoax health advice for children to self-administer harmful substances. I assure noble Lords that any further harmful mis- and disinformation content will be captured as non-designated content where it presents a material risk of significant harm to an appreciable number of children.
In addition, the expert advisory committee on mis- and disinformation, established by Ofcom under the Bill, will have a wide remit in advising on the challenges of mis- and disinformation and how best to tackle them, including how they relate to children. Noble Lords may also have seen that the Government have recently tabled amendments to update Ofcom’s statutory media literacy duty. Ofcom will now be required to prioritise users’ awareness of and resilience to misinformation and disinformation online. This will include children and their awareness of and resilience to mis- and disinformation.
My noble friend Lady Harding of Winscombe talked about commercial harms. Harms exacerbated by the design and operation of a platform—that is, their commercial models—are covered in the Bill already through the risk assessment and safety duties. Financial harm, as used in government Amendment 237, is dealt with by a separate legal framework, including the Consumer Protection from Unfair Trading Regulations. This exemption ensures that there is no regulatory overlap.
The noble Lord, Lord Russell of Liverpool, elaborated on remarks made earlier by the noble Lord, Lord Stevenson of Balmacara, about their meeting looking at the incel movement, if it can be called that. I assure the noble Lord and others that Ofcom has a review and report duty and will be required to stay on top of changes in the online harms landscape and report to government on whether it recommends changes to the designated categories of content because of the emerging risks that it sees.
The noble Baroness, Lady Kidron, anticipated the debate we will have on Monday about functionalities and content. I am grateful to her for putting her name to so many of the amendments that we have brought forward. We will continue the discussions that we have been having on this point ahead of the debate on Monday. I do not want to anticipate that now, but I undertake to carry on those discussions.
In closing, I reiterate what I know is the shared objective across your Lordships’ House—to protect children from harmful content and activity. That runs through all the government amendments in this group, which cover the main categories of harmful content and activity that, sadly, too many children encounter online every day. Putting them in primary legislation enables children to be swiftly protected from encountering them. I therefore hope that noble Lords will be heartened by the amendments that we have brought forward in response to the discussion we had in Committee.