Chris Philp (Conservative, Croydon South)

Public Bill Committees

Good afternoon, Sir Roger; it is a pleasure, as ever, to serve under your chairship. I rise to speak to new clause 36, which has been grouped with amendment 142 and is tabled in the names of the hon. Members for Ochil and South Perthshire and for Aberdeen North.
I, too, pay tribute to Samaritans for all the work it has done in supporting the Bill and these amendments to it. As colleagues will be aware, new clause 36 follows a recommendation from the Law Commission dating back to July 2021. The commission recommended the creation of a new, narrow offence of the “encouragement or assistance” of serious self-harm with “malicious intent”. It identified that there is
“currently no offence that adequately addresses the encouragement of serious self-harm.”
The recommendation followed acknowledgement that
“self-harm content online is a worrying phenomenon”
and should have a
“robust fault element that targets deliberate encouragement of serious self-harm”.
Currently, there are no provisions in the Bill to create a new offence of assisting or encouraging self-harm, despite the fact that other recommendations from the Law Commission report have been brought into the Bill, such as creating a new offence of cyber-flashing and prioritising tackling illegal suicide content.
We all know that harmful suicide and self-harm content is material that has the potential to cause or exacerbate self-harm and suicidal behaviours. Content relating to suicide and self-harm falls into both categories in the Bill—illegal content and legal but harmful content. Encouraging or assisting suicide is also currently a criminal offence in England and Wales under the Suicide Act 1961, as amended by the Coroners and Justice Act 2009.
Content encouraging or assisting someone to take their own life is illegal and has been included as priority illegal content in the Bill, meaning that platforms will be required to proactively and reactively prevent individuals from encountering it, and search engines will need to structure their services to minimise the risk to individuals encountering the content. Other content, including content that positions suicide as a suitable way of overcoming adversity or describes suicidal methods, is legal but harmful.
The Labour party’s Front-Bench team recognises that not all content falls neatly into the legal but harmful category. What can be helpful for one user can be extremely distressing to others. Someone may find it extremely helpful to share their personal experience of suicide, for example, and that may also be helpful to other users. However, the same material could heighten suicidal feelings and levels of distress in someone else. We recognise the complexities of the Bill and the difficulties in developing a way around this, but we should delineate harmful and helpful content relating to suicide and self-harm, and that should not detract from tackling legal but clearly harmful content.
In its current form, the Bill will continue to allow legal but clearly harmful suicide and self-harm content to be accessed by over-18s. Category 1 platforms, which have the highest reach and functionality, will be required to carry out risk assessments of, and set out in their terms and conditions their approach to, legal but harmful content in relation to over-18s. As the hon. Member for Ochil and South Perthshire outlined, however, the Bill’s impact assessment states that “less than 0.001%” of in-scope platforms
“are estimated to meet the Category 1 and 2A thresholds”,
and estimates that only 20 platforms will be required to fulfil category 1 obligations. There is no requirement on the smaller platforms, including those that actively encourage suicide, to do anything at all to protect over-18s. That simply is not good enough. That is why the Labour party supports new clause 36, and we urge the Minister to do the right thing by joining us.
It is, as always, a great pleasure to serve under your chairmanship, Sir Roger. The hon. Member for Ochil and South Perthshire made an observation in passing about the Government’s willingness to listen and respond to parliamentarians about the Bill. We listened carefully to the extensive prelegislative scrutiny that the Bill received, including from the Joint Committee on which he served. As a result, we have adopted 66 of the changes that that Committee recommended, including on significant things such as commercial pornography and fraudulent advertising.
If Members have been listening to me carefully, they will know that the Government are doing further work or are carefully listening in a few areas. We may have more to say on those topics as the Bill progresses; it is always important to get the drafting of the provisions exactly right. I hope that that has indicated to the hon. Gentleman our willingness to listen, which I think we have already demonstrated well.
On new clause 36, it is important to mention that there is already a criminal offence of inciting suicide. It is a schedule 7 priority offence, so the Bill already requires companies to tackle content that amounts to the existing offence of inciting suicide. That is important. We would expect material that encourages children to self-harm to be listed as a primary priority harm relating to children, where, again, there is a proactive duty to protect them. We have not yet published that primary priority harm list, but it would be reasonable to expect that material encouraging children to self-harm would be on it. Again, although we have not yet published the list of content that will be on the adult priority harm list—obviously, I cannot pre-empt the publication of that list—one might certainly expect content that encourages adults to self-harm to appear on it too.
The hon. Gentleman made the point that duties relating to adults would apply only to category 1 companies. Of course, the ones that apply to children would apply to all companies where there was significant risk, but he is right that were that priority harm added to the adult legal but harmful list, it would apply only to category 1 companies.
In a second, but I may be about to answer the hon. Lady’s question.
Those category 1 companies are likely to be small in number, as I think the shadow Minister said, but I would imagine—I do not have the exact number—that they cover well over 90% of all traffic. However, as I hinted on the Floor of the House on Second Reading—we may well discuss this later—we are thinking about including platforms that may not meet the category 1 size threshold but none the less pose high-level risks of harm. If that is done—I stress “if”—it will address the point raised by the hon. Member for Ochil and South Perthshire. That may answer the point that the hon. Member for Batley and Spen was going to raise, but if not, I happily give way.
It kind of does, but the Minister has raised some interesting points about children and adults and the risk of harm. To go back to the work of Samaritans, it is really important to talk about the fact that suicide is the biggest killer of young people aged 16 to 24, so it transcends the barrier between children and adults. With the right hon. Member for Basingstoke, the hon. Member for Aberdeen North, and the shadow Minister, my hon. Friend the Member for Pontypridd, we have rightly talked a lot about women, but it is really important to talk about the fact that men account for three quarters of all suicides. Men aged between 45 and 49 are most at risk of suicide—the rate among that group has been persistently high for years. It is important that we bring men into the discussion about suicide.
I am grateful for the element of gender balance that the hon. Member has introduced, and she is right to highlight the suicide risk. Inciting suicide is already a criminal offence under section 2 of the Suicide Act 1961 and we have named it a priority offence. Indeed, it is the first priority offence listed under schedule 7—it appears a third of the way down page 183—for exactly the reason she cited, and a proactive duty is imposed on companies by paragraph 1 of schedule 7.
On amendment 142 and the attendant new clause 36, the Government agree with the sentiment behind them—namely, the creation of a new offence of encouraging or assisting serious self-harm. We agree with the substance of the proposal from the hon. Member for Ochil and South Perthshire. As he acknowledged, the matter is under final consideration by the Law Commission and our colleagues in the Ministry of Justice. The offence initially proposed by the Law Commission was wider in scope than that proposed under new clause 36. The commission’s proposed offence covered the offline world, as well as the online one. For example, the new clause as drafted would not cover assisting a person to self-harm by providing them with a bladed article because that is not an online communication. The offence that the Law Commission is looking at is broader in scope.
The Government have agreed in principle to create an offence based on the Law Commission recommendation in separate legislation, and once that is done the scope of the new offence will be wider than that proposed in the new clause. Rather than adding the new clause and the proposed limited new offence to this Bill, I ask that we implement the offence recommended by the Law Commission, the wider scope of which covers the offline world as well as the online world, in separate legislation. I would be happy to make representations to my colleagues in Government, particularly in the MOJ, to seek clarification about the relevant timing, because it is reasonable to expect it to be implemented sooner rather than later. Rather than rushing to introduce that offence with limited scope under the Bill, I ask that we do it properly as per the Law Commission recommendation.
Once the Law Commission recommendation is enacted in separate legislation, to which the Government have already agreed in principle, it will automatically be incorporated into clause 52(4)(d), which relates to illegal content. Under clause 176, the Secretary of State may, subject to parliamentary approval, designate the new offence as a priority offence under schedule 7 via a statutory instrument. The purpose of amendment 142 can therefore be achieved through a statutory instrument.
The Government entirely agree with the intention behind the proposed new clause 36, but I think the way to do this is to implement the full Law Commission offence as soon as we can and then, if appropriate, add it to schedule 7 by statutory instrument. The Government agree with the spirit of the hon. Gentleman’s proposal, but I believe that the Government already have a plan to do a more complete job of creating the new offence.
I have nothing to add and, having consulted my hon. Friend the Member for Aberdeen North, on the basis of the Minister’s assurances, I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
I beg to move amendment 116, in schedule 7, page 183, line 11, at end insert—
“1A An offence under section 13 of the Criminal Justice Act (Northern Ireland) 1966 (c. 20 (N.I.)) (assisting suicide etc).”
This amendment adds the specified offence to Schedule 7, with the effect that content amounting to that offence counts as priority illegal content.
These amendments pick up a question asked by the hon. Member for Aberdeen North much earlier in our proceedings. In schedule 7 we set out the priority offences that exist in English and Welsh law. We have consulted the devolved Administrations in Scotland and Northern Ireland extensively, and I believe we have agreed with them a number of offences in Scottish and Northern Irish law that are broadly equivalent to the English and Welsh offences already in schedule 7. Basically, Government amendments 116 to 126 add those devolved offences to the schedule.
In future, if new Scottish or Northern Irish offences are created, the Secretary of State will be able to consult Scottish or Northern Irish Ministers and, by regulations, amend schedule 7 to add the new offences that may be appropriate if conceived by the devolved Parliament or Assembly in due course. That, I think, answers the question asked by the hon. Lady earlier in our proceedings. As I say, we consulted the devolved Administrations extensively and I hope that the Committee will assent readily to the amendments.
The amendments aim to capture all the criminal offences in other parts of the UK to be covered by the provisions of the Bill, as the Minister outlined. An offence in one part of the UK will be considered an offence elsewhere, for the purposes of the Bill.
With reference to some of the later paragraphs, I am keen for the Minister to explain briefly how this will work in the case of Scotland. We believe that the revenge porn offence in Scotland is more broadly drawn than the English version, so the level of protection for women in England and Wales will be increased. Can the Minister confirm that?
The Bill will not apply the Scottish offence to English offenders, but it means that content that falls foul of the law in Scotland, but not in England or Wales, will still be relevant regulated content for service providers, irrespective of the part of the UK in which the service users are located. That makes sense from the perspective of service providers, but I would be grateful for clarity from the Minister on this point.
I thank the Minister for tabling the amendments. In the evidence sessions, we heard about omissions in schedule 7 from not having Northern Irish and Scottish offences included. Such offences were included in schedule 6 but, at that point, not in schedule 7.
I appreciate that the Minister has worked with the devolved Administrations to table the amendments. I also appreciate the way in which amendment 126 is written, such that the Secretary of State “must consult” Scottish Ministers and the Department of Justice in Northern Ireland before making regulations that relate to legislation in either of the devolved countries. I am glad that the amendments have been drafted in this way and that the concern that we heard about in evidence no longer seems to exist, and I am pleased with the Minister’s decision about the way in which to make any future changes to legislation.
I agree with the position put forward by the hon. Member for Pontypridd. My understanding, from what we heard in evidence a few weeks ago, is that, legally, all will have to agree with the higher bar of the offences, and therefore anyone anywhere across the UK will be provided with the additional level of protection. She is right that the offence might not apply to everyone, but the service providers will be subject to the requirements elsewhere. Similarly, that is my view. Once again, I thank the Minister.
Briefly, I hope that the amendments provide further evidence to the Committee of the Government’s willingness to listen and to respond. I can provide the confirmation that the hon. Members for Aberdeen North and for Pontypridd requested: the effect of the clauses is a levelling up—if I may put it that way. Any of the offences listed effectively get applied to the UK internet, so if there is a stronger offence in any one part of the United Kingdom, that will become applicable more generally via the Bill. As such, the answer to the question is in the affirmative.
Amendment 116 agreed to.
My custom with amendments to be moved formally is to call them by number. If Members wish to vote on them, they should shout; otherwise, I will rattle through them. It is quicker that way.
Amendments made: 117, in schedule 7, page 183, line 29, at end insert—
“4A An offence under section 50A of the Criminal Law (Consolidation) (Scotland) Act 1995 (racially-aggravated harassment).”
This amendment adds the specified offence to Schedule 7, with the effect that content amounting to that offence counts as priority illegal content.
Amendment 118, in schedule 7, page 183, line 36, at end insert—
“5A An offence under any of the following provisions of the Protection from Harassment (Northern Ireland) Order 1997 (S.I. 1997/1180 (N.I. 9))—
(a) Article 4 (harassment);
(b) Article 6 (putting people in fear of violence).”
This amendment adds the specified offences to Schedule 7, with the effect that content amounting to those offences counts as priority illegal content.
Amendment 119, in schedule 7, page 184, line 2, at end insert—
“6A An offence under any of the following provisions of the Criminal Justice and Licensing (Scotland) Act 2010 (asp 13)—
(a) section 38 (threatening or abusive behaviour);
(b) section 39 (stalking).”
This amendment adds the specified offences to Schedule 7, with the effect that content amounting to those offences counts as priority illegal content.
Amendment 120, in schedule 7, page 184, line 38, at end insert—
“12A An offence under any of the following provisions of the Criminal Justice (Northern Ireland) Order 1996 (S.I. 1996/3160 (N.I. 24))—
(a) Article 53 (sale etc of knives);
(b) Article 54 (sale etc of knives etc to minors).”
This amendment adds the specified offences to Schedule 7, with the effect that content amounting to those offences counts as priority illegal content.
Amendment 121, in schedule 7, page 184, line 42, at end insert—
“13A An offence under any of the following provisions of the Firearms (Northern Ireland) Order 2004 (S.I. 2004/702 (N.I. 3))—
(a) Article 24 (sale etc of firearms or ammunition without certificate);
(b) Article 37(1) (sale etc of firearms or ammunition to person without certificate etc);
(c) Article 45(1) and (2) (purchase, sale etc of prohibited weapons);
(d) Article 63(8) (sale etc of firearms or ammunition to people who have been in prison etc);
(e) Article 66A (supplying imitation firearms to minors).”
This amendment adds the specified offences to Schedule 7, with the effect that content amounting to those offences counts as priority illegal content.
Amendment 122, in schedule 7, page 184, line 44, at end insert—
“14A An offence under any of the following provisions of the Air Weapons and Licensing (Scotland) Act 2015 (asp 10)—
(a) section 2 (requirement for air weapon certificate);
(b) section 24 (restrictions on sale etc of air weapons).”
This amendment adds the specified offences to Schedule 7, with the effect that content amounting to those offences counts as priority illegal content.
Amendment 123, in schedule 7, page 185, line 8, at end insert—
“16A An offence under any of the following provisions of the Sexual Offences (Northern Ireland) Order 2008 (S.I. 2008/1769 (N.I. 2))—
(a) Article 62 (causing or inciting prostitution for gain);
(b) Article 63 (controlling prostitution for gain).”—(Chris Philp.)
This amendment adds the specified offences to Schedule 7, with the effect that content amounting to those offences counts as priority illegal content.
Schedule 7 sets out the list of criminal content that in-scope firms will be required to remove as a priority. Labour was pleased to see new additions to the most recent iteration, including criminal content relating to online drug and weapons dealing, people smuggling, revenge porn, fraud, promoting suicide and inciting or controlling prostitution for gain. The Government’s consultation response suggests that the systems and processes that services may use to minimise illegal or harmful content could include user tools, content moderation and recommendation procedures.
More widely, although we appreciate that the establishment of priority offences online is the route the Government have chosen to go down with the Bill, we believe the Bill remains weak in addressing harms to adults and wider societal harms. It has seemingly missed a number of known harms to both adults and children, which we feel is a serious omission. Three years on from the White Paper, the Government know where the gaps are, yet they have failed to address them. That is why we are pleased to support the amendment tabled by the hon. Members for Ochil and South Perthshire and for Aberdeen North.
Human trafficking offences are a serious omission from schedule 7 that must urgently be rectified. As we all know from whistleblower Frances Haugen’s revelations, Facebook stands accused, among a vast array of social problems, of profiting from the trade and sale of human beings—often for domestic servitude—by human traffickers. We also know that, according to internal documents, the company has been aware of the problem since at least 2018. As the hon. Member for Ochil and South Perthshire said, we know that a year later, on the heels of a BBC report that documented the practice, the problem was said to be so severe that Apple itself threatened to pull Facebook and Instagram from its app store. It was only then that Facebook rushed to remove content related to human trafficking and made emergency internal policy changes to avoid commercial consequences described as “potentially severe” by the company. However, an internal company report detailed that the company did not take action prior to public disclosure and threats from Apple—profit over people.
In a complaint to the US Securities and Exchange Commission first reported by The Wall Street Journal, whistleblower Haugen wrote:
“Investors would have been very interested to learn the truth about Facebook almost losing access to the Apple App Store because of its failure to stop human trafficking on its products.”
I cannot believe that the Government have failed to commit to doing more to tackle such abhorrent practices, which are happening every day. I therefore urge the Minister to do the right thing and support amendment 90.
The first thing to make clear to the Committee and anyone listening is that, of course, offences under the Modern Slavery Act 2015 are brought into the scope of the illegal content duties of this Bill through clause 52(4)(d), because such offences involve an individual victim.
Turning to the priority offences set out in schedule 7—I saw this when I was a Home Office Minister—modern slavery is generally associated with various other offences that are more directly visible and identifiable. Modern slavery itself can be quite hard to identify. That is why our approach is, first, to incorporate modern slavery as a regular offence via clause 52(4)(d) and, secondly, to specify as priority offences those things that are often identifiable symptoms of it and that are feasibly identified. Those include many of the offences listed in schedule 7, such as causing, inciting or controlling prostitution for gain, as in paragraph 16 on sexual exploitation, which is often the manifestation of modern slavery; money laundering, which is often involved where modern slavery takes place; and assisting illegal immigration, because modern slavery often involves moving somebody across a border, which is covered in paragraph 15 on assisting illegal immigration, as per section 25 of the Immigration Act 1971.
Modern slavery comes into scope directly via clause 52(4)(d) and because the practicably identifiable consequences of modern slavery are listed as priority offences, I think we do have this important area covered.
I appreciate that the Minister thinks that there are other measures that cover this offence, but will he keep it under consideration going forward? I do not think that that is too much to ask. Part of the logic behind that is that some of the other issues, where the reasons behind them must be proved, are much more difficult to define or prove than the modern slavery offences that we are asking to be added here. Whether he accepts the amendment or not, will he commit to considering the matter and not just saying, “Absolutely no”? That would be helpful for us and the many organisations that are keen for such things to be included.
I am happy to give that further consideration, but please do not interpret that as a firm commitment. I repeat that the Modern Slavery Act is brought into the scope of this Bill via clause 52(4)(d).
I have nothing further to add. I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Schedule 7, as amended, agreed to.
Clause 53
“Content that is harmful to children” etc
I have a couple of questions for the Minister. The first is about the interaction of subsection (4)(c) and subsection (5). I am slightly confused about how that works, because subsection (4)(c) states that anything that is not primary priority content or priority content but is harmful to
“an appreciable number of children”
is included as
“content that is harmful to children”.
That is completely reasonable. However, subsection (5) excludes illegal content and content with a “potential financial impact”. I appreciate that these provisions are drafted in quite a complicated way, but it would be useful to have an understanding of what that means. If content is excluded from being treated as harmful simply because it is financial in nature, that is a problem, because that explicitly excludes gambling-type sites, loot boxes and anything of that sort, which by their nature are intentionally addictive and try to get children or adults to part with significant amounts of cash.
How will clause 53 be future-proofed? I am not suggesting that there is no future proofing, but it would be helpful to me and fellow Committee members if the Minister explained how the clause will deal with new emerging harms and things that may not necessarily fall within the definitions that we set initially. How will those definitions evolve and change as the internet evolves and changes, and as the harms with which children are presented evolve and change?
And finally—I know that the Minister mentioned earlier that saying, “And finally”, in a speech is always a concern, but I am saying it—I am slightly concerned about the wording in subsection (4)(c), which refers to
“material risk of significant harm to an appreciable number of children”,
because I am not clear what an “appreciable number” is. If content that is incredibly harmful to children is stumbled upon by only one child and causes that child significant harm, is it okay for the provider to host such content? It is not likely to be accessed by an “appreciable number of children” and might be accessed by only a small number, but if the Minister could give us an understanding of what the word “appreciable” means in that instance, that would be greatly appreciated.
There are one or two points to pick up on. A question was raised about algorithms, and it is worth saying that the risk assessments that platforms must undertake will include consideration of the operation of algorithms. It is important to make it absolutely clear that that is the case.
The shadow Minister asked about the definition of harm, and whether all the harms that might concern Parliament, and many of us as parents, will be covered. It may be helpful to refer to the definition of harm provided in clause 187, at the top of page 153. Committee members will note that the definition is very wide and that subsection (2) defines it as “physical or psychological harm”, so I hope that partly answers the shadow Minister’s question.
I am jumping ahead a bit, but I know that we will discuss clause 150, Zach’s law and epilepsy in particular at some point. Given the definition that my hon. Friend has just cited, am I correct to assume that the physical harm posed to those with epilepsy who might be targeted online will be covered, and that it is not just about psychological harm?
I admire my hon. Friend’s attention to the debate. The definition of harm for the harmful communications offence in clause 150 is set out in clause 150(4). In that context, harm is defined slightly differently, as
“psychological harm amounting to at least serious distress”.
The definition of harm in clause 187 that I read out is the definition of harm used elsewhere in the Bill. However, as I said before in the House and in the evidence session, the Government’s belief and intention is that epilepsy trolling would fall within the scope of clause 150, because giving someone an epileptic fit is physically damaging, as my hon. Friend said, but also causes psychological harm.
Despite the fact that the definition of harm in clause 187 does not apply in clause 150, which has its own definition of harm, I am absolutely categoric that epilepsy trolling is caught by clause 150 because of the psychological harm it causes. I commend my hon. Friend the Member for Watford for being so attentive on the question of epilepsy, and also in this debate.
Returning to the definition of harm in clause 187, besides the wide definition covering physical and psychological harm, clause 187(4) makes it clear that harm may also arise not just directly but if the content prompts individuals to
“act in a way that results in harm to themselves or that increases the likelihood of harm to themselves”.
Clause 187(4)(b) covers content where the
“individuals do or say something to another individual that results in”
that individual suffering harm. I hope the shadow Minister is reassured that the definition of harm that applies here is extremely wide in scope.
There was a question about media literacy, which I think the hon. Member for Batley and Spen raised in an intervention. Media literacy duties on Ofcom already exist in the Communications Act 2003. The Government published a comprehensive and effective media literacy strategy about a year ago. In December—after the first version of the Bill was produced, but before the second and updated version—Ofcom updated its policy in a way that went beyond the duties contained in the previous version of the Bill. From memory, that related to the old clause 103, in the version of the Bill published in May last year, which is of course not the same clause in this version of the Bill, as it has been updated.
The hon. Member for Aberdeen North raised, as ever, some important points of detail. She asked about future proofing. The concept of harm expressed in the clause is a general concept of harm. The definition of harm is whatever is harmful to children, which includes things that we do not know about at the moment and that may arise in the future. Secondly, primary priority content and priority content that is harmful can be updated from time to time by a statutory instrument. If some new thing happens that we think deserves to be primary priority content or priority content that is harmful to children, we can update that using a statutory instrument.
The hon. Lady also asked about the exclusions in clause 53(5). The first exclusion, in subsection (5)(a), is illegal content, which is excluded because it is already covered elsewhere in the Bill, in clause 52. The second limb, subsection (5)(b), covers certain financial offences, which are excluded because financial services are separately regulated. The hon. Lady used the example of gambling, which is separately regulated by the Gambling Act 2005, a review of which is imminent. There are already very strong provisions in that Act, enforced by the regulator, the Gambling Commission, including a hard-edged prohibition on gambling by under-18s.
However, I do not think that loot boxes even existed in 2005 when that Act was passed. Loot boxes are gambling. They may not be covered by that legislation, but they are gambling. Will the Minister consider whether those harms are unintentionally excluded by clause 53?
We are getting into some detail here. In the unlikely event that any member of the Committee does not know what a loot box is, it is where someone playing a game can buy extra lives or enhance the game’s functionality somehow by paying some money. There have been some cases where children have stolen their parent’s credit card and bought these things in large numbers.
Having played lots of games, I can clarify that people do not know what they are getting with a loot box, so they are putting money forward but do not know whether they will get a really good piece of armour or a really crap piece of armour. It is literally gambling, because children do not know what will come out of the box, as opposed to just buying a really good piece of armour for £2.99 with their parent’s credit card.
However, the reward is non-monetary in nature. For that reason, the Government’s view—if I can test your patience momentarily, Sir Roger, as we are straying somewhat outside this particular debate—is that loot boxes will not be covered by the gambling review, because we do not see them as gambling. However, we do see them as an issue that needs to be addressed, and that will happen via the online advertising programme, which will be overseen by the Minister for Media, Data and Digital Infrastructure, my hon. Friend the Member for Hornchurch and Upminster (Julia Lopez). That will happen shortly and advertising legislation will follow, so loot boxes will be addressed in the online advertising programme and the subsequent legislation.
The other question raised by the hon. Member for Aberdeen North was about the definition of “an appreciable number”. I have a couple of points to make. By definition, anything that is illegal is covered already in schedule 7 or through clause 52(4)(d), which we have mentioned a few times. Content that is
“primary priority content that is harmful to children”
or
“priority content that is harmful to children”
is covered in clause 53(4)(a) and (b), so we are now left with the residue of stuff that is neither illegal nor primary priority content; it is anything left over that might be harmful. By definition, we have excluded all the serious harms already, because they would be either illegal or in the priority categories. We are left with the other stuff. The reason for the qualifier “appreciable” is to make sure that we are dealing only with the residual non-priority harmful matters. We are just making sure that the duty is reasonable. What constitutes “appreciable” will ultimately be set out in Ofcom guidance, but if content affected only a tiny handful of users and was not a priority harm—and was therefore not considered by Parliament to be of the utmost priority—the duty would be unlikely to apply to such a very small number. Because it is just the residual category, that is a proportionate and reasonable approach to take.
Given the Government’s ability to designate priority content and primary priority content through secondary legislation, the Minister is telling me that if they decided that loot boxes were not adequately covered by the future legislation coming through, and they were to discover that something like this was a big issue, they could add that to one of the two priority content designations.
The hon. Member is asking me a somewhat technical question, and I hesitate to answer without taking full advice, but I think the answer is yes. The reason that loot boxes are not considered gambling in our view is that they do not have a monetary value, so the exclusion in clause 53(5)(b)(i) does not apply. On a quick off-the-cuff reading, it does not strike me immediately that the exclusions in (5)(b)(ii) or (iii) would apply to loot boxes either, so I believe—and officials who know more about this than I do are nodding—that the hon. Lady is right to say that it would be possible for loot boxes to become primary priority content or priority content by way of a statutory instrument. Yes, my belief is that that would be possible.
Question put and agreed to.
Clause 53 accordingly ordered to stand part of the Bill.
Clause 54
“Content that is harmful to children” etc
I beg to move amendment 83, in clause 54, page 50, line 39, at end insert—
“(2A) Priority content designated under subsection (2) must include content that contains health-related misinformation and disinformation, where such content is harmful to adults.”
This amendment would amend Clause 54 so that the Secretary of State’s designation of “priority content that is harmful to adults” must include a description of harmful health related misinformation or disinformation (as well as other priority content that might be designated in regulations by the Secretary of State).
The Bill requires category 1 service providers to set out how they will tackle harmful content on their platforms. In order for this to work, certain legal but harmful content must be designated in secondary legislation as
“priority content that is harmful to adults.”
As yet, however, it is not known what will be designated as priority content, or when. There have been indications from the Government that health-related misinformation and disinformation will likely be included, but there is no certainty. The amendment would ensure that harmful health-related misinformation and disinformation would be designated as priority content that is harmful to adults.
Thank you, Sir Roger. I think that the Minister would agree that this is probably one of the most contentious parts of the Bill. It concerns legal but harmful content, which is causing an awful lot of concern out there. The clause says that the Secretary of State may in regulations define as
“priority content that is harmful to adults”
content that he or she considers to present
“a material risk of significant harm to an appreciable number of adults”.
We have discussed this issue in other places before, but I am deeply concerned about freedom of speech and people being able to say what they think. What is harmful to me may not be harmful to any other colleagues in this place. We would be leaving it to the Secretary of State to make that decision. I would like to hear the Minister’s thoughts on that.
I am very happy to reply to the various queries that have been made. I will start with the points on vaccine disinformation raised by the hon. Members for Ochil and South Perthshire and for Pontypridd. The Government strongly agree with the points they made about the damaging effects of vaccine misinformation and the fact that many of our fellow citizens have probably died as a result of being misled into refusing the vaccine when it is, of course, perfectly safe. We strongly share the concerns they have articulated.
Over the past two years, the Department for Digital, Culture, Media and Sport has worked together with other Departments to develop a strong operational response to this issue. We have established a counter-disinformation unit within DCMS whose remit is to identify misinformation and work with social media firms to get it taken down. The principal focus of that unit during the pandemic was, of course, covid. In the past three months, it has focused more on the Russia-Ukraine conflict, for obvious reasons.
In some cases, Ministers have engaged directly with social media firms to encourage them to remove content that is clearly inappropriate. For example, in the Russia-Ukraine context, I have had conversations with social media companies that have left up clearly flagrant Russian disinformation. This is, therefore, an area that the Government are concerned about and have been acting on operationally already.
Obviously, we agree with the intention behind the amendment. However, the way to handle it is not to randomly drop an item into the Bill and leave the rest to a statutory instrument. Important and worthy though it may be to deal with disinformation, and specifically harmful health-related disinformation, there are plenty of other important things that one might add that are legal but harmful to adults, so we will not accept the amendment. Instead, we will proceed as planned by designating the list via a statutory instrument. I know that a number of Members of Parliament, probably including members of this Committee, would find it helpful to see a draft list of what those items might be, not least to get assurance that health-related misinformation and disinformation is on that list. That is something that we are considering very carefully, and more news might be forthcoming as the Bill proceeds through Parliament.
My hon. Friend has talked about the Department’s counter-disinformation unit. Do the Government anticipate that that function will continue, or will they expect Ofcom to do it?
The work of the counter-disinformation unit is valuable. We look at these things on a spending review by spending review basis, and as far as I am aware we intend to continue with the counter-disinformation unit over the current spending review period. Clearly, I cannot commit future Ministers in perpetuity, but my personal view—if I am allowed to express it—is that that unit performs a useful function and could valuably be continued into the future. I think it is useful for the Government, as well as Ofcom, to directly have eyes on this issue, but I cannot speak for future Ministers. I can only give my right hon. Friend my own view.
I hope that I have set out my approach. We have heard the calls to publish the list so that parliamentarians can scrutinise it, and we also heard them on Second Reading.
I will now turn to the question raised by my hon. Friend the Member for Don Valley regarding freedom of expression. Those on one side of the debate are asking us to go further and to be clearer, while those on the other side have concerns about freedom of expression. As I have said, I honestly do not think that these legal but harmful provisions infringe on freedom of speech, for three reasons. First, even when the Secretary of State decides to designate content and Parliament approves of that decision through the affirmative procedure—Parliament gets to approve, so the Secretary of State is not acting alone—that content is not being banned. The Bill does not say that content designated as legal but harmful should immediately be struck from every corner of the internet. It simply says that category 1 companies—the big ones—have to do a proper risk assessment of that content and think about it properly.
Secondly, those companies have to have a policy to deal with that content, but that policy is up to them. They could have a policy that says, “It is absolutely fine.” Let us say that health disinformation is on the list, as one would expect it to be. A particular social media firm could have a policy that says, “We have considered this. We know it is risky, but we are going to let it happen anyway.” Some people might say that that is a weakness in the Bill, while others might say that it protects freedom of expression. It depends on one’s point of view, but that is how it works. It is for the company to choose and set out its policy, and the Bill requires it to enforce it consistently. I do not think that the requirements I have laid out amount to censorship or an unreasonable repression of free speech, because the platforms can still set their own terms and conditions.
There is also the general duty to have regard to free speech, which is introduced in clause 19(2). At the moment, no such duty exists. One might argue that the duty could be stronger, as my hon. Friend suggested previously, but it is unarguable that, for the first time ever, there is a duty on the platforms to have regard to free speech.
The argument has been made that the social media companies are doing this anyway, but two wrongs do not make a right. We need to stop them doing it. I understand what we are trying to do here. We can see straight away that the Opposition want to be tighter on this. At a later date, if the Bill goes through as it is, freedom of speech will be gradually suppressed, and I am really concerned about that. My hon. Friend said that it would come back to Parliament, which I am pleased about. Are the priorities going to be written into the Bill? Will we be able to vote on them? If the scope is extended at any point in time, will we be able to vote on that, or will the Secretary of State just say, “We can’t have that so we’re just going to ban it”?
I will answer the questions in reverse order. The list of harms will not be in the Bill. The amendment seeks to put one of the harms in the Bill but not the others. So no, it will not be in the Bill. The harms—either the initial list or any addition to or subtraction from the list—will be listed in an affirmative statutory instrument, which means that the House will be able to look at it and, if it wants, to vote on it. So Parliament will get a chance to look at the initial list, when it is published in an SI. If anything is to be added in one, two or three years’ time, the same will apply.
So will we be able to vote on any extension of the scope of the Bill at any time? Will that go out to public consultation as well?
Yes. There is an obligation on the Secretary of State to consult—[Interruption.] Did I hear someone laugh?—before proposing a statutory instrument to add things. There is a consultation first and then, if extra things are going to be added—in my hon. Friend’s language, if the scope is increased—that would be votable by Parliament because it is an affirmative SI. So the answer is yes to both questions. Yes there will be consultation in advance, and yes, if this Government or a future Government wanted to add anything, Parliament could vote on it if it wanted to because it will be an affirmative SI. That is a really important point.
In a moment; I want to answer the other point made by my hon. Friend the Member for Don Valley first. He said that two wrongs do not make a right. I am not defending the fact that social media firms act in a manner that is arbitrary and censorious at the moment. I am not saying that it is okay for them to carry on. The point that I was making was a different one. I was saying that they act censoriously and arbitrarily at times at the moment. The Bill will diminish their ability to do that in a couple of ways. First, for the legal but harmful stuff, which he is worried about, they will have a duty to act consistently. If they do not, Ofcom will be able to enforce against them. So their liberty to behave arbitrarily, for this category of content at least, will be circumscribed. They will now have to be consistent. For other content that is outside the scope of this clause—which I guess therefore does not worry my hon. Friend—they can still be arbitrary, but for this they have got to be consistent.
There is also the duty to have regard to freedom of expression, and there is a protection of democratic and journalistic importance in clauses 15 and 16. Although those clauses are not perfect and some people say they should be stronger, they are at least better than what we have now. When I say that this is good for freedom of speech, I mean that nothing here infringes on freedom of speech, and to the extent that it moves one way or the other, it moves us somewhat in the direction of protecting free speech more than is the case at the moment, for the reasons I have set out. I will be happy to debate the issue in more detail either in this Committee or outside, if that is helpful and to avoid trying the patience of colleagues.
I thank the Minister for giving way; I think that is what he was doing as he sat down.
Just for clarity, the hon. Member for Don Valley and the Minister have said that Labour Members are seeking to curtail or tighten freedom of expression and freedom of speech, but that is not the case. We fundamentally support free speech, as we always have done. The Bill addresses systems and processes, and that is what it should do—the Minister, the Labour party and I are in full alignment on that. We do not think that the Bill should restrict freedom of speech. I would just like to put that on the record.
We also share the concerns expressed by the hon. Member for Don Valley about the Secretary of State’s potential powers, the limited scope and the extra scrutiny that Parliament might have to undertake on priority harms, so I hope he will support some of our later amendments.
I am grateful to the shadow Minister for confirming her support for free speech. Perhaps I could take this opportunity to apologise to you, Sir Roger, and to Hansard for turning round. I will try to behave better in future.
I find myself not entirely reassured, so I think we should press the amendment to a vote.
Question put, That the amendment be made.
I have heard my right hon. Friend’s points about a standing Joint Committee for post-legislative implementation scrutiny. On the comments about the time, I agree that the Ofcom review needs to be far enough into the future that it can be meaningful, hence the three-year time period.
On the substance of amendment 62, tabled by the shadow Minister, I can confirm that the Government are already undertaking research and working with stakeholders on identifying what the priority harms will be. That consideration includes evidence from various civil society organisations, victims’ organisations and many others who represent the interests of users online. The wider consultation beyond Ofcom that the amendment would require is happening already as a matter of practicality.
We are concerned, however, that making this a formal consultation in the legal sense, as the amendment would, would introduce some delays, because a whole sequence of things has to happen after Royal Assent. First, we have to designate the priority harms by statutory instrument, and then Ofcom has to publish its risk assessments and codes of practice. If we inserted into that a formal legal consultation step, it would add at least four or even six months to the process of implementing the Act. I know that that was not the hon. Lady’s intention and that she is concerned about getting the Act implemented quickly. For that reason, the Government do not want to insert a formal legal consultation step into the process, but I am happy to confirm that we are engaging in the consultation already on an informal basis and will continue to do so. I ask respectfully that amendment 62 be withdrawn.
The purpose of clauses 55 and 56 has been touched on already, and I have nothing in particular to add.
I am grateful for the Minister’s comments on the time that these things would take. I cannot see why they could not happen concurrently with the current consultation, or why it would take an additional four to six months. Could he clarify that?
A formal statutory consultation could happen only after the passage of the Bill, whereas the informal non-statutory consultation we can do, and are doing, now.
Question put, That the amendment be made.
I appreciate that clarification. I just wanted to make it absolutely clear that I strongly believe that anonymity is a very good protection, not just for people who intend to do bad on the internet, but particularly for people who are seeking out community. I think that that is important.
If you will allow me to say a couple of things about the next clause, Sir Roger, Mencap raised the issue of vulnerable users, specifically vulnerable adult users, in relation to the form of identity verification. If the Minister or Ofcom could give consideration to perhaps including travel passes or adult passes, it might make the internet a much easier place to navigate for people who do not have control of their own documentation—they may not have access to their passport, birth certificate, or any of that sort of thing—but who would be able to provide a travel pass, because that is within their ownership.
We have heard quite a lot about the merits of clause 57, and I am grateful to colleagues on both sides for pointing those out. The hon. Member for Pontypridd asked about the effectiveness of the user identity verification processes and how those might occur—whether they would be done individually by each company for their own users, or whether a whole industry would develop even further, with third parties providing verification that could then be used across a whole number of companies.
Some of those processes exist already in relation to age verification, and I think that some companies are already active in this area. I do not think that it would be appropriate for us, in Parliament, to specify those sorts of details. It is ultimately for Ofcom to issue that guidance under clause 58, and it is, in a sense, up to the market and to users to develop their own preferences. If individual users prefer to verify their identity once and then have that used across multiple platforms, that will itself drive the market. I think that there is every possibility that that will happen. [Interruption.]
Order. There is a Division on the Floor of the House. The Committee will sit again in 15 minutes. As far as I am aware, there will only be one vote on this; if there are two, we will return 15 minutes later than that.
I was just concluding my remarks on clause stand part, Sir Roger. User choice and Ofcom guidance will ultimately determine the shape of this market.
The shadow Minister, the hon. Member for Pontypridd, expressed concerns about privacy. That is of course why the list of people Ofcom must consult—at clause 58(3)(a)—specifies the Information Commissioner, to ensure that Ofcom’s guidance properly protects the privacy of users, for the reasons that the shadow Minister referred to in her speech.
Finally, on competition, if anyone attempts to develop an inappropriate monopoly position in this area, the Competition and Markets Authority’s usual powers will apply. On that basis, I commend the clause to the Committee.
Question put and agreed to.
Clause 57 accordingly ordered to stand part of the Bill.
Clause 58
OFCOM’s guidance about user identity verification
Question proposed, That the clause stand part of the Bill.
As we have said previously, it is absolutely right that Ofcom produces guidance for providers of category 1 services to assist with their compliance with the duty. We very much welcome the inclusion and awareness of identity verification forms for vulnerable adult users in subsection (2); once again, however, we feel that that should go further, as outlined in new clause 8.
Clause 58, which was touched on in our last debate, simply sets out Ofcom’s duty to publish guidance for category 1 services to assist them in complying with the user identification duty set out in clause 57. We have probably covered the main points, so I will say nothing further.
Question put and agreed to.
Clause 58 accordingly ordered to stand part of the Bill.
Clause 59
Requirement to report CSEA content to the NCA
Question proposed, That the clause stand part of the Bill.
You are really moving us at pace, Sir Roger. It is a pleasure to serve in Committee with you in the Chair.
It is welcome that regulated services will have to report all child sexual exploitation and abuse material that they detect on their platform. The Government’s decision to move away from the approach of a regulatory code of practice to a mandatory reporting requirement is an important improvement to the draft Bill.
For companies to report child sexual exploitation and abuse material correctly to the mandatory reporting body, they will need access to accurate datasets that will determine whether something that they intend to report is child sexual exploitation and abuse content. What guidance will be made available to companies so that they can proactively detect CSEA? The impact assessment mentions that, for example, BT is planning to use the Internet Watch Foundation’s hash list, which is compliant with UK law enforcement standards, to identify CSEA proactively. Hashing is a technology used to prevent access to known CSEA; a hash is a unique string of letters and numbers that is applied to an image and can then be matched every time a user attempts to upload a known illegal image to a platform. It relies, however, on CSEA already having been detected. What plans are in place to assist companies to identify potential CSEA that has not previously been identified?
Finally, it is important that the introduction of mandatory reporting does not impact on existing international reporting structures. Many of the largest platforms in the scope of the Bill are US-based and are required under US law to report CSEA material detected on their platform to the National Center for Missing and Exploited Children, which ensures that information relevant to UK law enforcement is referred to UK agencies for investigation.
To answer the shadow Minister’s question about the duty to detect CSEA proactively—because, as she says, we have to detect it before we can report it—I confirm that there are already duties in the Bill to prevent and detect CSEA proactively, because CSEA is a priority offence in the schedule 6 list of child sexual exploitation and abuse offences, and there is a duty for companies to prevent those proactively. In preventing them proactively, they will by definition identify them. That part of her question is well covered.
The hon. Lady also asked about the technologies available to those companies, including hash matching—comparing images against a known database of child sexual exploitation images. A lot of technology is being developed that can proactively spot child sexual exploitation in new images that are not on the hash matching database. For example, some technology combines age identification with nude image identification; by putting them together, we can identify sexual exploitation of children in images that are new and are not yet in the database.
To ensure that such new technology can be used, we have the duties under clause 103, which gives Ofcom the power to mandate—to require—the use of certain accredited technologies in fighting not just CSEA, but terrorism. I am sure that we will discuss that more when we come to that clause. Combined, the requirement to proactively prevent CSEA and the ability to specify technology under clause 103 will mean that companies will know about the content that they now, under clause 59, have to report to the National Crime Agency. Interestingly, the hon. Member for Worsley and Eccles South mentioned that that duty already exists in the USA, so it is good that we are matching that requirement in our law via clause 59, which I hope that the Committee will agree should stand part of the Bill.
Question put and agreed to.
Clause 59 accordingly ordered to stand part of the Bill.
Clause 60
Regulations about reports to the NCA
Question proposed, That the clause stand part of the Bill.
The hon. Member for Worsley and Eccles South asks about the prioritisation of reports made to the NCA under the new statutory provisions. The prioritisation of investigations is an operational matter for the NCA, acting as a law enforcement body. I do not think it would be right either for me as a Minister or for Parliament as a legislative body to specify how the NCA should conduct its operational activities. I imagine that it would pursue the most serious cases as a matter of priority, and if there is evidence of any systemic abuse it would also prioritise that, but it really is a matter for the NCA, as an operationally independent police force, to decide for itself. I think it is fairly clear that the scope of matters to be contained in these regulations is fairly comprehensive, as one would expect.
On the questions raised by the hon. Member for Aberdeen North, the Secretary of State might consult Scottish Ministers under clause 63(6)(c), particularly those with responsibility for law enforcement in Scotland, and the same would apply to other jurisdictions. On whether an amendment is required to cover any matters to do with the procedures in Scotland equivalent to the matter covered in clause 61, we do not believe that any equivalent change is required to devolved Administration law. However, in order to be absolutely sure, we will get the hon. Lady written confirmation on that point.
I am not sure that the Minister has answered my question on clause 60. I think we all agree that law enforcement agencies can decide their own priorities, quite rightly, but clause 60(2)(d) sets out that the Secretary of State’s regulations must include
“provision about cases of particular urgency”.
I asked the Minister what that would look like.
Also, we think it is pretty important that the National Crime Agency, the Internet Watch Foundation and Ofcom work together on mandatory reporting. I asked him how he envisaged them working together to share information, because the better they do that, the more children are protected.
I apologise for missing those two points. On working together, the hon. Lady is right that agencies such as the Internet Watch Foundation and others should co-operate closely. There is already very good working between the Internet Watch Foundation, law enforcement and others—they seem to be well networked together and co-operating closely. It is appropriate to put on the record that Parliament, through this Committee, thinks that co-operation should continue. That communication and the sharing of information on particular images is obviously critical.
As the clause states, the regulations can set out expedited timeframes in cases of particular urgency. I understand that to mean cases where there might be an immediate risk to a child’s safety, or where somebody might be at risk in real time, as opposed to something historic—for example, an image that might have been made some time ago. In cases where it is believed abuse is happening at the present time, there is an expectation that the matter will be dealt with immediately or very close to immediately. I hope that answers the shadow Minister’s questions.
Question put and agreed to.
Clause 60 accordingly ordered to stand part of the Bill.
Clause 61 ordered to stand part of the Bill.
Clause 62
Offence in relation to CSEA reporting
I beg to move amendment 1, in clause 62, page 55, line 14, leave out “maximum summary term for either-way offences” and insert “general limit in a magistrates’ court”.
Amendments 1 to 5 relate to the maximum term of imprisonment on summary conviction of an either-way offence in England and Wales. Amendments 1 to 4 insert a reference to the general limit in a magistrates’ court, meaning the limit set out in section 224(1) of the Sentencing Code, which is currently 12 months.
With this it will be convenient to consider Government amendments 4, 2, 3 and 5.
These amendments make some technical drafting changes to the Bill in relation to sentencing penalties for either-way offences in the courts of England and Wales. They bring the Bill into line with recent changes implemented following the passage of the Judicial Review and Courts Act 2022. The change uses the new term
“general limit in a magistrates’ court”
to account for any future changes to the sentencing limit in the magistrates’ court. The 2022 Act includes a secondary power to switch, by regulations, between a 12-month and a six-month maximum sentence in the magistrates’ court, so we need to use the more general language in this Bill to ensure that changes back and forth can be accommodated. If we simply fixed a number, it would fall out of sync whenever a switch was made under the 2022 Act.
Amendment 1 agreed to.
Question proposed, That the clause, as amended, stand part of the Bill.
Clause 62 creates an offence, as we discussed earlier, of knowingly or recklessly providing inaccurate information to the NCA in relation to CSEA reporting, the penalty for which is imprisonment, a fine or both. Where a company seeks to evade its responsibility, or disregards the importance of the requirement to report CSEA by providing inaccurate information, it will be liable for prosecution. We are backing the requirement to report CSEA with significant criminal powers.
Clause 63 provides definitions for the terms used in chapter 2 of part 4, in relation to the requirement to report CSEA. In summary, a UK provider of a regulated service is defined as a provider that is
“incorporated or formed under the law of any part of the United Kingdom”
or where it is
“individuals who are habitually resident in the United Kingdom”.
The shadow Minister asked about the test and what counts, and I hope that provides the answer. We are defining CSEA content as content containing CSEA of which a company becomes aware. A company can become aware of that by any means, including through the use of automated systems and processes, human moderation or user reporting.
With regard to the definition of UK-linked CSEA, which the shadow Minister also asked about, that refers to content that may have been published and shared in the UK, or where a suspected offender or victim is located in the UK. The definition of what counts as a UK link is quite wide, because it includes not only the location of the offender or victim but where the content is shared.
I have a specific question—the Minister answered a similar question from me earlier. The Bill says that the location of the child “is” in the UK. Would it be reasonable to expect that if a company suspected the child “was” in the UK, although not currently, that would be in scope as something required to be reported? I know that is technical, but if the “was” is included in the “is” then that is much wider and more helpful than just including the current location.
If the child had been in the UK when the offence was committed, that would ordinarily be subject to UK criminal law, because the crime would have been committed in the UK. The test is: where was the child or victim at the time the offence was committed? As I said a moment ago, however, the definition of “UK-linked” is particularly wide and includes
“the place where the content was published, generated, uploaded or shared.”
The word “generated”—I am reading from clause 63(6)(a), at the top of page 56—is clearly in the past tense and would include the circumstance that the hon. Lady described.
What the Minister has said is helpful, but the question I asked was about what guidance and support will be made available to regulated services. We all want this to work, because it is one of the most important aspects of the Bill—many aspects are important. He made it clear to us that the definition is quite wide, for both the general definitions and the “UK-linked” content. The point of the question was, given the possible difficulties in some circumstances, what guidance and support will be made available?
I anticipate that the National Crime Agency will issue best practice guidance. A fair amount of information about the requirements will also be set out in the regulations that the Secretary of State will issue under clause 60, which we have already debated. So it is a combination of those regulations and National Crime Agency best practice guidance. I hope that answers the question.
Finally, on companies being taken over, if a company ceases to be UK-linked, we would expect it to continue to discharge its reporting duties, which might include reporting not just in the UK but to its domestic reporting agency—we have already heard the US agency described and referenced.
I hope that my answers demonstrate that the clause is intended to be comprehensive and effective. It should ensure that the National Crime Agency gets all the information it needs to investigate and prosecute CSEA in order to keep our children safe.
Question put and agreed to.
Clause 62, as amended, accordingly ordered to stand part of the Bill.
Clause 63 ordered to stand part of the Bill.
Clause 64
Transparency reports about certain Part 3 services
I beg to move amendment 54, in clause 64, page 56, line 29, leave out “Once” and insert “Twice”.
This amendment would change the requirement for transparency report notices from once a year to twice a year.
To start with, it is worth saying that clause 64 is extremely important. In the course of debating earlier clauses, Opposition Members rightly and repeatedly emphasised how important it is that social media platforms are compelled to publish information. The testimony that Frances Haugen gave to the Joint Committee and to this Committee a few weeks ago demonstrates how important that is. Social media platforms are secretive and are not open. They seek to disguise what is going on, even though the impact of what they are doing has a global effect. So the transparency power in clause 64 is a critical part of the Bill, and it will dramatically transform the insight that parliamentarians, the wider public, civil society campaigners and academics have into what is going on inside these companies.
Amendment 54 seeks to increase the frequency of transparency reporting from once a year to twice a year. To be honest, we do not want to do this unreasonably frequently, and our sense is that once a year, rather than twice a year, is the right regularity. We therefore do not support the amendment. However, Members will notice that there is an ability in clause 64(12) for the Secretary of State, by regulation, to
“amend subsection (1) so as to change the frequency of the transparency reporting process.”
If it turns out in due course that once a year is not enough and we would like to do it more frequently—for example, twice a year—there is the power for those regulations to be used so that the reporting occurs more frequently. The frequency is not set in stone.
I turn to amendment 55, which sets out a number of topics that would be included in reporting. It is important to say that, as a quick glance at schedule 8 shows, the remit of the reports is already extremely wide in scope. Hon. Members will see that paragraph 5 specifies that reports can cover
“systems and processes for users to report content which they consider to be illegal”
or “harmful”, and so on. Paragraph 6 mentions:
“The systems and processes that a provider operates to deal with illegal content, content that is harmful to children”,
and so on. Therefore, the topics that amendment 55 speaks to are already covered by the schedule, and I would expect such things to be reported on. We have given Ofcom the explicit powers to do that and, rather than prescribe such details in the Bill, we should let Ofcom do its job. It certainly has the powers to do such things—that is clearly set out in the schedule—and I would expect, and obviously the Opposition would expect, that it will do so. On that basis, I will gently resist amendments 54 and 55.
On amendment 55, I want to come back to the Minister on two points about languages that were made by the hon. Member for Aberdeen North. I think most people would be shocked to discover that safety systems do not operate in many languages, so people speaking a language other than English will not be protected. I also think that people will be shocked about, as I outlined, the employment of moderators and how badly they are paid and trained. There are factories full of people doing that important task.
I recommend that the Minister thinks again about requiring Ofcom to provide details on human moderators who are employed or engaged and how they are trained and supported. It is a bit like when we find out about factories producing various items under appalling conditions in other parts of the world—we need transparency on these issues to make people do something about it. These platforms will not do anything about it. Under questioning from my hon. Friend the Member for Pontypridd, Richard Earley admitted that he had no idea how many human moderators were working for Facebook. That is appalling and we must do something about it.
I obviously have sympathy with the objectives, but the topics covered in schedule 8, which include the systems and processes for responding to illegal and harmful content and so on, give Ofcom the power to do what the hon. Member requires. On the language point, the risk assessments that companies are required to do are hard-edged duties in the Bill, and they will have to include an assessment of languages used in the UK, which is a large number of languages—obviously, it does not include languages spoken outside the UK. So the duty to risk-assess languages already exists. I hope that gives the hon. Member reassurance. She is making a reasonable point, and I would expect that, in setting out transparency requirements, Ofcom will address it. I am sure that it will look at our proceedings to hear Parliament’s expectations, and we are giving it those powers, which are clearly set out in schedule 8.
I will just make a final point. The Bill gives Ofcom powers when it already has so much to do. We keep returning to the point of how much will ride on Ofcom’s decisions. Our amendments would make clear the requirement for transparency reporting relating to the language issue, as well as the employment of human moderators and how they are trained and supported. If we do not point that out to Ofcom, it really has enough other things to be doing, so we are asking for these points to be drawn out specifically. As in so many of our amendments, we are just asking for things to be drawn out so that they happen.
Question put, That the amendment be made.
I associate myself with the comments made by the hon. Member for Pontypridd and apologise on behalf of my hon. Friend the Member for Ochil and South Perthshire, who is currently in the Chamber dealing with the Channel 4 privatisation. I am sure that, given his position on the Joint Committee, he would have liked to comment on the clause and would have welcomed its inclusion in the Bill, but, unfortunately, he cannot currently do so.
It is a great shame that the hon. Member for Ochil and South Perthshire is occupied in the main Chamber, because I could have pointed to this change as one of the examples of the Government listening to the Joint Committee, on which he and many others served. However, I hope that the hon. Member for Aberdeen North will communicate my observation to him, which I am sure he will appreciate.
In seriousness, this is an example of the Government moving the Bill on in response to widespread parliamentary and public commentary. It is right that we extend the duties to cover commercial pornographic content as well as the user-to-user pornography covered previously. I thank the Opposition parties for their support for the inclusion of those measures.
As a member of the Joint Committee, on which I worked with the hon. Member for Ochil and South Perthshire, I thank the Minister for including this clause on a point that was debated at length by the Joint Committee. Its inclusion is crucial to organisations in my constituency such as Dignify—a charity that works to raise awareness and campaign on this important point, to protect children but also wider society. As this is one of the 66 recommendations that the Minister took forward in the Bill, I would like to thank him; it is very welcome, and I think that it will make a huge difference to children and to society.
I thank my hon. Friend for his intervention and for his work on the Joint Committee, which has had a huge impact, as we have seen. I hope that colleagues will join me in thanking the members of the Joint Committee for their work.
My final point on this important clause is in response to a question that the shadow Minister raised about clause 66(3), which makes reference to
“a person acting on behalf of the provider”.
That is just to ensure that the clause is comprehensively drafted without any loopholes. If the provider used an agent or engaged some third party to disseminate content on their behalf, rather than doing so directly, that would be covered too. We just wanted to ensure that there was absolutely no loophole—no chink of light—in the way that the clause was drafted. That is why that reference is there.
I am delighted that these clauses seem to command such widespread support. It therefore gives me great pleasure to commend them to the Committee.
Question put and agreed to.
Clause 66 accordingly ordered to stand part of the Bill.
Clause 67 ordered to stand part of the Bill.
Schedule 9 agreed to.
Clause 68
Duties about regulated provider pornographic content
I beg to move amendment 114, in clause 68, page 60, line 13, at end insert—
“(2A) A duty to verify that every individual featured in regulated provider pornographic content is an adult before the content is published on the service.
(2B) A duty to verify that every individual featured in regulated provider pornographic content that is already published on the service when this Act is passed is an adult and, where that is not the case, remove such content from the service.
(2C) A duty to verify that each individual appearing in regulated provider pornographic content has given their permission for the content in which they appear to be published or made available by the internet service.
(2D) A duty to remove regulated provider pornographic content featuring an individual if that individual withdraws their consent, at any time, to the pornographic content in which they feature remaining on the service.”
This amendment creates a duty to verify that each individual featured in pornographic content is an adult and has agreed to the content being uploaded before it is published. It would also impose a duty to remove content if the individual withdraws consent at any time.
I am grateful to the right hon. Member for her intervention. She knows that I have the utmost respect for all that she has tried to achieve in this area in the House along with my right hon. Friend the Member for Kingston upon Hull North.
We feel these amendments would address the specific issue of image or video content for which consent has not been obtained. Many of these people do not even know that the content has been taken in the first place, and it is then uploaded to these websites. It would be the website’s duty to verify that consent had been obtained and that the people in the video were adults. That is why we urge hon. Members to back the amendments.
The shadow Minister has laid out compellingly how awful it is to display on pornography websites images of children, or images of people whose consent has not been obtained. Let me take each of those in turn, because my answers will be a bit different in the two cases.
First, all material that contains the sexual abuse of children or features children at all—any pornographic content featuring children is, by definition, sexual abuse—is already criminalised through the criminal law. Measures such as the Protection of Children Act 1978, the Criminal Justice Act 1988 and the Coroners and Justice Act 2009 provide a range of criminal offences that include the taking, making, circulating, possessing with a view to distributing, or otherwise possessing indecent photos or prohibited images of children. As we would expect, everything that the hon. Lady described is already criminalised under existing law.
This part of the Bill—part 5—covers publishers and not the user-to-user stuff we talked about previously. Because they are producing and publishing the material themselves, publishers of such material are covered by the existing criminal law. What they are doing is already illegal. If they are engaged in that activity, they should—and, I hope, will—be prosecuted for doing it.
The new clause and the amendments essentially seek to duplicate what is already set out very clearly in criminal law. While their intentions are completely correct, I do not think it is helpful to create duplicative provisions that try to achieve the same thing in a different statute. We have well established and effective criminal laws in these areas.
In relation to the separate question of people whose images are displayed without their consent, which is a topic that my right hon. Friend the Member for Basingstoke has raised a few times, there are existing criminal offences that are designed to tackle that, including the recent revenge pornography offences in particular, as well as the criminalisation of voyeurism, harassment, blackmail and coercive or controlling behaviour. There is then the additional question of intimate image abuse, where intimate images are produced or obtained without the consent of the subject, and are then disseminated.
I think it would cover some of them. If, for example, someone in a relationship had a video taken that was then made available on a commercial pornography site, that would clearly be in scope. I am not saying that the revenge pornography legislation covers all examples, but it covers some of them. We have discussed already that clause 150 will criminalise a great deal of the content referred to here if the intention of that content or the communication concerned is to cause harm—meaning
“psychological harm amounting to at least serious distress”—
to the subject. That will capture a lot of this as well.
My right hon. Friend the Member for Basingstoke has made a point about needing to remove the intent requirement. Any sharing of an intimate image without consent should be criminalised. As we have discussed previously, that is being moved forward under the auspices of the Ministry of Justice in connection with the Law Commission’s proposed offence. That work is in flight, and I would anticipate it delivering legislative results. I think that is the remaining piece of the puzzle. With the addition of that piece of legislation, I think we will cover the totality of possible harms in relation to images of people whose consent has not been given.
In relation to material featuring children, the legislative pattern is complete already; it is already criminal. We do not need to do anything further to add any criminal offences; it is already illegal, as it should be. In relation to non-consensual images, the picture is largely complete. With the addition of the intimate image abuse offence that my right hon. Friend the Member for Basingstoke has been rightly campaigning for, the picture will be complete. Given that that is already in process via the Law Commission, while I again agree with what the Opposition are trying to do here, we have a process in hand that will sort this out. I hope that that makes the Government’s position on the amendments and the new clause clear.
Clause 68 is extremely important. It imposes a legally binding duty to make sure that children are not normally able to encounter pornographic content in a commercial context, and it makes it clear that one of the ways that can be achieved is by using age verification. If Ofcom, in its codes of practice, directs companies to use age verification, or if there is no other effective means of preventing children from seeing pornographic content, the clause makes it clear that age verification is expressly authorised by Parliament in primary legislation. There will be no basis upon which a porn provider could try to legally challenge Ofcom, because it is there in black and white in the Bill. It is clearly Parliament’s intention that hard-edged age verification will be legal. By putting that measure in the Bill as an example of the way that the duty can be met, we immunise the measure from legal challenge should Ofcom decide it is the only way of delivering the duty. I make that point explicitly for the avoidance of doubt, so that if this point is ever litigated, Parliament’s intention is clear.