All 15 Debates between Kirsty Blackman and Chris Philp

Online Safety Bill (Seventeenth sitting)

Debate between Kirsty Blackman and Chris Philp
Committee stage
Tuesday 28th June 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 28 June 2022
Chris Philp

I thank the hon. Member for Aberdeen North for raising those considerations, because protecting children is clearly one of the most important things that the Bill will do. The first point that it is worth drawing to the Committee’s attention again is the fact that all companies, regardless of the number of child users they may have, including zero child users, have duties to address illegal content where it affects children. That includes child sexual exploitation and abuse content, and illegal suicide content. Those protections for the things that would concern us the most—those illegal things—apply to companies regardless of their size. It is important to keep that in mind as we consider those questions.

It is also worth keeping in mind that we have designed the provisions in clause 31 to be a bit flexible. The child user condition, which is in clause 31(3) on page 31 of the Bill, sets out that one of two tests must be met for the child user condition to be met. The condition is met if

“there is a significant number of children who are users of the service…or…the service…is of a kind likely to attract a significant number of users who are children.”

When we debated the issue previously, we clarified that the word “user” did not mean that they had to be a registered user; they could be somebody who just stumbles across it by accident or who goes to it intentionally, but without actually registering. We have built in a certain amount of flexibility through the word “likely”. That helps a little bit. We expect that where a service poses a very high risk of harm to children, it is likely to meet the test, as children could be attracted to it—it might meet the “likely to attract” test.

New clause 27 would introduce the possibility that even when there were no children on the service and no children were ever likely to use it, the duties would be engaged—these duties are obviously in relation to content that is not illegal; the illegal stuff is covered already elsewhere. There is a question about proportionality that we should bear in mind as we think about this. I will be resisting the new clause on that basis.

However, as the hon. Member for Aberdeen North said, I have hinted or more than hinted to the Committee previously that we have heard the point that has been made—it was made in the context of adults, but applies equally to children here—that there is a category of sites that might have small numbers of users but none the less pose a high risk of harm, not harm that is illegal, because the “illegal” provision applies to everybody already, but harm that falls below the threshold of illegality. On that area, we heard hon. Members’ comments on Second Reading. We have heard what members of the Committee have had to say on that topic as well. I hope that if I say that that is something that we are reflecting on very carefully, the hon. Member for Aberdeen North will understand that those comments have been loudly heard by the Government. I hope that I have explained why I do not think new clause 27 quite works, but the point is understood.

Kirsty Blackman

I appreciate the Minister’s comments, but in the drafting of the new clause, we have said that Ofcom “may” impose these duties. I would trust the regulator enough not to impose the child safety duties on a site that literally has no children on it and that children have no ability to access. I would give the regulator greater credit than the Minister did, perhaps accidentally, in his comments. If it were up to Ofcom to make that decision and it had the power to do so where it deemed that appropriate, it would be most appropriate for the regulator to have the duty to make the decision.

I wish to press the new clause to a Division.

Question put, That the clause be read a Second time.

--- Later in debate ---
Kirsty Blackman

It does make sense, and I do understand what the Minister is talking about in relation to clause 10 and the subsections that he mentioned. However, that only sets out what the platforms must take into account in their child risk assessments.

If we are talking about 15-year-olds, they are empowered in their lives to make many decisions on their own behalf, as well as decisions guided by parents or parental decisions taken for them. We are again doing our children a disservice by failing to allow young people the ability to opt out—the ability to choose not to receive certain content. Having a requirement to include whether or not these functionalities exist in a risk assessment is very different from giving children and young people the option to choose, and to decide what they do—and especially do not—want to see on whichever platform they are interacting on.

I have previously mentioned the fact that if a young person is on Roblox, or some of those other platforms, it is difficult for them to interact only with people who are on their friends list. It is difficult for that young person to exclude adult users from contacting them. A lot of young people want to exclude content, comments or voice messages from people they do not know. They want to go on the internet and have fun and enjoy themselves without the risk of being sent an inappropriate message or photo and having to deal with those things. If they could choose those empowerment functions, that just eliminates the risk and they can make that choice.

Chris Philp

Could I develop the point I was making earlier on how the Bill currently protects children? Clause 11, which is on page 10, is on safety duties for children—what the companies have to do to protect children. One thing that they may be required by Ofcom to do, as mentioned in subsection (4)(f), is create

“functionalities allowing for control over content that is encountered, especially by children”.

Therefore, there is a facility to require the platforms to create the kind of functionalities that relate, as that subsection is drafted, not just to identity but to the kind of content being displayed. Does that go some way towards addressing the hon. Lady’s concern?

Kirsty Blackman

That is very helpful. I am glad that the Minister is making clear that he thinks that Ofcom will not just be ignoring this issue because the Bill is written to allow user empowerment functions only for adults.

I hope the fact that the Minister kindly raised clause 11(4) will mean that people can see its importance, and that Ofcom will understand that it should give consideration to it, because that list of things could have just been lost in the morass of the many, many lists of things in the Bill. I am hoping that the Minister’s comments will go some way on that. Notwithstanding that, I will press the new clause to a vote.

Question put, That the clause be read a Second time.

--- Later in debate ---
Alex Davies-Jones

It is an honour to support the new clause moved by the hon. Member for Aberdeen North. This was a recommendation from the Joint Committee report, and we believe it is important, given the sheer complexity of the Bill. The Minister will not be alarmed to hear that I am all in favour of increasing the scrutiny and transparency of this legislation.

Having proudly served on the DCMS Committee, I know it does some excellent work on a very broad range of policy areas, as has been highlighted. It is important to acknowledge that there will of course be cross-over, but ultimately we support the new clause. Given my very fond memories of serving on the Select Committee, I want to put on the record my support for it. My support for this new clause is not meant as any disrespect to that Committee. It is genuinely extremely effective in scrutinising the Government and holding them to account, and I know it will continue to do that in relation to both this Bill and other aspects of DCMS. The need for transparency, openness and scrutiny of this Bill is fundamental if it is truly to be world-leading, which is why we support the new clause.

Chris Philp

I am grateful for the opportunity to discuss this issue once again. I want to put on the record my thanks to the Joint Committee, which the hon. Member for Ochil and South Perthshire sat on, for doing such fantastic work in scrutinising the draft legislation. As a result of its work, no fewer than 66 changes were made to the Bill, so it was very effective.

I want to make one or two observations about scrutinising the legislation following the passage of the Bill. First, there is the standard review mechanism in clause 149, on pages 125 and 126, which provides for a statutory review not before two years and not after five years of the Bill receiving Royal Assent.

Kirsty Blackman

On that review function, it would help if the Minister could explain a bit more why it was decided to do that as a one-off, and not on a rolling two-year basis, for example.

Chris Philp

That is a fairly standard clause in legislation. Clearly, for most legislation and most areas of Government activity, the relevant departmental Select Committee would be expected to provide the ongoing scrutiny, so ordinarily the DCMS Committee would do that. I hear the shadow Minister’s comments: she said that this proposal is not designed in any way to impugn or disrespect that Committee, but I listened to the comments of the Chair of that Committee on Second Reading, and I am not sure he entirely shares that view—he expressed himself in quite forthright terms.

On the proposal, we understand that the Joint Committee did valuable work. This is an unusual piece of legislation, in that it is completely groundbreaking. It is unlike any other, so the case for having a particular Committee look at it may have some merit. I am not in a position to give a definitive Government response to that because the matter is still under consideration, but if we were to establish a special Committee to look at a single piece of legislation, there are two ways to do it. It could either be done in statute, as the new clause seeks, or it could be done by Standing Orders.

Generally speaking, it is the practice of the House to establish Committees by Standing Orders of the House rather than by statute. In fact, I think the only current Committee of the House established by statute—Ms Rees, you will correct me if I am wrong, as you are more of an expert on these matters than me—is the Intelligence and Security Committee, which was established by the Intelligence Services Act 1994. That is obviously very unusual, because it has special powers. It looks into material that would ordinarily be classified as secret, and it has access to the intelligence services. It is a rather unusual Committee that has to be granted special powers because it looks into intelligence and security matters. Clearly, those considerations do not apply here. Were a particular Committee to be established, the right way of doing that would not be in statute, as the new clause proposes, but via the Standing Orders of the House, if that is something that Parliament wants to do.

--- Later in debate ---
Chris Philp

First, let me also put on record my thanks to my hon. Friend for his service on the Joint Committee. He did a fantastic job and, as I said, the Committee’s recommendations have been powerfully heard. I thank him for his acknowledgment that if one were to do this, the right way to do it would be through Standing Orders. I have heard the point he made in support of some sort of ongoing special committee. As I say, the Government have not reached a view on this, but if one were to do that, I agree with my hon. Friend that Standing Orders would be the right mechanism.

One of the reasons for that can be found in the way the new clause has been drafted. Subsections (5) and (6) say:

“The membership and Chair of the Committee shall be appointed by regulations made by the Secretary of State…the tenure of office of members of, the procedure of and other matters…shall be set out in regulations made by the Secretary of State.”

I know those regulations are then subject to approval by a resolution of the House, but given the reservations expressed by Opposition Members about powers for the Secretary of State over the last eight sitting days, it is surprising to see the new clause handing the Secretary of State—in the form of a regulation-making power—the power to form the Committee.

That underlines why doing this through Standing Orders, so that the matter is in the hands of the whole House, is the right way to proceed, if that is something we collectively wish to do. For that reason, we will not support the new clause. Obviously, we will get back to the House in due course once thinking has been done about potential Committees, but that can be done as a separate process to the legislation. In any case, post-legislative scrutiny will not be needed until the regime is up and running, which will be after Royal Assent, so that does not have enormous time pressure on it.

A comment was made about future-proofing the Bill and making sure it stays up to date. There is a lot in that, and we need to make sure we keep up to date with changing technologies, but the Bill is designed to be tech agnostic, so if there is change in technology, that is accommodated by the Bill because the duties are not specific to any given technology. A good example is the metaverse. That was not conceived or invented prior to the Bill being drafted; none the less, it is captured by the Bill. The architecture of the Bill, relying on codes of practice produced by Ofcom, is designed to ensure flexibility so that the codes of practice can be kept up to date. I just wanted to make those two points in passing, as the issue was raised by the hon. Member for Aberdeen North.

Kirsty Blackman

The new clause is drafted in that way because I wanted to recognise the work of the Joint Committee and to take on board its recommendations. If it had been entirely my drafting, the House of Lords would certainly not have been involved, given that I am not the biggest fan of the House of Lords, as its Members are not elected. However, the decision was made to submit the new clause as drafted.

The Minister has said that the Government have not come to a settled view yet, which I am taking as the Minister not saying no. He is not standing up and saying, “No, we will definitely not have a Standing Committee.” I am not suggesting he is saying yes, but given that he is not saying no, I am happy to withdraw the new clause. If the Minister is keen to come forward at a future stage with suggestions for changes to Standing Orders, which I understand have to be introduced by the Leader of the House or the Cabinet Office, then they would be gladly heard on this side of the House. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Clause 38

Adults’ risk assessment duties

“(1) This section sets out duties which apply in relation to internet services within section 67(2).

(2) A duty to take appropriate steps to keep an adults’ risk assessment up to date, including when OFCOM makes any significant change to a risk profile that relates to services of the kind in question.

(3) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient adults’ risk assessment relating to the impacts of that proposed change.

(4) A duty to make and keep a written record, in an easily understandable form, of every risk assessment under subsections (2) and (3).

(5) An “adults’ risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—

(a) the user base;

(b) the level of risk of adults who are users of the service encountering, by means of the service, each kind of priority content that is harmful to adults (with each kind separately assessed), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;

(c) the level of risk of harm to adults presented by different kinds of priority content that is harmful to adults;

(d) the level of risk of harm to adults presented by priority content that is harmful to adults which particularly affects individuals with a certain characteristic or members of a certain group;

(e) the level of risk of functionalities of the service facilitating the presence or dissemination of priority content that is harmful to adults, identifying and assessing those functionalities that present higher levels of risk;

(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by adults;

(g) the nature, and severity, of the harm that might be suffered by adults from the matters identified in accordance with paragraphs (b) to (f);

(h) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.

(6) In this section references to risk profiles are to the risk profiles for the time being published under section 83 which relate to the risk of harm to adults presented by priority content that is harmful to adults.

(7) The provisions of Schedule 3 apply to any assessment carried out under this section in the same way they apply to any relating to a Part 3 service.”—(John Nicolson.)

This new clause applies adults’ risk assessment duties to pornographic sites.

Brought up, and read the First time.

--- Later in debate ---
Kirsty Blackman

Seeing as we are not doing spurious points of order, I will also take the opportunity to express our thanks. The first one is to the Chairs: thank you very much, Ms Rees and Sir Roger, for the excellent work you have done in the Chair. This has been a very long Bill, and the fact that you have put up with us for so long has been very much appreciated.

I thank all the MPs on the Committee, particularly the Labour Front-Bench team and those who have been speaking for the Labour party. They have been very passionate and have tabled really helpful amendments—it has been very good to work with the Labour team on the amendments that we have put together, particularly the ones we have managed to agree on, which is the vast majority. We thank Matt Miller, who works for my hon. Friend the Member for Ochil and South Perthshire. He has been absolutely wonderful. He has done an outstanding amount of work on the Bill, and the amazing support that he has given us has been greatly appreciated. I also thank the Public Bill Office, especially for putting up with the many, many amendments we submitted, and for giving us a huge amount of advice on them.

Lastly, I thank the hundreds of organisations that got in touch with us, and the many people who took the time to scrutinise the Bill, raise their concerns, and bring those concerns to us. Of those hundreds of people and organisations, I particularly highlight the work of the National Society for the Prevention of Cruelty to Children. Its staff have been really helpful to work with, and I have very much appreciated their advice and support in drafting our amendments.

Chris Philp

I feel slightly out of place, but I will add some concluding remarks in a moment; I should probably first respond to the substance of the new clause. The power to co-operate with other regulators and share information is, of course, important, but I am pleased to confirm that it is already in the Bill—it is not the first time that I have said that, is it?

Clause 98 amends section 393(2)(a) of the Communications Act 2003. That allows Ofcom to disclose information and co-operate with other regulators. Our amendment will widen the scope of the provision to include carrying out the functions set out in the Bill.

The list of organisations with which Ofcom can share information includes a number of UK regulators—the Competition and Markets Authority, the Information Commissioner, the Financial Conduct Authority and the Payment Systems Regulator—but that list can be amended, via secondary legislation, if it becomes necessary to add further organisations. In the extremely unlikely event that anybody wants to look it up, that power is set out in subsections (3)(i) and (4)(c) of section 393 of the Communications Act 2003. As the power is already created by clause 98, I hope that we will not need to vote on new clause 41.

I echo the comments of the shadow Minister about the Digital Regulation Cooperation Forum. It is a non-statutory body, but it is extremely important that regulators in the digital arena co-operate with one another and co-ordinate their activities. I am sure that we all strongly encourage the relevant regulators to work with the DRCF and to co-operate in this and adjacent fields.

I will bring my remarks to a close with one or two words of thanks. Let me start by thanking Committee members for their patience and dedication over the nine days we have been sitting—50-odd hours in total. I think it is fair to say that we have given the Bill thorough consideration, and of course there is more to come on Report, and that is before we even get to the House of Lords. This is the sixth Bill that I have taken through Committee as Minister, and it is by far the most complicated and comprehensive, running to 194 clauses and 15 schedules, across 213 pages. It has certainly been a labour. Given its complexity, the level of scrutiny it has received has been impressive—sometimes onerous, from my point of view.

The prize for the most perceptive observation during our proceedings definitely goes to the hon. Member for Aberdeen North, who noticed an inconsistency between use of the word “aural” in clause 49 and “oral” in clause 189, about 120 pages later.

I certainly thank our fantastic Chairs, Sir Roger Gale and Ms Rees, who have chaired our proceedings magnificently and kept us in order, and even allowed us to finish a little early, so huge thanks to them. I also thank the Committee Clerks for running everything so smoothly and efficiently, the Hansard reporters for deciphering our sometimes near-indecipherable utterances, and the Officers of the House for keeping our sittings running smoothly and safely.

I also thank all those stakeholders who have offered us their opinions; I suspect that they will continue to do so during the rest of the passage of the Bill. Their engagement has been important and very welcome. It has really brought external views into Parliament, which is really important.

I conclude by thanking the people who have been working on the Bill the longest and hardest: the civil servants in the Department for Digital, Culture, Media and Sport. Some members of the team have been working on the Bill in its various forms, including White Papers and so on, for as long as five years. The Bill has had a long gestation. Over the last few months, as we have been updating the Bill, rushing to introduce it, and perhaps even preparing some amendments for Report, they have been working incredibly hard, so I give a huge thanks to Sarah Connolly and the whole team at DCMS for all their incredible work.

Finally, as we look forward to Report, which is coming up shortly, we are listening, and no doubt flexibility will be exhibited in response to some of the points that have been raised. I look forward to working with members of the Committee and Members of the House more widely as we seek to make the Bill as good as it can be. On that note, I will sit down for the last time.

Online Safety Bill (Sixteenth sitting)

Debate between Kirsty Blackman and Chris Philp
Committee stage
Tuesday 28th June 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 28 June 2022
Kirsty Blackman (Aberdeen North) (SNP)

Thank you for chairing this meeting, Sir Roger. I have a quick question for the Minister that relates to the new clause, which is a reasonable request for a duty on providers to disclose information to Ofcom. We would hope that the regulator had access to that information, and if companies are making significant changes, it is completely reasonable that they should have to tell Ofcom.

I do not have any queries or problems with the new clause; it is good. My question for the Minister is—I am not trying to catch anyone out; I genuinely do not know the answer—if a company makes significant changes to something that might impact on its safety duties, does it have to do a new risk assessment at that point, or does it not have to do so until the next round of risk assessments? I do not know the answer, but it would be good if the direction of travel was that any company making drastic changes that massively affected security—for example, Snapchat turning on the geolocation feature when it did an update—would have to do a new risk assessment at that point, given that significant changes would potentially negatively impact on users’ safety and increase the risk of harm on the platform.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

It is a pleasure, as always, to serve under your chairmanship, Sir Roger. As the hon. Member for Worsley and Eccles South said, the new clause is designed to introduce a duty on providers to notify Ofcom of anything that Ofcom could reasonably be expected to be notified of.

The Bill already has extremely strong information disclosure provisions. I particularly draw the Committee’s attention to clause 85, which sets out Ofcom’s power to require information by provision of an information notice. If Ofcom provides an information notice—the particulars of which are set out in clause 86—the company has to abide by that request. As the Committee will recall, the strongest sanctions are reserved for the information duties, extending not only to fines of up to 10% or service discontinuation—unplugging the website, as it were—but also to personal criminal liability for named executives, with prison sentences of up to two years. We take those information duties extremely seriously, which is why the sanctions are as strong as they are.

The hon. Member for Aberdeen North asked what updates would occur if there were a significant design change. I draw the Committee’s attention to clause 10, which deals with children’s risk assessment duties, but there are similar duties in relation to illegal content and the safety of adults. The duty set out in clause 10(2), which cross-refers to schedule 3, makes it clear. The relevant words are “suitable and sufficient”. Clearly if there were a massive design change that would, in this case, adversely affect children, the risk assessment would not be suitable and sufficient if it were not updated to reflect that design change. I hope that answers the hon. Lady’s question.

Turning to the particulars of the new clause, if we incentivise companies to disclose information they have not been asked for by Ofcom, there is a danger that they might, through an excessive desire to comply, over-disclose and provide a torrent of information that would not be very helpful. There might also be a risk that some companies that are not well intentioned would deliberately dump enormous quantities of data in order to hide things within it. The shadow Minister, the hon. Member for Worsley and Eccles South, mentioned an example from the world of financial services, but the number of companies potentially within the scope of the Bill is so much larger than even the financial services sector. Some 25,000 companies may be in scope, a number that is much larger—probably by one order of magnitude, and possibly by two—than the financial services sector regulated by the FCA. That disparity in scale makes a significant difference.

Given that there are already strong information provision requirements in the Bill, particularly clause 85, and because of the reasons of scale that I have mentioned, I will respectfully resist the new clause.

--- Later in debate ---
Kirsty Blackman

I have a couple of comments on the point about parental empowerment. I have been asked by my children for numerous apps. I have a look at them and think, “I don’t know anything about this app. I have never seen or heard of it before, and I have no idea of the level of user-to-user functionality in this app.” Nowhere is there a requirement for this information to be set out, and there is nowhere that parents can easily find it.

With iPhones, if a kid wants an app, they have to request it from their parent, and their parent needs to approve whether or not they get it. I find myself baffled by some of them because they are not ones that I have ever heard of or come across. To find out whether they have that level of functionality, I have to download the app and use it myself in the way that, hopefully, my children would use it, in order to judge whether it is safe for them.

A requirement for category 1 providers to be up front and explain the risks and how they manage them, and even how people interact with their services, would increase the ability of parents to be media literate. We can be as media literate as we like, but if the information is not there and we cannot find it anywhere, we end up having to make incredibly restrictive decisions in relation to our children’s ability to use the internet, which we do not necessarily want to make. We want them to be able to have fun, and the information being there would be very helpful, so I completely agree on that point.

My other point is about proportionality. The Opposition moved new clause 4, relating to risk assessments, and I did not feel able to support it on the basis of the arguments that the Minister made about proportionality. He made the case that Ofcom would receive 25,000 risk assessments and would be swamped by the number that it might receive. This new clause balances that, and has the transparency that is needed.

It is completely reasonable for us to put the higher burden of transparency on category 1 providers and not on other providers, because category 1 providers attract the largest market share. A huge percentage of the risk that might arise online arises with category 1 providers, so I am completely happy to support this new clause, which strikes the right balance. It answers the Minister’s concerns about Ofcom being swamped, because only category 1 providers are affected. Asking those providers to put the risk assessment on their site is the right thing to do. It will mean that there is far more transparency and that people are better able to make informed decisions.

Chris Philp

I understand the intention behind the new clause, but I want to draw the Committee’s attention to existing measures in the Bill that address this matter. I will start with the point raised by the hon. Member for Aberdeen North, who said that as a parent she would like to be able to see a helpful summary of what the risks are prior to her children using a new app. I am happy to say to her that that is already facilitated via clause 13(2), which appears at the top of page 13. There is a duty there

“to summarise in the terms of service the findings of the most recent adults’ risk assessment of a service”,

including the levels of risk, and the nature and severity of those risks. That relates specifically to adults, but there is an equivalent provision relating to children as well.

Kirsty Blackman

I just gently say that if there is a requirement for people to sign up, or to begin the sign-up process, in order to see the terms of service, that is not as open and transparent as it could be. It is much more obstructive than it needs to be. A requirement for providers to make their terms of service accessible to any user, whether or not they are registered, would assist transparency.

Chris Philp

I think the terms of service are generally available to be viewed by anyone. I do not think people have to be registered users to view the terms of service.

In addition to the duty to summarise the findings of the most recent risk assessment in relation to adults in clause 13(2), clause 11 contains obligations to specify in the terms of service, in relation to children, where children might be exposed to risks using that service. I suggest that a summary in the terms of service, which is an easy place to look, is the best way for parents or anybody else to understand what the risks are, rather than having to wade through a full risk assessment. Obviously, the documents have not been written yet, because the Bill has not been passed, but I imagine they would be quite long and possibly difficult to digest for a layperson, whereas a summary is more readily digestible. Therefore, I think the hon. Lady’s request as a parent is met by the duties set out in clause 11, and the duties for adults are set out in clause 13.

--- Later in debate ---
Chris Philp

It is important to make clear how the Bill operates, and I draw the Committee’s attention in particular to clauses 23 to 26, which deal with the risk assessment and safety duties for search services. I point in particular to clause 23(5)(a), which deals with the risk assessment duties for illegal content. The provision makes it clear that those risk assessments have to be carried out

“taking into account (in particular) risks presented by algorithms used by the service”.

Clause 25 relates to children’s risk assessment duties, and subsection (5)(a) states that children’s risk assessment duties have to be carried out

“taking into account (in particular) risks presented by algorithms”.

The risks presented by algorithms are expressly accounted for in clauses 23 and 25 in relation to illegal acts and to children. Those risk assessment duties flow into safety duties as we know.

By coincidence, yesterday I met with Google’s head of search, who talked about the work Google is doing to ensure that its search work is safe. Google has the SafeSearch work programme, which is designed to make the prompts better constructed.

In my view, the purpose of the new clause is covered by existing provisions. If we were to implement the proposal—I completely understand and respect the intention behind it, by the way—there could be an unintended consequence in the sense that it would ban any reference in the prompts to protected characteristics, although people looking for help, support or something like that might find such prompts helpful.

Through a combination of the existing duties and the list of harms, which we will publish in due course, as well as legislating via statutory instrument, we can ensure that people with protected characteristics, and indeed other people, are protected from harmful prompts while not, as it were, throwing the baby out with the bathwater and banning the use of certain terms in search. That might cause an unintended negative consequence for some people, particularly those from marginalised groups who were looking for help. I understand the spirit of the new clause, but we shall gently resist it.

Kirsty Blackman

The Minister has highlighted clauses 23 and 25. Clause 25 is much stronger than clause 23, because clause 23 covers only illegal content and priority illegal content, whereas clause 25 extends to non-designated content that is harmful to children. Some of the things we are talking about might not be on the verge of illegality, but they are wrong and discriminatory; they might not fall into the categories of illegal or priority illegal content, and they will be addressed only where the search service, which presumably an organisation such as Google is, has a children’s risk assessment duty. Such organisations are getting a much easier ride in that regard.

I want to make the Minister aware of this. If he turns on Google SafeSearch, which excludes explicit content, and googles the word “oral” and looks at the images that come up, he will see that those images are much more extreme than he might imagine. My point is that, no matter the work that the search services are trying to do, they need to have the barriers in place before the issue arises—before people are exposed to that harmful or illegal content. The existing situation does not require search services to have enough in place to prevent such things from happening. The Minister was talking about moderation and things that happen after the fact, which is great, but that does not protect people from the harm that might occur. I very much wish to press the new clause to the vote.

Question put, That the clause be read a Second time.

--- Later in debate ---
Kirsty Blackman

I have a question for the Minister that hopefully, given the Committee’s work, he might be able to answer. New clause 19(2)(b) would give Ofcom the power to require services to submit to it

“all research the service holds on a topic specified by OFCOM.”

Ofcom could say, “We would like all the research you have on the actual age of users.”

My concern is that clause 85(1) allows Ofcom to require companies to provide it

“with any information that they require for the purpose of exercising, or deciding whether to exercise, any of their online safety functions.”

Ofcom might not know what information the company holds. I am concerned that Ofcom is able to say, as it is empowered to do by clause 85(1), “Could you please provide us with the research piece you did on under-age users or on the age of users?”, instead of having a more general power to say, “Could you provide us with all the research you have done?” I am worried that the power in clause 85(1) is more specific.

Kirsty Blackman

If the Minister holds on for two seconds, he will get to make an actual speech. I am worried that the power is not general enough. I would very much like to hear the Minister confirm what he thinks.

Chris Philp

I am not going to make a full speech. I have conferred with colleagues. The power conferred by clause 85(1) is one to require any information in a particular domain. Ofcom does not have to point to a particular research report and say, “Please give me report X.” It can ask for any information that is relevant to a particular topic. Even if it does not know what specific reports there may be—it probably would not know what reports there are buried in these companies—it can request any information that is at all relevant to a topic and the company will be obliged to provide any information relevant to that request. If the company fails to do so, it will be committing an offence as defined by clause 92, because it would be “suppressing”, to use the language of that clause, the information that exists.

I can categorically say to the hon. Lady that the general ability of Ofcom is to ask for any relevant information—the word “any” does appear—and even if the information notice does not specify precisely what report it is, Ofcom does have that power and I expect it to exercise it and the company to comply. If the company does not, I would expect it to be prosecuted.

Kirsty Blackman

Given that clarification, I will not press the new clause. The Minister has made the case strongly enough and has clarified clause 85(1) to my satisfaction. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Clause 23

Priority illegal content: violence against women and girls

“(1) For the purposes of this Act, any provision applied to priority illegal content should also be applied to any content which—

(a) constitutes,

(b) encourages, or

(c) promotes

violence against women or girls.

(2) ‘Violence against women and girls’ is defined by Article 3 of the Council of Europe Convention on Preventing Violence Against Women and Domestic Violence (‘the Istanbul Convention’).” —(Alex Davies-Jones.)

This new clause applies the provisions that apply to priority illegal content to content which constitutes, encourages or promotes violence against women and girls.

Brought up, and read the First time.

--- Later in debate ---
Chris Philp

I thank my right hon. Friend for her question and for her tireless work in this area. As she says, the intimate image abuse offence being worked on is an extremely important piece in the jigsaw puzzle to protect women, particularly as it has as its threshold—at least in the previous draft—consent, without any test of intent, which addresses some points made by the Committee previously. As we have discussed before, it is a Ministry of Justice lead, and I am sure that my right hon. Friend will make representations to MOJ colleagues to elicit a rapid confirmation of its position on the recommendations, so that we can move to implement them as quickly as possible.

I remind the Committee of the Domestic Abuse Act 2021, which was also designed to protect women. Increased penalties for stalking and harassment have been introduced, and we have ended the automatic early release of violent and sex offenders from prison—something I took through Parliament as a Justice Minister a year or two ago. Previously, violent and sex offenders serving standard determinate sentences were often released automatically at the halfway point of their sentence, but we have now ended that practice. Rightly, a lot has been done outside the Bill to protect women and girls.

Let me turn to what the Bill does to further protect women and girls. Schedule 7 sets out the priority offences—page 183 of the Bill. In addition to all the offences I have mentioned previously, which automatically flow into the illegal safety duties, we have set out priority offences whereby companies must not just react after the event, but proactively prevent the offence from occurring in the first place. I can tell the Committee that many of them have been selected because we know that women and girls are overwhelmingly the victims of such offences. Line 21 lists the offence of causing

“intentional harassment, alarm or distress”.

Line 36 mentions the offence of harassment, and line 37 the offence of stalking. Those are obviously offences where women and girls are overwhelmingly the victims, which is why we have picked them out and put them in schedule 7—to make sure they have the priority they deserve.

Kirsty Blackman

The Minister is making a good speech about the important things that the Bill will do to protect women and girls. We do not dispute that it will do so, but I do not understand why he is so resistant to putting this on the face of the Bill. It would cost him nothing to do so, and it would raise the profile. It would mean that everybody would concentrate on ensuring that there are enhanced levels of protection for women and girls, which we clearly need. I ask him to reconsider putting this explicitly on the face of the Bill, as he has been asked to do by us and so many external organisations.

Online Safety Bill (Fifteenth sitting)

Debate between Kirsty Blackman and Chris Philp
Kirsty Blackman (Aberdeen North) (SNP)

Thank you, Sir Roger, for chairing this meeting this morning. I want to agree with the Opposition’s points about the timing issue. If one Act is to repeal another, we need to make sure that there is no gap in the middle: if the repeal takes effect on a given day, the Bill’s corresponding provisions must be in force and working on that same day, rather than leaving a potential set-up time gap.

On clause 170 and repealing the part of the Communications Act 2003 on video-sharing platform services, some concerns have been raised that the requirements in the Online Safety Bill do not exactly mirror the provisions in the video-sharing platform rules. I am not saying necessarily or categorically that the Online Safety Bill is less strong than the video-sharing platform rules currently in place. However, once the legislation on video-sharing platform services is repealed, the Online Safety Act, as it will be, will become the main way of regulating video-sharing platforms, and the concern is that there could be a degradation in the protections provided on those platforms and an increase in some of the issues and concerns we have seen raised. Will the Minister keep that under review and consider how it could be improved? We do not want to see things getting worse simply because one regime has been switched for another that, as the Minister said, is broader and has stronger protections. Will he keep under review whether that turns out to be the case once the Act has bedded in, when Ofcom has the ability to take action and properly regulate—particularly, in this case, video-sharing platforms?

Chris Philp

I agree with the hon. Member for Worsley and Eccles South, that we want to see these provisions brought into force as quickly as possible, for the reasons that she set out. We are actively thinking about ways of ensuring that these provisions are brought into force as fast as possible. It is something that we have been actively discussing with Ofcom, and that, I hope, will be reflected in the road map that it intends to publish before the summer. That will of course remain an area of close working between the Department for Digital, Culture, Media and Sport and Ofcom, ensuring that these provisions come into force as quickly as possible. Of course, the illegal duties will be brought into force more quickly. That includes the CSEA offences set out in schedule 6.

The hon. Member for Aberdeen North raised questions in relation to the repeal of part 3 of the Digital Economy Act. Although that is on the statute book, it was never commenced. When it is repealed, we will not be removing from force something that is applied at the moment, because the statutory instrument to commence it was never laid. So the point she raised about whether the Bill would come into force the day after the Digital Economy Act is repealed does not apply; but the point she raised about bringing this legislation into force quickly is reasonable and right, and we will work on that.

The hon. Lady asked about the differences in scope between the video-sharing platform and the online safety regime. As I said, the online safety regime does have an increased scope compared with the VSP regime, but I think it is reasonable to keep an eye on that as she suggested, and keep it under review. There is of course a formal review mechanism in clause 149, but I think that more informally, it is reasonable that as the transition is made we keep an eye on it, as a Government and as parliamentarians, to ensure that nothing gets missed out.

I would add that, separately from the Bill, the online advertising programme is taking a holistic look at online advertising in general, and it will also consider matters that may touch on VSPs and what they regulate.

Question put and agreed to.

Clause 170 accordingly ordered to stand part of the Bill.

Clauses 171 and 172 ordered to stand part of the Bill.

Clause 173

Powers to amend section 36

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Kirsty Blackman

I have a couple of questions, particularly on clause 176 and the powers to amend schedules 6 and 7. I understand the logic for schedule 5 being different—in that terrorism offences are a wholly reserved matter—and therefore why only the Secretary of State would be making any changes.

My question is on the difference in the ways to amend schedules 6 and 7—I am assuming that Government amendment 126, which asks the Secretary of State to consult Scottish Ministers and the Department of Justice in Northern Ireland, and which we have already discussed, will be voted on and approved before we come to clause 176. I do not understand the logic for having different procedures to amend the child sexual exploitation and abuse offences and the priority offences. Why have the Government chosen two different procedures for amending the two schedules?

I understand why that might not be a terribly easy question to answer today, and I would be happy for the Minister to get in touch afterwards with the rationale. It seems to me that both areas are very important, and I do not quite understand why the difference is there.

Chris Philp

Let me start by addressing the questions the shadow Minister raised about these powers. She used the phrase “free rein” in her speech, but I would not exactly describe it as free rein. If we turn to clause 179, which we will come to in a moment or two, and subsection (1)(d), (e), (f) and (g), we see that all the regulations made under clauses 173 to 176, which we are debating, require an SI under the affirmative procedure. Parliament will therefore get a chance to have its say, to object and indeed to vote down a provision if it wishes to. It is not that the Secretary of State can act alone; changes are subject to the affirmative SI procedure.

It is reasonable to have a mechanism to change the lists of priority offences and so on by affirmative SI, because the landscape will change and new offences will emerge, and it is important that we keep up to date. The only alternative is primary legislation, and a slot for a new Act of Parliament does not come along all that often—perhaps once every few years for any given topic. I think that would lead to long delays—potentially years—before the various exemptions, lists of priority offences and so on could be updated. I doubt that it is Parliament’s intention, and it would not be good for the public if we had to wait for primary legislation to change the lists. The proposed mechanism is the only sensible and proportionate way to do it, and it is subject to a parliamentary vote.

A comment was made about Ofcom’s independence. The way the offences are defined has no impact on Ofcom’s operational independence. That is about how Ofcom applies the rules; this is about what the rules themselves are. It is right that we are able to update them relatively nimbly by affirmative SI.

The hon. Member for Aberdeen North asked about the differences in the way schedules 6 and 7 can be updated. I will happily drop her a line with further thoughts if she wants me to, but in essence we are happy to get the Scottish child sexual exploitation and abuse offences, set out in part 2 of schedule 6, adopted as soon as Scottish Ministers want. We do not want to delay any measures on child exploitation and abuse, and that is why it is done automatically. Schedule 7, which sets out the other priority offences, could cover any topic at all—any criminal offence could fall under that schedule—whereas schedule 6 is only about child sexual exploitation and abuse. Given that the scope of schedule 7 takes in any criminal offence, it is important to consult Scottish Ministers if it is a Scottish offence but then use the statutory instrument procedure, which applies it to the entire UK internet. Does the hon. Lady want me to write to her, or does that answer her question?

Kirsty Blackman

That is actually incredibly helpful. I do not need a further letter, thanks.

Chris Philp

I am grateful to the hon. Lady for saving DCMS officials a little ink, and electricity for an email.

I hope I have addressed the points raised in the debate, and I commend the clause to the Committee.

Question put and agreed to.

Clause 173 accordingly ordered to stand part of the Bill.

Clauses 174 and 175 ordered to stand part of the Bill.

Clause 176

Powers to amend Schedules 5, 6 and 7

Amendment made: 126, in clause 176, page 145, line 4, at end insert—

“(5A) The Secretary of State must consult the Scottish Ministers before making regulations under subsection (3) which—

(a) add an offence that extends only to Scotland, or

(b) amend or remove an entry specifying an offence that extends only to Scotland.

(5B) The Secretary of State must consult the Department of Justice in Northern Ireland before making regulations under subsection (3) which—

(a) add an offence that extends only to Northern Ireland, or

(b) amend or remove an entry specifying an offence that extends only to Northern Ireland.”—(Chris Philp.)

This amendment ensures that the Secretary of State must consult the Scottish Ministers or the Department of Justice in Northern Ireland before making regulations which amend Schedule 7 in connection with an offence which extends to Scotland or Northern Ireland only.

Clause 176, as amended, ordered to stand part of the Bill.

Clause 177

Power to make consequential provision

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Kirsty Blackman

In one of our earlier debates, I asked the Minister about the difference between “oral” and “aural”, and I did not get a very satisfactory answer. I know the difference in their dictionary definition—I understand that they are different, although the words sound the same. I am confused that clause 189 uses “oral” as part of the definition of content, but clause 49 refers to

“one-to-one live aural communications”

in defining things that are excluded.

I do not understand why the Government have chosen to use those two different words in different places in the Bill. It strikes me that, potentially, we mean one or the other. If they do mean two different things, why has one thing been chosen for clause 49 and another thing for clause 189? Why has the choice been made that clause 49 relates to communications that are heard, but clause 189 relates to communications that are said? I do not quite get the Government’s logic in using those two different words.

I know this is a picky point, but in order to have good legislation, we want it to make sense, for there to be a good rationale for everything that is in it and for people to be able to understand it. At the moment, I do not properly understand why the choice has been made to use two different words.

More generally, the definitions in clause 189 seem pretty sensible, notwithstanding what I said in the previous debate in respect of amendment 76, which, with your permission, Sir Roger, I intend to move when we reach the appropriate point.

Chris Philp

As the hon. Member for Pontypridd said, clause 189 sets out various points of definition and interpretation necessary for the Bill to be understood and applied.

I turn to the question raised by the hon. Member for Aberdeen North. First, I strongly commend and congratulate her on having noticed the use of the two words. Anyone who thinks that legislation does not get properly scrutinised by Parliament has only to look to the fact that she spotted this difference, 110 pages apart, in two different clauses—clauses 49 and 189. That shows that these things do get properly looked at. I strongly congratulate her on that.

I think the best way of addressing her question is probably to follow up with her after the sitting. Clause 49 relates to regulated user-to-user content. We are in clause 49(2)—is that right?

Kirsty Blackman

Subsection (5).

Chris Philp

It is cross-referenced in subsection (5). The use of the term “aural” in that subsection refers to sound only—what might typically be considered telephony services. “Oral” is taken to cover livestreaming, which includes pictures and voice. That is the intention behind the use of the two different words. If that is not sufficient to explain the point—it may not be—I would be happy to expand in writing.

Kirsty Blackman

That would be helpful, in the light of the concerns I raised and what the hon. Member for Pontypridd mentioned about gaming, and how those communications work on a one-to-one basis. Having clarity in writing on whether clause 49 relates specifically to telephony-type services would be helpful, because that is not exactly how I read it.

Chris Philp

Given that the hon. Lady has raised the point, it is reasonable that she requires more detail. I will follow up in writing on that point.

Amendment proposed: 76, in clause 189, page 154, line 34, after “including” insert “but not limited to”.—(Kirsty Blackman.)

This amendment clarifies the definition of “content” in the Bill so that anything communicated by means of an internet service is considered content, not only the examples listed.

Question put, That the amendment be made.

--- Later in debate ---
Kirsty Blackman

There have not been all that many times during the debate on the Bill when the Minister has so spectacularly missed the point as he has on this section. I understand everything he said about provisions already being in place to protect children and the provisions regarding super-complaints, but the new clause is not intended to be a replacement for the super-complaints procedure, which we all support—in fact, we have tried to strengthen that procedure. The new clause is intended to be an addition—another, very important layer.

Unfortunately, I do not have at the front of my mind the legislation that set up the Children’s Commissioner for Scotland, or the one for England. The Minister talked through some of the provisions and phrasing in the Children Act 2004. He said that the role of the Children’s Commissioner for England is to encourage bodies to act positively on behalf of children—to encourage. There is no requirement for the body to act in the way the Children’s Commissioner says it should act. Changes have been made in Wales establishing the Future Generations Commissioner, who has far more power.

Chris Philp

As far as I can tell, the user advocacy body proposed in new clause 3 would not have the ability to compel Ofcom either.

Kirsty Blackman

But it would be a statutory consultee that is specifically mentioned in this provision. I cannot find in the Bill a provision giving Ofcom a statutory duty to consult the four Children’s Commissioners. The new clause would make the children’s advocacy body a statutory consultee in decisions that affect children.

Chris Philp

The Bill will require Ofcom to consult people who represent the interests of children. Although they are not named, it would be astonishing if the four Children’s Commissioners were not the first people on that list when the relevant codes of practice are developed. The statutory obligation to consult those groups when developing codes of practice and, indeed, guidance is set out in clauses 37(6)(d) and 69(3)(d).

Kirsty Blackman

That is very helpful, but there are still shortcomings in what the Minister says. The Bill, as drafted, requires Ofcom to require things of other organisations. Some of the detail is in the Bill, some of the detail will come in secondary legislation and some of the detail will come in the codes of practice published by Ofcom. We broadly agree that the Bill will ensure people are safer on the internet than they currently are, but we do not have all the detail on the Government’s intent. We would like more detail on some things, but we are not saying, “We need every little bit of detail.” If we did, the Bill would not be future-proof. We would not be able to change and update the Bill if we required everything to be in the Bill.

The Bill is not a one-off; it will continually change and grow. Having a user advocacy body would mean that emerging threats can quickly be brought to Ofcom’s attention. Unlike the Children’s Commissioners, who have a hundred other things to do, the entire purpose of this body would be to advocate on behalf of children online. The Children’s Commissioners do an amazing job, but this is not their No. 1 priority. If the Minister wants this to be a world-leading Bill, its No. 1 priority should be to protect the human rights of children.

--- Later in debate ---
Chris Philp

I think the hon. Lady is being a little unfair to the Children’s Commissioners. Dame Rachel de Souza is doing a fantastic job of advocating specifically in the digital sphere. She really is doing a fantastic job, and I say that as a Minister. I would not say she is leaving any gaps.

These digital children’s safety issues link to wider children’s safety issues that exist offline, such as sexual exploitation, grooming and so on, so it is useful that the same person advocates for children in both the offline and online worlds.

Kirsty Blackman

The new clause asks for an additional body; it is not saying that the Children’s Commissioners should be done away with. The Children’s Commissioners do an amazing job, as we have recognised, but the No. 1 priority, certainly for the Children’s Commissioner in Scotland, is to protect the human rights of children; it is not to protect children online, which is what the user advocacy body would do. The body would give the benefit of its experience and use its resources, time and energy specifically to advocate between Ofcom, children, and children’s organisations and groups.

The Minister is right that the Bill takes massive steps forward in protecting children online, and he is right that the Children’s Commissioners do a very good job. The work done by the Children’s Commissioners in giving us evidence on behalf of children and children’s organisations has been incredibly powerful and incredibly helpful, but there is still a layer missing. If this Bill is to be future-proof, if it is to work and if it is not to put an undue burden on charitable organisations, we need a user advocacy body. The Minister needs to consider that.

I appreciate that the Government provide money to victim support organisations, which is great, but I am also making a case about potential victims. If the money only goes to those who support people who have already been harmed, it will not allow them to advocate to ensure that more people are not harmed. It will allow them to advocate on behalf of those who have been harmed—absolutely—but it will not effectively tackle potential and emerging harms. It is a key place where the Bill misses out. I am quite disappointed that the Minister has not recognised that something may be lacking and is so keen to defend his position, because it seems to me that the position of the Opposition is so obviously the right one.

Online Safety Bill (Fourteenth sitting)

Debate between Kirsty Blackman and Chris Philp
Committee stage
Tuesday 21st June 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 21 June 2022 - (21 Jun 2022)
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

As we have heard, the super-complaint process is extremely important for enabling eligible entities representing the interests of users or members of the public to make representations where there are systemic problems that need to be addressed. I think we all agree that is an important approach.

Clauses 140 to 142 set out the power to make super-complaints, the procedure for making them and the guidance that Ofcom will publish in relation to them. The shadow Minister raised a few questions first, some of which we have touched on previously. In relation to transparency, which we have debated before, as I said previously, there are transparency provisions in clause 64 that I think will achieve the objectives that she set out.

The shadow Minister also touched on some of the questions about individual rather than systemic complaints. Again, we debated those right at the beginning, I think, when we discussed the fact that the approach taken in the Bill is to deal with systems and processes, because the scale involved here is so large. If we tried to create an architecture whereby Ofcom, or some other public body, adjudicated individual complaints, as an ombudsman would, it would simply be overwhelmed. A much better approach is to ensure that the systems and processes are fixed, and that is what the Bill does.

The hon. Member for Aberdeen North had some questions too. She touched in passing on the Secretary of State’s powers to specify by regulation who counts as an eligible entity—this is under clause 140(3). Of course, the nature of those regulations is circumscribed by the very next subsection, subsection (4), in which one of the criteria is that the entity

“must be a body representing the interests of users of regulated services, or members of the public”.

That speaks to the important point about consumers that we touched on this morning. As the hon. Lady said, this will be done by the affirmative procedure, so there is enhanced parliamentary scrutiny. I hope that makes it clear that it would be done in a reasonable way.

Kirsty Blackman

I am sorry to try the Minister’s patience. I think that we are in quite a lot of agreement about what an eligible entity looks like. I appreciate that this is being done by the affirmative procedure, but we seem to be in much less agreement about the next clause, which is being done by the negative procedure. I would like him to explain that contrast.

Chris Philp

Let me move on to clause 141 and amendment 153, which the hon. Lady spoke to a moment ago. Let us first talk about the question of time limits. As she said, the regulations that can be made under the clause include regulations on the time for various steps in the process. Rather than setting those out in the Bill, our intention is that when those regulations are moved they will include those time limits, but we want to consult Ofcom and other appropriate bodies to ensure that the deadlines set are realistic and reasonable. I cannot confirm now what those will be, because we have not yet done the consultation, but I will make a couple of points.

First, the steps set out in clause 141(2)(d)(i), (ii) and (iii), at the top of page 122, are essentially procedural steps about whether a particular complaint is in scope, whether it is admissible and whether the entity is eligible. Those should be relatively straightforward to determine. I do not want to pre-empt the consultation and the regulations, but my expectation is that those are done in a relatively short time. The regulations in clause 141(2)

“may…include provisions about the following matters”—

it then lists all the different things—and the total amount of time the complaint must take to resolve in its totality is not one of them. However, because the word “include” is used, it could include a total time limit. If the regulations were to set a total time limit, one would have to be a little careful, because clearly some matters are more complicated than others. The hon. Member for Aberdeen North acknowledged that we would not want to sacrifice quality and thoroughness for speed. If an overall time limit were set, it would have to accommodate cases that were so complicated or difficult, or that required so much additional information, that they could not be done in a period of, say, 90 days. I put on record that that is something that the consultation should carefully consider. We are proceeding in this way—with a consultation followed by regulations—rather than putting a time limit in the Bill because it is important to get this right.

The question was asked: why regulations rather than Ofcom? This is quite an important area, as the hon. Member for Aberdeen North and the shadow Minister—the hon. Member for Worsley and Eccles South—have said. This element of governmental and parliamentary oversight is important, hence our having regulations, rather than letting Ofcom write its own rules at will. We are talking about an important mechanism, and we want to make sure that it is appropriately responsive.

The question was asked: why will the regulations be subject to the negative, rather than the affirmative, procedure? Clearly that is a point of detail, albeit important detail. Our instinct was that the issue was perhaps of slightly less parliamentary interest than the eligible entity list, which will be keenly watched by many external parties. The negative procedure is obviously a little more streamlined. There is no hard-and-fast rule as to why we are using negative rather than affirmative, but that was broadly the thinking. There will be a consultation, in which Ofcom will certainly be consulted. Clause 141(3) makes it clear that others can be consulted too. That consultation will be crucial in ensuring that we get this right and that the process is as quick as it can be—that is important—but also delivers the right result. I gently resist amendment 153 and commend clauses 140 to 142.

Kirsty Blackman

Some Acts that this Parliament has passed have provided for a time limit within which something must be considered, but the time limit can be extended if the organisation concerned says to the Secretary of State, “Look, this is too complicated. We don’t believe that we can do this.” I think that was the case for the Subsidy Control Act 2022, but I have been on quite a few Bill Committees, so I may be wrong about that. That situation would be the exception, obviously, rather than the rule, and would apply only in the most complicated cases.

Chris Philp

The hon. Lady is suggesting a practical solution: a default limit that can be extended if the case is very complicated. That sort of structure can certainly be consulted on and potentially implemented in regulations. She referred to asking the Secretary of State’s permission. Opposition Members have been making points about the Secretary of State having too much power. Given that we are talking here about the regulator exercising their investigatory power, that kind of extension probably would not be something that we would want the Secretary of State’s permission for; we would find some other way of doing it. Perhaps the chief executive of Ofcom would have to sign it off, or some other body that is independent of Government.

Kirsty Blackman

Sorry, I phrased that quite badly. My point was more about having to justify things—having to say, “Look, we are sorry; we haven’t managed to do this in the time in which we were expected to. This is our justification”—rather than having to get permission. Apologies for phrasing that wrongly. I am glad that the Minister is considering including that point as something that could be suggested in the consultation.

I appreciate what the Minister says, but I still think we should have a time limit in the Bill, so I am keen to push amendment 153 to a vote.

Question put and agreed to.

Clause 140 accordingly ordered to stand part of the Bill.

Clause 141

Procedure for super-complaints

Amendment proposed: 153, in clause 141, page 121, line 32, after “140” insert

“, which must include the requirement that OFCOM must respond to such complaints within 90 days”—(Kirsty Blackman.)

Question put, That the amendment be made.

--- Later in debate ---
Kirsty Blackman

I want to talk about a specific example. Perhaps the Minister will be able to explain why the legislation is written this way around when I would have written it the opposite way around, much more in line with proposed new clause 10.

Snapchat brought in the Snap Map feature, which involved having geolocation on every individual’s phone; whenever anyone took a photo to put it on Snapchat, that geolocation was included. The feature was automatically turned on for all Snapchat users when it first came in, I think in 2017. No matter what age they were, when they posted their story on Snapchat, which is available to anyone on their friends list and sometimes wider, anyone could see where they were. If a child had taken a photo at their school and put it on Snapchat, anyone could see what school they went to. It was a major security concern for parents.

That very concerning situation genuinely could have resulted in children and other vulnerable people, who may not have even known that the feature had been turned on by default and would not know how to turn on ghost mode in Snapchat so as not to post their location, being put at risk. The situation could have been helped if media literacy duties had kicked in that meant that the regulator had to say, “This is a thing on Snapchat: geolocation is switched on. Please be aware of this if your children or people you are responsible for are using Snapchat.”

Chris Philp

Is the hon. Member aware of a similar situation that arose more recently with Strava? People’s running routes were publicly displayed in the same way, which led to incidents of stalking.

--- Later in debate ---
Kirsty Blackman

I was aware that Strava did that mapping, which is why my friends list on Strava numbers about two people, but I was not aware that the routes had been publicly displayed. There are similar issues with routes being public on services such as Garmin, so it is important to keep a note of that. I did not know that that information was public on Strava. If Ofcom had had the duty to ensure that people were aware of that, it would have been much easier for parents and vulnerable adults to take those decisions or have them taken on their behalf.

My reading of the clause is that if Ofcom comes across a problem, it will have to go and explain to the Secretary of State that it is a problem and get the Secretary of State to instruct it to take action. I do not think that makes sense. We have talked already about the fact that the Secretary of State cannot be an expert in everything. The Secretary of State cannot necessarily know the inner workings of Snapchat, Strava, TikTok and whatever other new platforms emerge. It seems like an unnecessary hurdle to stop Ofcom taking that action on its own, when it is the expert. The Minister is likely to say that the Secretary of State will say, “Yes, this is definitely a problem and I will easily instruct you to do this”—

Kirsty Blackman

The Minister will get the chance to make a proper speech in which he can respond.

It could be that the process is different from the one I see from reading the Bill. The Minister’s clarifications will be helpful to allow everyone to understand how the process is supposed to work, what powers Ofcom is supposed to have and whether it will have to wait for an instruction from the Secretary of State, which is what it looks like. That is why proposed new clause 10 is so important, because it would allow action to be taken to alert people to safety concerns. I am focusing mostly on that.

I appreciate that national security is also very important, but I thought I would take the opportunity to highlight specific concerns with individual platforms and to say to the Minister that we need Ofcom to be able to act and to educate the public as well as it possibly can, and to do so without having to wait for an instruction.

--- Later in debate ---
Chris Philp

That is obviously an operational matter for Ofcom. We would encourage it to do as much as possible. We encouraged it through our media literacy strategy, and it published an updated policy on media literacy in December last year. If Members feel that there are areas of media literacy in which Ofcom could do more, they will have a good opportunity to raise those questions when senior Ofcom officials next appear before the Digital, Culture, Media and Sport Committee or any other parliamentary Committee.

The key point is that the measures in new clause 10 are already in legislation, so the new clause is not necessary. The Secretary of State’s powers under clause 146 do not introduce a requirement for permission—they are two separate things. In addition to Ofcom’s existing powers to act of its own volition, the clause gives the Secretary of State powers to issue directions in certain very limited circumstances. A direction may be issued where there is a present threat—I stress the word “threat”—to the health or safety of the public or to national security, and only in relation to media literacy. We are talking about extremely narrowly defined powers.

Kirsty Blackman

The Minister said “a present threat”, but the clause says “present a threat”. The two mean different things. To clarify, could he confirm that he means “present a threat”?

Chris Philp

The hon. Lady is quite right to correct me. I do mean “present a threat”, as it is written in the Bill—I apologise for inadvertently transposing the words.

Is it reasonable that the Secretary of State has those very limited and specific powers? Why should they exist at all? Does this represent an unwarranted infringement of Ofcom’s freedom? I suppose those are the questions that the Opposition and others might ask. The Government say that, yes, it is reasonable and important, because in those particular areas—health and safety, and national security—there is information to which only the Government have access. In relation to national security, for example, information gathered by the UK intelligence community—GCHQ, the Secret Intelligence Service and MI5—is made available to the Government but not more widely. It is certainly not information that Ofcom would have access to. That is why the Secretary of State has the power to direct in those very limited circumstances.

I hope that, following that explanation, the Committee will see that new clause 10 is not necessary because it replicates an existing power, and that clause 146 is a reasonable provision.

--- Later in debate ---
Kirsty Blackman

I have a question about subsection (4)(b), which says that the guidance can be replaced more frequently than once every three years. I understand subsection (4)(a)—that is fine—but subsection (4)(b) says that the guidance can be changed if

“the revision or replacement is by agreement between the Secretary of State and OFCOM.”

How will those of us who are not the Secretary of State or Ofcom know that there has been an agreement that the guidance can be changed and that the Secretary of State is not just acting on their own? If the guidance is changed because of an agreement, will there be a line in the guidance that says, “The Secretary of State has agreed with Ofcom to publish this only 1.5 years after the last guidance was put out, because of these reasons”? In the interests of transparency, it would be helpful for something like that to be included in the guidance, if it was being changed outside the normal three-year structure.

Chris Philp

It is better than being in the guidance, which is non-statutory, because it is in the Bill—it is right here in front of us in the measure that the hon. Lady just referred to, clause 147(4)(b). If the Secretary of State decided to issue updated guidance in less than three years without Ofcom’s consent, that would be unlawful; that would be in breach of this statute, and it would be a very straightforward matter to get that struck down. It would be completely illegal to do that.

My expectation would be that if updated guidance was issued in less than three years, it would be accompanied by written confirmation that Ofcom had agreed. I imagine that if a future Secretary of State—I cannot imagine the current Secretary of State doing it—published guidance in less than three years without Ofcom’s consent, Ofcom would not be shy in pointing that out, but to do that would be illegal. It would be unlawful; it would be a breach of this measure in the Bill.

I hope that the points that I have just made about the safeguards in clause 147, and the assurance and clarity that I have given the Committee about the intent that guidance will be at the strategic level rather than the operational level, gives Members the assurance they need to support the clause.

Question put, That the clause stand part of the Bill.

--- Later in debate ---
Chris Philp

One of the pieces of legislation that could be used is this Bill, because it is in scope. If the hon. Lady can bear with me until Report, I will say more about the specific legislative vehicle that we propose to use.

On the precise wording to be used, I will make a couple of points about the amendments that have been tabled—I think amendment 113 is not being moved, but I will speak to it anyway. Amendment 112, which was tabled by the hon. Member for Batley and Spen, talks about bringing physical harm in general into the scope of clause 150. Of course, that goes far beyond epilepsy trolling, because it would also bring into scope the existing offence of assisting or encouraging suicide, so there would be duplicative law: there would be the existing offence of assisting or encouraging suicide and the new offence, because a communication that encouraged physical harm would do the same thing.

If we included all physical harm, it would duplicate the proposed offence of assisting or encouraging self-harm that is being worked on by the Ministry of Justice and the Law Commission. It would also duplicate offences under the Offences Against the Person Act 1861, because if a communication caused one person to injure another, there would be duplication between the offence that will be created by clause 150 and the existing offence. Clearly, we cannot have two offences that criminalise the same behaviour. To the point made by the hon. Member for Aberdeen North, it would not be right to create two epilepsy trolling offences. We just need one, but it needs to be right.

Kirsty Blackman

Will the Minister give way?

Chris Philp

In a second.

The physical harm extension goes way beyond the epilepsy point, which is why I do not think that that would be the right way to do it, although the Government have accepted that we will do it and need to do it, but by a different mechanism.

I was about to speak to amendment 113, the drafting of which specifically mentions epilepsy and which was tabled by my hon. Friend the Member for Blackpool North and Cleveleys (Paul Maynard), but was the hon. Lady’s question about the previous point?

Kirsty Blackman

My question was about the announcement that the Minister is hoping to make on Report. I appreciate that he has committed to introduce the new offence, which is great. If the Bill is to be the legislative vehicle, does he expect to amend it on Report, or does he expect that that will have to wait until the amendment goes through the Lords?

Chris Philp

That is a good question, and it ties into my next point. Clearly, amendment 113 is designed to create a two-sentence epilepsy trolling offence. When trying to create a brand-new offence—in this case, epilepsy trolling—it is unlikely that two sentences’ worth of drafting will do the trick, because a number of questions need to be addressed. For example, the drafting will need to consider what level of harm should be covered and exactly what penalty would be appropriate. If it was in clause 150, the penalty would be two years, but it might be higher or lower, which needs to be addressed. The precise definitions of the various terms need to be carefully defined as well, including “epilepsy” and “epileptic seizures” in amendment 113, which was tabled by my hon. Friend the Member for Blackpool North and Cleveleys. We need to get proper drafting.

My hon. Friend the Member for Eastbourne mentioned that the Epilepsy Society had some thoughts on the drafting. I know that my colleagues in the Ministry of Justice and, I am sure, the office of the parliamentary counsel, would be keen to work with experts from the Epilepsy Society to ensure that the drafting is correct. Report will likely be before summer recess—it is not confirmed, but I am hoping it will be—and getting the drafting nailed down that quickly would be challenging.

I hope that, in a slightly indirect way, that answers the question. We do not have collective agreement about the precise legislative vehicle to use; however, I hope it addresses the questions about how the timing and the choreography could work.

Online Safety Bill (Thirteenth sitting)

Debate between Kirsty Blackman and Chris Philp
Committee stage & Committee Debate - 13th sitting
Tuesday 21st June 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 21 June 2022 - (21 Jun 2022)
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

It is a pleasure to serve under your chairmanship once again, Ms Rees, and I congratulate Committee members on evading this morning’s strike action.

I am delighted that the shadow Minister supports the intent behind these clauses, and I will not speak at great length given the unanimity on this topic. As she said, clause 118 allows Ofcom to impose a financial penalty for failure to take specified steps by a deadline set by Ofcom. The maximum penalty that can be imposed is the greater of £18 million or 10% of qualifying worldwide revenue. In the case of large companies, it is likely to be a much larger amount than £18 million.

Clause 119 enables Ofcom to impose financial penalties if the recipient of a section 103 notice does not comply by the deadline. It is very important to ensure that section 103 has proper teeth. Government amendments 154 to 157 make changes that allow Ofcom to recover not only the cost of running the service once the Bill comes into force and into the future but also the preparatory cost of setting up for the Bill to come into force.

As previously discussed, £88 million of funding is being provided to Ofcom in this financial year and next. We believe that something like £20 million of costs that predate these financial years have been funded as well. That adds up to around £108 million. However, the amount that Ofcom recovers will be the actual cost incurred. The figure I provided is simply an indicative estimate. The actual figure would be based on the real costs, which Ofcom would be able to recoup under these measures. That means that the taxpayer—our constituents—will not bear any of the costs, including the set-up and preparatory cost. This is an equitable and fair change to the Bill.

Clause 120 sets out that some regulated providers will be required to pay a regulatory fee to Ofcom, as set out in clause 71. Clause 120 allows Ofcom to impose a financial penalty if a regulated provider does not pay its fee by the deadline it sets. Finally, clause 121 sets out the information that needs to be included in these penalty notices issued by Ofcom.

Kirsty Blackman (Aberdeen North) (SNP)

I have questions about the management of the fees and the recovery of the preparatory cost. Does the Minister expect that the initial fees will be higher as a result of having to recoup the preparatory cost and will then reduce? How quickly will the preparatory cost be recovered? Will Ofcom recover it quickly or over a longer period of time?

Chris Philp

The Bill provides a power for Ofcom to recover those costs. It does not specify over what time period. I do not think they will be recouped over a period of years. Ofcom can simply recoup the costs in a single hit. I would imagine that Ofcom would seek to recover these costs pretty quickly after receiving these powers. The £108 million is an estimate. The actual figure may be different once the reconciliation and accounting is done. It sounds like a lot of money, but it is spread among a number of very large social media firms. It is not a large amount of money for them in the context of their income, so I would expect that recouping to be done on an expeditious basis—not spread over a number of years. That is my expectation.

Question put and agreed to.

Clause 118 accordingly ordered to stand part of the Bill.

Clause 119 ordered to stand part of the Bill.

Clause 120

Non-payment of fee

Amendments made: 154, in clause 120, page 102, line 20, after “71” insert—

“or Schedule (Recovery of OFCOM’s initial costs)”.

This amendment, and Amendments 155 to 157, ensure that Ofcom have the power to impose a monetary penalty on a provider of a service who fails to pay a fee that they are required to pay under NS2.

Amendment 155, in clause 120, page 102, line 21, leave out “that section” and insert “Part 6”.

Amendment 156, in clause 120, page 102, line 26, after “71” insert—

“or Schedule (Recovery of OFCOM’s initial costs)”

Amendment 157, in clause 120, page 103, line 12, at end insert—

“or Schedule (Recovery of OFCOM’s initial costs)”.—(Chris Philp.)

Clause 120, as amended, ordered to stand part of the Bill.

Clause 121 ordered to stand part of the Bill.

Clause 122

Amount of penalties etc

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Chris Philp

I repeat the point I made to the hon. Member for Liverpool, Walton a moment ago. This is simply an obligation to consult. The clause gives the Secretary of State an opportunity to offer an opinion, but it is just that—an opinion. It is not binding on Ofcom, which may take that opinion into account or not at its discretion. This provision sits alongside the requirement to consult the Information Commissioner’s Office. I respectfully disagree with the suggestion that it represents unwarranted and inappropriate interference in the operation of a regulator. Consultation between organs of state is appropriate and sensible, but in this case it does not fetter Ofcom’s ability to act at its own discretion. I respectfully do not agree with the shadow Minister’s analysis.

Kirsty Blackman

Apologies, Ms Rees, for coming in a bit late on this, but I was not aware of the intention to vote against the clause. I want to make clear what the Scottish National party intends to do, and the logic behind it. The inclusion of Government amendment 7 is sensible, and I am glad that the Minister has tabled it. Clause 129 is incredibly important, and the requirement to publish guidance will ensure that there is a level of transparency, which we and the Labour Front Benchers have been asking for.

The Minister has been clear about the requirement for Ofcom to consult the Secretary of State, rather than to be directed by them. As a whole, this Bill gives the Secretary of State far too much power, and far too much ability to intervene in the workings of Ofcom. In this case, however, I do not have an issue with the Secretary of State being consulted, so I intend to support the inclusion of this clause, as amended by Government amendment 7.



Question put, That the amendment be made.

--- Later in debate ---
Chris Philp

Clearly, resourcing of the upper tribunal is a matter decided jointly by the Lord Chancellor and the Secretary of State for Justice, in consultation with the Lord Chief Justice, and, in this case, the Senior President of Tribunals. Parliament would expect the resourcing of that part of the upper tribunal to be such that cases could be heard in an expedited manner. Particularly where cases concern the safety of the public—and particularly of children—we expect that to be done as quickly as it can.

Question put and agreed to.

Clause 138 accordingly ordered to stand part of the Bill.

Clause 139 ordered to stand part of the Bill.

Clause 140

Power to make super-complaints

Kirsty Blackman

I beg to move amendment 143, in clause 140, page 121, line 1, after “services” insert “, consumers”.

--- Later in debate ---
Chris Philp

Clearly, we want the super-complaint function to be as effective as possible and for groups of relevant people, users or members of the public to be able to be represented by an eligible entity to raise super-complaints. I believe we are all on the same page in wanting to do that. If I am honest, I am a little confused as to what the addition of the term “consumers” will add. The term “users” is defined quite widely, via clause 140(6), which then refers to clause 181, where, as debated previously, a “user” is defined widely to include anyone using a service, whether registered or not. So if somebody stumbles across a website, they count as a user, but the definition being used in clause 140 about bringing super-complaints also includes “members of the public”—that is, regular citizens. Even if they are not a user of that particular service, they could still be represented in bringing a complaint.

Given that, by definition, “users” and “members of the public” already cover everybody in the United Kingdom, I am not quite sure what the addition of the term “consumers” adds. By definition, consumers are a subset of the group “users” or “members of the public”. It follows that in seeking to become an eligible entity, no eligible entity will purport to act for everybody in the United Kingdom; they will always be seeking to define some kind of subset of people. That might be children, people with a particular vulnerability or, indeed, consumers, who are one such subset of “members of the public” or “users”. I do not honestly understand what the addition of the word “consumers” adds here when everything is covered already.

Kirsty Blackman

Will the Minister explicitly say that he thinks that an eligible entity, acting on behalf of consumers, could, if it fulfils the other criteria, bring a super-complaint?

--- Later in debate ---
Chris Philp

Yes, definitely. That is the idea of an eligible entity, which could seek to represent a particular demographic, such as children or people from a particular marginalised group, or it could represent people who have a particular interest, which would potentially include consumers. So I can confirm that that is the intention behind the drafting of the Bill. Having offered that clarification and made clear that the definition is already as wide as it conceivably can be—we cannot get wider than “members of the public”—I ask the hon. Member for Aberdeen North to consider withdrawing the amendments, particularly as there are so many. It will take a long time to vote on them.

Kirsty Blackman

I thank the Minister for the clarification. Given that he has explicitly said that he expects that groups acting on behalf of consumers could, if they fulfil the other criteria, be considered as eligible entities for making super-complaints, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Amendment proposed: 66, in clause 140, page 121, line 8, at end insert—

“(d) causing harm to any human or animal.”

This amendment ensures groups are able to make complaints regarding animal abuse videos.—(Alex Davies-Jones.)

--- Later in debate ---
Chris Philp

I think the Committee, and the House, are pretty unanimous in agreeing that the power to make super-complaints is important. As we have discussed, there are all kinds of groups, such as children, under-represented groups and consumers, that would benefit from being represented where systemic issues are not being addressed and that Ofcom may have somehow overlooked or missed in the discharge of its enforcement powers.

I would observe in passing that one of the bases on which super-complaints can be made—this may be of interest to my hon. Friend the Member for Don Valley—is where there is a material risk under clause 140(1)(b) of

“significantly adversely affecting the right to freedom of expression within the law of users of the services or members of the public”.

That clause is another place in the Bill where freedom of expression is expressly picked out and supported. If freedom of expression is ever threatened in a way that we have not anticipated and that the Bill does not provide for, there is a particular power here for a particular free speech group, such as the Free Speech Union, to make a super-complaint. I hope that my hon. Friend finds the fact that freedom of expression is expressly laid out there reassuring.

Let me now speak to the substance of amendment 77, tabled by the hon. Member for Aberdeen North. It is important to first keep in mind the purpose of the super-complaints, which, as I said a moment ago, is to provide a basis for raising issues of widespread and systemic importance. That is the reason for some of the criteria in subsection (1)(a), (b) and (c), and why we have subsection (2)—because we want to ensure that super-complaints are raised only if they are of a very large scale or have a profound impact on freedom of speech or some other matter of particular importance. That is why the tests, hurdles and thresholds set out in clause 140(2) have to be met.

If we were to remove subsection (2), as amendment 77 seeks to, that would significantly lower the threshold. We would end up having super-complaints that were almost individual in nature. We set out previously why we think an ombudsman-type system or having super-complaints used for near-individual matters would not be appropriate. That is why the clause is there, and I think it is reasonable that it is.

The hon. Lady asked a couple of questions about how this arrangement might operate in practice. She asked whether a company such as Facebook would be caught if it alone were doing something inappropriate. The answer is categorically yes, because the condition in clause 140(2)(b)—

“impacts on a particularly large number of users”,

which would be a large percentage of Facebook’s users,

“or members of the public”—

would be met. Facebook and—I would argue—any category 1 company would, by definition, be affecting large numbers of people. The very definition of category 1 includes the concept of reach—the number of people being affected. That means that, axiomatically, clause 140(2)(b) would be met by any category 1 company.

The hon. Lady also raised the question of Facebook, for a period of time in Europe, unilaterally ceasing to scan for child sexual exploitation and abuse images, which, as mentioned, led to huge numbers of child sex abuse images and, consequently, huge numbers of paedophiles not being detected. She asked how these things would be handled under the clause if somebody wanted to raise a super-complaint about that. Hopefully, Ofcom would stop them happening in the first place, but if it did not, the super-complaint redress mechanism would be the right one. These things would categorically be caught by clause 140(2)(a), because they are clearly of particular importance.

In any reasonable interpretation of the words, the test of “particular importance” is manifestly met when it comes to stopping child sexual exploitation and abuse and the detection of those images. That example would categorically qualify under the clause, and a super-complaint could, if necessary, be brought. I hope it would never be necessary, because that is the kind of thing I would expect Ofcom to catch.

Having talked through the examples from the hon. Lady, I hope I have illustrated how the clause will ensure that either large-scale issues affecting large numbers of people or issues that are particularly serious will still qualify for super-complaint status with subsection (2) left in the Bill. Given those assurances, I urge the hon. Member to consider withdrawing her amendment.

Kirsty Blackman

I welcome the Minister’s fairly explicit explanation that he believes that every category 1 company would be in scope, even if there was a complaint against one single provider. I would like to push the amendment to a vote on the basis of the comments I made earlier and the fact that each of these platforms is different. We have heard concerns about, for example, Facebook groups being interested in celebrating eight-year-olds’ birthdays. We have heard about the amount of porn on Twitter, which Facebook does not have in the same way. We have heard about the kind of algorithmic stuff that takes people down a certain path on TikTok. We have heard all these concerns, but they are all specific to that one provider. They are not a generic complaint that could be brought toward a group of providers.

Chris Philp

Would the hon. Lady not agree that in all those examples—including TikTok and leading people down dark paths—the conditions in subsection (2) would be met? The examples she has just referred to are, I would say, certainly matters of particular importance. Because the platforms she mentions are big in scale, they would also meet the test of scale in paragraph (b). In fact, only one of the tests has to be met—it is one or the other. In all the examples she has just given, not just one test—paragraph (a) or (b)—would be met, but both. So all the issues she has just raised would make a super-complaint eligible to be made.

Kirsty Blackman

I am glad the Minister confirms that he expects that that would be the case. I am clearer now that he has explained it, but on my reading of the clause, the definitions of “particular importance” or

“a particularly large number of users…or members of the public”

are not clear. I wanted to ensure that this was put on the record. While I do welcome the Minister’s clarification, I would like to push amendment 77 to a vote.

Question put, That the amendment be made.

Online Safety Bill (Twelfth sitting)

Debate between Kirsty Blackman and Chris Philp
Kirsty Blackman (Aberdeen North) (SNP)

I have a few questions, concerns and suggestions relating to these clauses. I think it was the hon. Member for Don Valley who asked me last week about the reports to the National Crime Agency and how that would work—about how, if a human was not checking those things, there would be an assurance that proper reports were being made, and that scanning was not happening and reports were not being made when images were totally legal and there was no problem with them. [Interruption.] I thought it was the hon. Member for Don Valley, although it may not have been. Apologies—it was a Conservative Member. I am sorry for misnaming the hon. Member.

The hon. Member for Pontypridd made a point about the high level of accuracy of the technologies. That should give everybody a level of reassurance that the reports that are and should be made to the National Crime Agency on child sexual abuse images will be made on a highly accurate basis, rather than a potentially inaccurate one. Actually, some computer technology—particularly for scanning for images, rather than text—is more accurate than human beings. I am pleased to hear those particular statistics.

Queries have been raised on this matter by external organisations—I am particularly thinking about the NSPCC, which we spoke about earlier. The Minister has thankfully given a number of significant reassurances about the ability to proactively scan. External organisations such as the NSPCC are still concerned that there is not enough on the face of the Bill about proactive scanning and ensuring that the current level of proactive scanning is able—or required—to be replicated when the Bill comes into action.

During an exchange in an earlier Committee sitting, the Minister gave a commitment—I am afraid I do not have the quote—to being open to looking at amending clause 103. I am slightly disappointed that there are no Government amendments, but I understand that there has been only a fairly short period; I am far less disappointed than I was previously, when the Minister had much more time to consider the actions he might have been willing to take.

The suggestion I received from the NSPCC is about the gap in the Bill regarding the ability of Ofcom to take action. These clauses allow Ofcom to take action against individual providers about which it has concerns; those providers will have to undertake duties set out by Ofcom. The NSPCC suggests that there could be a risk register, or that a notice could be served on a number of companies at one time, rather than Ofcom simply having to pick one company, or to repeatedly pick single companies and serve notices on them. Clause 83 outlines a register of risk profiles that must be created by Ofcom. It could therefore serve notice on all the companies that fall within a certain risk profile or all the providers that have common functionalities.

If there were a new, emerging concern, that would make sense. Rather than Ofcom having to go through the individual process with all the individual providers when it knows that there is common functionality—because of the risk assessments that have been done and Ofcom’s oversight of the different providers—it could serve notice on all of them in one go. It could not then accidentally miss one out and allow people to move to a different platform that had not been mentioned. I appreciate the conversation we had around this issue earlier, and the opportunity to provide context in relation to the NSPCC’s suggestions, but it would be great if the Minister would be willing to consider them.

I have another question, to which I think the Minister will be able to reply in the affirmative, which is on the uses of the technology as it evolves. We spoke about that in an earlier meeting. The technology that we have may not be what we use in the future to scan for terrorist-related activity or child sexual abuse material. It is important that the Bill adequately covers future conditions. I think that it does, but will the Minister confirm that, as technology advances and changes, these clauses will adequately capture the scanning technologies that are required, and any updates in the way in which platforms work and we interact with each other on the internet?

I have fewer concerns about future-proofing with regard to these provisions, because I genuinely think they cover future conditions, but it would be incredibly helpful and provide me with a bit of reassurance if the Minister could confirm that. I very much look forward to hearing his comments on clause 103.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Let me start by addressing some questions raised by hon. Members, beginning with the last point made by the hon. Member for Aberdeen North. She sought reconfirmation that the Bill will keep up with future developments in accredited technology that are not currently contemplated. The answer to her question can be found in clause 105(9), in which the definition of accredited technology is clearly set out, as technology that is

“accredited (by OFCOM or another person appointed by OFCOM) as meeting minimum standards of accuracy”.

That is not a one-off determination; it is a determination, or an accreditation, that can happen from time to time, periodically or at any point in the future. As and when new technologies emerge that meet the minimum standards of accuracy, they can be accredited, and the power in clause 103 can be used to compel platforms to use those technologies. I hope that provides the reassurance that the hon. Member was quite rightly asking for.

The shadow Minister, the hon. Member for Pontypridd, asked a related question about the process for publishing those minimum standards. The process is set out in clause 105(10), which says that Ofcom will give advice to the Secretary of State on the appropriate minimum standards, and the minimum standards will then be

“approved…by the Secretary of State, following advice from OFCOM.”

We are currently working with Ofcom to finalise the process for setting those standards, which of course will need to take a wide range of factors into account.

Let me turn to the substantive clauses. Clause 103 is extremely important, because as we heard in the evidence sessions and as Members of the Committee have said, scanning messages using technology such as hash matching, to which the shadow Minister referred, is an extremely powerful way of detecting CSEA content and providing information for law enforcement agencies to arrest suspected paedophiles. I think it was in the European Union that Meta—particularly Facebook and Facebook Messenger—stopped using this scanning for a short period of time due to misplaced concerns about privacy laws, and the number of referrals of CSEA images and the number of potential paedophiles who were referred to law enforcement dropped dramatically.

A point that the hon. Member for Aberdeen North and I have discussed previously is that it would be completely unacceptable if a situation arose whereby these messages—I am thinking particularly about Facebook Messenger—did not get scanned for CSEA content in a way that they do get scanned today. When it comes to preventing child sexual exploitation and abuse, in my view there is no scope for compromise or ambiguity. That scanning is happening at the moment; it is protecting children on a very large scale and detecting paedophiles on quite a large scale. In my view, under no circumstances should that scanning be allowed to stop. That is the motivation behind clause 103, which provides Ofcom with the power to make directions to require the use of accredited technology.

As the hon. Member for Aberdeen North signalled in her remarks, given the importance of this issue the Government are of course open to thinking about ways in which the Bill can be strengthened if necessary, because we do not want to leave any loopholes. I urge any social media firms watching our proceedings never to take any steps that degrade or reduce the ability to scan for CSEA content. I thank the hon. Member for sending through the note from the NSPCC, which I have received and will look at internally.

--- Later in debate ---
Kirsty Blackman

I echo the sentiments that have been expressed by the shadow Minister, and thank her and her colleagues for tabling this amendment and giving voice to the numerous organisations that have been in touch with us about this matter. The Scottish National party is more than happy to support the amendment, which would make the Bill stronger and better, and would better enable Ofcom to take action when necessary.

Chris Philp

I understand the spirit behind these amendments, focusing on the word “presence” rather than “prevalence” in various places. It is worth keeping in mind that throughout the Bill we are requiring companies to implement proportionate systems and processes to protect their users from harm. Even in the case of the most harmful illegal content, we are not placing the duty on companies to remove every single piece of illegal content that has ever appeared online, because that is requesting the impossible. We are asking them to take reasonable and proportionate steps to create systems and processes to do so. It is important to frame the legally binding duties in a way that makes them realistically achievable.

As the shadow Minister said, amendments 35, 36, 39 and 40 would replace the word “prevalence” with “presence”. That would change Ofcom’s duty to enforce not just against content that was present in significant numbers—prevalent—but against a single instance, which would be enough to engage the clause.

We mutually understand the intention behind these amendments, but we think the significant powers to compel companies to adopt certain technology contained in clause 103 should be engaged only where there is a reasonable level of risk. For example, if a single piece of content was present on a platform, it may not be reasonable or proportionate to force the company to adopt certain new technologies, where indeed they do not do so at the moment. The use of “prevalence” ensures that the powers are used where necessary.

It is clear—there is no debate—that in the circumstances where scanning technology is currently used, which includes on Facebook Messenger, there is enormous prevalence of material. To elaborate on a point I made in a previous discussion, anything that stops that detection happening would be unacceptable and, in the Government’s view, it would not be reasonable to lose the ability to detect huge numbers of images in the service of implementing encryption, because there is nothing more important than scanning against child sexual exploitation images.

However, we think adopting the amendment and replacing the word “prevalence” with “presence” would create an extremely sensitive trigger that would be engaged on almost every site, even tiny ones or where there was no significant risk, because a single example would be enough to trigger the amendment, as drafted. Although I understand the spirit of the amendment, it moves away from the concepts of proportionality and reasonableness in the systems and processes that the Bill seeks to deliver.

Amendment 37 seeks to widen the criteria that Ofcom must consider when deciding to use clause 103 powers. It is important to ensure that Ofcom considers a wide range of factors, taking into account the harm occurring, but clause 104(2)(f) already requires Ofcom to consider

“the level of risk of harm to individuals in the United Kingdom presented by relevant content, and the severity of that harm”.

Therefore, the Bill already contains provision requiring Ofcom to take those matters into account, as it should, but the shadow Minister is right to draw attention to the issue.

Finally, amendment 38 seeks to amend clause 116 to require Ofcom to consider the risk of harm posed by individuals in the United Kingdom, in relation to adults and children in the UK or elsewhere, through the production, publication and dissemination of illegal content. In deciding whether to make a confirmation decision requiring the use of technology, it is important that Ofcom considers a wide range of factors. However, clause 116(6)(e) already proposes to require Ofcom to consider, in particular, the risk and severity of harm to individuals in the UK. That is clearly already in the Bill.

I hope that this analysis provides a basis for the shadow Minister to accept that the Bill, in this area, functions as required. I gently request that she withdraw her amendment.

--- Later in debate ---
Kirsty Blackman

I have a quick question for the Minister about the timelines in relation to the guidance and the commitment that Ofcom gave to producing a road map before this coming summer. When is that guidance likely to be produced? Does that road map relate to the guidance in this clause, as well as the guidance in other clauses? If the Minister does not know the answer, I have no problem with receiving an answer at a later time. Does the road map include this guidance as well as other guidance that Ofcom may or may not be publishing at some point in the future?

Chris Philp

I welcome the cross-party support for the provisions set out in these important clauses. Clause 107 points out the requirement for Ofcom to publish guidance, which is extremely important. Clause 108 makes sure that it publishes an annual report. Clause 109 covers the interpretations.

The hon. Member for Aberdeen North asked the only question, about the contents of the Ofcom road map, which in evidence it committed to publishing before the summer. I cannot entirely speak for Ofcom, which is of course an independent body. In order to avoid me giving the Committee misleading information, the best thing is for officials at the Department for Digital, Culture, Media and Sport to liaise with Ofcom and ascertain what the exact contents of the road map will be, and we can report that back to the Committee by letter.

It will be fair to say that the Committee’s feeling—I invite hon. Members to intervene if I have got this wrong—is that the road map should be as comprehensive as possible. Ideally, it would lay out the intended plan to cover all the activities that Ofcom would have to undertake in order to make the Bill operational, and the more detail there is, and the more comprehensive the road map can be, the happier the Committee will be.

Officials will take that away, discuss it with Ofcom and we can revert with fuller information. Given that the timetable was to publish the road map prior to the summer, I hope that we are not going to have to wait very long before we see it. If Ofcom is not preparing it now, it will hopefully hear this discussion and, if necessary, expand the scope of the road map a little bit accordingly.

Question put and agreed to.

Clause 107 accordingly ordered to stand part of the Bill.

Clauses 108 and 109 ordered to stand part of the Bill.

Clause 110

Provisional notice of contravention

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Chris Philp

This data is a little different—the two domains do not directly correspond. In the health area, there has been litigation—an artificial intelligence company is currently engaged in litigation with an NHS hospital trust about a purported breach of patient data rules—so even in that long-established area, there is uncertainty and recent, or perhaps even current, litigation.

We are asking for the report to be done to ensure that those important issues are properly thought through. Once they are, Ofcom has the power under clause 136 to lay down guidance on providing access for independent researchers to do their work.

Kirsty Blackman

The Minister has committed to Ofcom being fully resourced to do what it needs to do under the Bill, but he has spoken about time constraints. If Ofcom were to receive 25,000 risk assessments, for example, there simply would not be enough people to go through them. Does he agree that, in cases in which Ofcom is struggling to manage the volume of data and to do the level of assessment required, it may be helpful to augment that work with the use of independent researchers? I am not asking him to commit to that, but to consider the benefits.

Chris Philp

Yes, I would agree that bona fide academic independent researchers do have something to offer and to add in this area. The more we have highly intelligent, experienced and creative people looking at a particular problem or issue, the more likely we are to get a good and well-informed result. They may have perspectives that Ofcom does not. I agree that, in principle, independent researchers can add a great deal, but we need to ensure that we get that set up in a thoughtful and proper way. I understand the desire to get it done quickly, but it is important to take the time to do it not just quickly, but right. It is an area that does not exist already—at the moment, there is no concept of independent researchers getting access to the innards of social media companies’ data vaults—so we need to make sure that it is done in the right way, which is why it is structured as it is. I ask the Committee to stick with the drafting, whereby there will be a report and then Ofcom will have the power. I hope we end up in the same place—well, the same place, but a better place. The process may be slightly slower, but we may also end up in a better place for the consideration and thought that will have to be given.

Online Safety Bill (Eleventh sitting)

Debate between Kirsty Blackman and Chris Philp
Chris Philp

As we have heard, the clauses set out how different platforms will be categorised with the purpose of ensuring duties are applied in a reasonable and proportionate way that avoids over-burdening smaller businesses. However, it is worth being clear that the Online Safety Bill, as drafted, requires all in-scope services, regardless of their user size, to take action against content that is illegal and where it is necessary to protect children. It is important to re-emphasise the fact that there is no size qualification for the illegal content duties and the duties on the protection of children.

It is also important to stress that under schedule 10 as drafted there is flexibility, as the shadow Minister said, for the Secretary of State to change the various thresholds, including the size threshold, so there is an ability, if it is considered appropriate, to lower the size thresholds in such a way that more companies come into scope, if that is considered necessary.

It is worth saying in passing that we want these processes to happen quickly. Clearly, it is a matter for Ofcom to work through the operations of that, but our intention is that this will work quickly. In that spirit, in order to limit any delays to the process, Ofcom can rely on existing research, if that research is fit for purpose under schedule 10 requirements, rather than having to do new research. That will greatly assist moving quickly, because the existing research is available off the shelf immediately, whereas commissioning new research may take some time. For the benefit of Hansard and people who look at this debate for the application of the Bill, it is important to understand that that is Parliament’s intention.

I will turn to the points raised by the hon. Member for Aberdeen North and the shadow Minister about platforms that may be small and fall below the category 1 size threshold but that are none the less extremely toxic, owing to the way that they are set up, their rules and their user base. The shadow Minister mentioned several such platforms. I have had meetings with the stakeholders that she mentioned, and we heard their evidence. Other Members raised this point on Second Reading, including the right hon. Member for Barking (Dame Margaret Hodge) and my hon. Friend the Member for Brigg and Goole (Andrew Percy). As the hon. Member for Aberdeen North said, I signalled on Second Reading that the Government are listening carefully, and our further work in that area continues at pace.

I am not sure that amendment 80 as drafted would necessarily have the intended effect. Proposed new sub-paragraph (c) to schedule 10(1) would add a risk condition, but the conditions in paragraph (1) are applied with “and”, so they must all be met. My concern is that the size threshold would still apply, and that this specific drafting of the amendment would not have the intended effect.

We will not accept the amendments as drafted, but as I said on Second Reading, we have heard the representations—the shadow Minister and the hon. Member for Aberdeen North have made theirs powerfully and eloquently—and we are looking carefully at those matters. I hope that provides some indication of the Government’s thinking. I thank the stakeholders who engaged and provided extremely valuable insight on those issues. I commend the clause to the Committee.

Kirsty Blackman

I thank the Minister for his comments. I still think that such platforms are too dangerous not to be subject to more stringent legislation than similar-sized platforms. For the Chair’s information, I would like to press amendment 80 to a vote. If it falls, I will move straight to pressing amendment 82 to a vote, missing out amendment 81. Does that make sense, Chair, and is it possible?

--- Later in debate ---
Kirsty Blackman

I want to make a brief comment echoing the shadow Minister’s welcome for the inclusion of senior managers and named people in the Bill. I agree that that level of personal liability and responsibility is the only way that we will be able to hold some of these incredibly large, unwieldy organisations to account. If they could wriggle out of this by saying, “It’s somebody else’s responsibility,” and if everyone then disagreed about whose responsibility it was, we would be in a much worse place, so I also support the inclusion of these clauses and schedule 11.

Chris Philp

I am delighted by the strong support that these clauses have received from across the aisle. I hope that proves to be a habit-forming development.

On the shadow Minister’s point about publishing the risk assessments, to repeat the point I made a few days ago, under clause 64, which we have already debated, Ofcom has the power—indeed, the obligation—to compel publication of transparency reports that will make sure that the relevant information sees the light of day. I accept that publication is important, but we believe that objective is achieved via the transparency measures in clause 64.

On the point about senior management liability, which again we debated near the beginning of the Bill, we believe—I think we all agree—that this is particularly important for information disclosure. We had the example, as I mentioned at the time, of one of the very large companies refusing to disclose information to the Competition and Markets Authority in relation to a competition matter and simply paying a £50 million fine rather than complying with the duties. That is why criminal liability is so important here in relation to information disclosure.

To reassure the shadow Minister, on the point about when that kicks in, it was in the old version of the Bill, but potentially did not commence for two years. In this new version, updated following our extensive and very responsive listening exercise—I am going to get that in every time—the commencement of this particular liability is automatic and takes place very shortly after Royal Assent. The delay and review have been removed, for the reason the hon. Lady mentioned, so I am pleased to confirm that to the Committee.

The shadow Minister described many of the provisions. Clause 85 gives Ofcom powers to require information, clause 86 gives the power to issue notices and clause 87 the important power to require an entity to name that relevant senior manager, so they cannot wriggle out of their duty by not providing the name. Clause 88 gives the power to require companies to undergo a report from a so-called skilled person. Clause 89 requires full co-operation with Ofcom when it opens an investigation, where co-operation has been sadly lacking in many cases to date. Clause 90 requires people to attend an interview, and the introduction to schedule 11 allows Ofcom to enter premises to inspect or audit the provider. These are very powerful clauses and will mean that social media companies can no longer hide in the shadows from the scrutiny they so richly deserve.

Question put and agreed to.

Clause 85 accordingly ordered to stand part of the Bill.

Clauses 86 to 91 ordered to stand part of the Bill.

Schedule 11

OFCOM’s powers of entry, inspection and audit

Amendment made: 4, in schedule 11, page 202, line 17, leave out

“maximum summary term for either-way offences”

and insert

“general limit in a magistrates’ court”.—(Chris Philp.)

Schedule 11, as amended, agreed to.

Clause 92

Offences in connection with information notices

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Chris Philp

I am delighted that support for the Government’s position on the clauses continues and that cross-party unanimity is taking an ever stronger hold. I am sure the Whips Office will find that particularly reassuring.

The shadow Minister asked a question about clause 100. Clause 100 amends section 24B of the Communications Act 2003, which allows Ofcom to provide information to the Secretary of State to assist with the formulation of policy. She asked me to clarify what that means, which I am happy to do. In most circumstances, Ofcom will be required to obtain the consent of providers in order to share information relating to their business. This clause sets out two exceptions to that principle. If the information required by the Secretary of State was obtained by Ofcom to determine the proposed fees threshold, or in response to potential threats to national security or to the health or safety of the public, the consent of the business is not required. In those instances, it would obviously not be appropriate to require the provider’s consent.

It is important that users of regulated services are kept informed of developments around online safety and the operation of the regulatory framework.

Kirsty Blackman

This specifically relates to the Secretary of State, but would the Minister expect both Ofcom and his Department to be working with the Scottish Government and the Northern Ireland Executive? I am not necessarily talking about sharing all the information, but where there are concerns that it is very important for those jurisdictions to be aware of, will he try to ensure that he has a productive relationship with both devolved Administrations?

Chris Philp

I thank the hon. Member for her question. Where the matter being raised or disclosed touches on matters of devolved competence—devolved authority—then yes, I would expect that consultation to take place. Matters concerning the health and safety of the public are entirely devolved, I think, so I can confirm that in those circumstances it would be appropriate for the Secretary of State to share information with devolved Administration colleagues.

The shadow Minister has eloquently, as always, touched on the purpose of the various other clauses in this group. I do not wish to try the patience of the Committee, particularly as lunchtime approaches, by repeating what she has ably said already, so I will rest here and simply urge that these clauses stand part of the Bill.

Question put and agreed to.

Clause 97 accordingly ordered to stand part of the Bill.

Clauses 98 to 102 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned. —(Steve Double.)

Online Safety Bill (Ninth sitting)

Debate between Kirsty Blackman and Chris Philp
Chris Philp

My right hon. Friend raises a good question. In fact, I was about to come on to the safeguards that exist to address some of the concerns that have been raised this morning. Let me jump to the fourth of the safeguards, which in many ways is the most powerful and directly addresses my right hon. Friend’s question.

In fact, a change has been made. The hon. Member for Ochil and South Perthshire asked what changes had been made, and one important change—perhaps the change that my hon. Friend the Member for Watford found convincing—was the insertion of a requirement for the codes, following a direction, to go before Parliament and be voted on using the affirmative procedure. That is a change. The Bill previously did not have that in it. We inserted the use of the affirmative procedure to vote on a modified code in order to introduce extra protections that did not exist in the draft of the Bill that the Joint Committee commented on.

I hope my right hon. Friend the Member for Basingstoke will agree that if Ofcom had a concern and made it publicly known, Parliament would be aware of that concern before voting on the revised code using the affirmative procedure. The change to the affirmative procedures gives Parliament extra control. It gives parliamentarians the opportunity to respond if they have concerns, if third parties raise concerns, or if Ofcom itself raises concerns.

Kirsty Blackman (Aberdeen North) (SNP)

Before the Minister moves off the point about exceptional circumstances, it was the case previously that an amendment of the law resolution was always considered with Finance Bills. In recent years, that has stopped on the basis of it being exceptional circumstances because a general election was coming up. Then the Government changed that, and now they never table an amendment of the law resolution because they have decided that that is a minor change. Something has gone from being exceptional to being minor, in the view of this Government.

The Minister said that he envisions that this measure will be used only in exceptional circumstances. Can he commit himself to it being used only in exceptional circumstances? Can he give the commitment that he expects that it will be used only in exceptional circumstances, rather than simply envisioning that it will be used in such circumstances?

Chris Philp

I have made clear how we expect the clause to be used. I am slightly hesitant to be more categorical simply because I do not want to make comments that might unduly bind a future Secretary of State—or, indeed, a future Parliament, because the measure is subject to the affirmative procedure—even were that Secretary of State, heaven forbid, to come from a party other than mine. Circumstances might arise, such as the pandemic, in which a power such as this needs to be exercised for good public policy reasons—in that example, public health. I would not want to be too categorical, which the hon. Lady is inviting me to be, lest I inadvertently circumscribe the ability of a future Parliament or a future Secretary of State to act.

The power is also limited in the sense that, in relation to matters that are not to do with national security or terrorism or CSEA, the power to direct can be exercised only at the point at which the code is submitted to be laid before Parliament. That cannot be done at any point. The power cannot be exercised at a time of the Secretary of State’s choosing. There is one moment, and one moment only, when that power can be exercised.

I also want to make it clear that the power will not allow the Secretary of State to direct Ofcom to require a particular regulated service to take a particular measure. The power relates to the codes of practice; it does not give the power to intrude any further, beyond the code of practice, in the arena of regulated activity.

I understand the points that have been made. We have listened to the Joint Committee, and we have made an important change, which is that to the affirmative procedure. I hope my explanation leaves the Committee feeling that, following that change, this is a reasonable place for clauses 40 and 41 to rest. I respectfully resist amendment 84 and new clause 12, and urge the Committee to allow clauses 40 and 41 to stand part of the Bill.

Question put, That the amendment be made.

--- Later in debate ---
Chris Philp

I can see that that is the most popular thing I have said during the entire session—when you say, “And finally,” in a speech and the crowd cheers, you know you are in trouble.

Regulated user-to-user and search services will have duties to keep records of their risk assessments and the measures they take to comply with their safety duties, whether or not those are the ones recommended in the codes of practice. They must also undertake a children’s access assessment to determine whether children are likely to access their service.

Clause 48 places a duty on Ofcom to produce guidance to assist service providers in complying with those duties. It will help to ensure a consistent approach from service providers, which is essential in maintaining a level playing field. Ofcom will have a duty to consult the Information Commissioner prior to preparing this guidance, as set out in clause 48(2), in order to draw on the expertise of the Information Commissioner’s Office and ensure that the guidance is aligned with wider data protection and privacy regulation.

Question put and agreed to.

Clause 48 accordingly ordered to stand part of the Bill.

Clause 49

“Regulated user-generated content”, “user-generated content”, “news publisher content”

Kirsty Blackman

I beg to move amendment 89, in clause 49, page 45, line 16, leave out subsection (e).

This amendment would remove the exemption for comments below news articles posted online.

--- Later in debate ---
Chris Philp

The hon. Lady raises an important philosophical question that underpins much of the Bill’s architecture. All the measures are intended to strike a balance. Where there are things that are at risk of leading to illegal activity, and things that are harmful to children, we are clamping down hard, but in other areas we are being more proportionate. For example, the legal but harmful to adult duties only apply to category 1 companies, and we are looking at whether that can be extended to other high-risk companies, as we debated earlier. In the earlier provisions that we debated, about “have regard to free speech”, there is a balancing exercise between the safety duties and free speech. A lot of the provisions in the Bill have a sense of balance and proportionality. In some areas, such as child sexual exploitation and abuse, there is no balance. We just want to stop that—end of story. In other areas, such as matters that are legal but harmful and touch on free speech, there is more of a balancing exercise.

In this area of news publisher content, we are again striking a balance. We are saying that the inherent harmfulness of those sites, owing to their functionality—they do not go viral in the same way—is much lower. There is also an interaction with freedom of the press, as I said earlier. Thus, we draw the balance in a slightly different way. To take the example of suicide promotion or self-harm content, there is a big difference between stumbling across something in comment No. 74 below a BBC article, versus the tragic case of Molly Russell—the 14-year-old girl whose Instagram account was actively flooded, many times a day, with awful content promoting suicide. That led her to take her own life.

I think the hon. Member for Batley and Spen would probably accept that there is a functional difference between a comment that someone has to scroll down a long way to find and probably sees only once, and being actively flooded with awful content. In having regard to those different arguments—the risk and the freedom of the press—we try to strike a balance. I accept that they are not easy balances to strike, and that there is a legitimate debate to be had on them. However, that is the reason that we have adopted this approach.

Kirsty Blackman

I have a question on anonymity. On social media there will be a requirement to verify users’ identities, so if somebody posts on Twitter that they want to lynch me, it is possible to find out who that is, provided they do not have an anonymous account. There is no such provision for newspaper comment sections, so I assume it would be much more difficult for the police to find them, or for me to avoid seeing anonymous comments below the line of newspaper articles that threaten my safety—comments that are just as harmful as those that threaten my safety on social media. Can the Minister convince me otherwise?

Chris Philp

The hon. Lady is correct in her analysis, I can confirm. Rather similar to the previous point, because of the interaction with freedom of the press—the argument that the newspapers and broadcasters have advanced—and because this is an inherently less viral environment, we have drawn the balance where we have. She is right to highlight a reasonable risk, but we have struck the balance in the way we have for that reason.

The shadow Minister, the hon. Member for Pontypridd, asked whether very harmful or illegal interactions in the metaverse would be covered or whether they have a metaphorical “get out of jail free” card owing to the exemption in clause 49(2)(d) for “one-to-one live aural communications”. In essence, she is asking whether, in the metaverse, if two users went off somewhere and interacted only with each other, that exemption would apply and they would therefore be outwith the scope of the Bill. I am pleased to tell her they would not, because the definition of live one-to-one aural communications goes from clause 49(2)(d) to clause 49(5), which defines “live aural communications”. Clause 49(5)(c) states that the exemption applies only if it

“is not accompanied by user-generated content of any other description”.

The actions of a physical avatar in the metaverse do constitute user-generated content of any other description. Owing to that fact, the exemption in clause 49(2)(d) would not apply to the metaverse.

I am happy to provide clarification on that. It is a good question and I hope I have provided an example of how, even though the metaverse was not conceived when the Bill was conceived, it does have an effect.

Kirsty Blackman

On that point, when it comes to definition of content, we have tabled an amendment about “any other content”. I am not convinced that the definition of content adequately covers what the Minister stated, because it is limited, does not include every possible scenario where it is user-generated and is not future-proofed enough. When we get to that point, I would appreciate it if the Minister would look at the amendment and ensure that what he intends is what happens.

Chris Philp

I am grateful to the hon. Lady for thinking about that so carefully. I look forward to her amendment. For my information, which clause does her amendment seek to amend?

Kirsty Blackman

I will let the Minister know in a moment.

Chris Philp

I am grateful. It is an important point.

Chris Philp

I thank my hon. Friend for his service on the Joint Committee. I heard the representations of my right hon. Friend the Member for Basingstoke about a Joint Committee, and I have conveyed them to the higher authorities.

Kirsty Blackman

The amendment that the Minister is asking about is to clause 189, which states:

“‘content’ means anything communicated by means of an internet service, whether publicly or privately, including written material or messages, oral communications, photographs, videos, visual images, music and data of any description”.

It is amendment 76 that, after “including”, would insert “but not limited to”, in order that the Bill is as future-proofed as it can be.

--- Later in debate ---
Chris Philp

I thank my right hon. Friend for that intervention. First, clearly if something illegal is said online about someone, they would have the normal redress to go to the police and the police could seek to exercise their powers to investigate the offence, including requesting the company that hosts the comments—in this case, it would be a newspaper’s or broadcaster’s website—to provide any relevant information that might help to identify the person involved; they might have an account, and if they do not they might have a log-on or IP address. So, the normal criminal investigatory procedures would obviously apply.

Secondly, if the content was defamatory—I realise that only people like Arron Banks can sue for libel—there is obviously civil recourse for libel. And I think there are powers in the civil procedure rules that allow for court orders to be made that require organisations, such as news media websites, to disclose information that would help to identify somebody who is a respondent in a civil case.

Thirdly, there are obviously the voluntary steps that the news publisher might take to remove content. News publishers say that they do that; obviously, their implementation, as we know, is patchy. Nevertheless, there is that voluntary route.

Regarding any legal obligation that may fall on the shoulders of the news publisher itself, I am not sure that I have sufficient legal expertise to comment on that. However, I hope that those first three areas of redress that I have set out give my right hon. Friend some assurance on this point.

Finally, I turn to a question asked by the hon. Member for Aberdeen North. She asked whether the exemption for “one-to-one live aural communications”, as set out in clause 49(2)(d), could inadvertently allow grooming or child sexual exploitation to occur via voice messages that accompany games, for example. The exemption is designed to cover what are essentially phone calls such as Skype conversations—one-to-one conversations that are essentially low-risk.

We believe that the Bill contains other duties to ensure that services are designed to reduce the risk of grooming and to address risks to children, if those risks exist, such as on gaming sites. I would be happy to come back to the hon. Lady with a better analysis and explanation of where those duties sit in the Bill, but there are very strong duties elsewhere in the Bill that impose those obligations to conduct risk assessments and to keep children safe in general. Indeed, the very strongest provisions in the Bill are around stopping child sexual exploitation and abuse, as set out in schedule 6.

Finally, there is a power in clause 174(1) that allows us, as parliamentarians and the Government, to repeal this exemption using secondary legislation. So, if we found in the future that this exemption caused a problem, we could remove it by passing secondary legislation.

Kirsty Blackman

That is helpful for understanding the rationale, but in the light of how people communicate online these days, although exempting telephone conversations makes sense, exempting what I am talking about does not. I would appreciate it if the Minister came back to me on that, and he does not have to give me an answer now. It would also help if he explained the difference between “aural” and “oral”, which are mentioned at different points in the Bill.

Chris Philp

I will certainly come back with a more complete analysis of the point about protecting children—as parents, that clearly concerns us both. The literal definitions are that “aural” means “heard” and “oral” means “spoken”. They occur in different places in the Bill.

This is a difficult issue and legitimate questions have been raised, but as I said in response to the hon. Member for Batley and Spen, in this area as in others, there are balances to strike and different considerations at play—freedom of the press on the one hand, and the level of risk on the other. I think that the clause strikes that balance in an appropriate way.

Question put, That the amendment be made.

Online Safety Bill (Tenth sitting)

Debate between Kirsty Blackman and Chris Philp
Committee stage
Tuesday 14th June 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 14 June 2022
Kirsty Blackman (Aberdeen North) (SNP)

I thank the Minister for tabling the amendments. In the evidence sessions, we heard about omissions from schedule 7, which did not include Northern Irish and Scottish offences. Such offences were included in schedule 6 but, at that point, not in schedule 7.

I appreciate that the Minister has worked with the devolved Administrations to table the amendments. I also appreciate the way in which amendment 126 is written, such that the Secretary of State “must consult” Scottish Ministers and the Department of Justice in Northern Ireland before making regulations that relate to legislation in either of the devolved countries. I am glad that the amendments have been drafted in this way and that the concern that we heard about in evidence no longer seems to exist, and I am pleased with the Minister’s decision about the way in which to make any future changes to legislation.

I agree with the position put forward by the hon. Member for Pontypridd. My understanding, from what we heard in evidence a few weeks ago, is that, legally, all will have to agree with the higher bar of the offences, and therefore anyone anywhere across the UK will be provided with the additional level of protection. She is right that the offence might not apply to everyone, but the service providers will be subject to the requirements elsewhere. Similarly, that is my view. Once again, I thank the Minister.

Chris Philp

Briefly, I hope that the amendments provide further evidence to the Committee of the Government’s willingness to listen and to respond. I can provide the confirmation that the hon. Members for Aberdeen North and for Pontypridd requested: the effect of the clauses is a levelling up—if I may put it that way. Any of the offences listed effectively get applied to the UK internet, so if there is a stronger offence in any one part of the United Kingdom, that will become applicable more generally via the Bill. As such, the answer to the question is in the affirmative.

Amendment 116 agreed to.

--- Later in debate ---
Chris Philp

The first thing to make clear to the Committee and anyone listening is that, of course, offences under the Modern Slavery Act 2015 are brought into the scope of the illegal content duties of this Bill through clause 52(4)(d), because such offences involve an individual victim.

Turning to the priority offences set out in schedule 7—I saw this when I was a Home Office Minister—modern slavery is generally associated with various other offences that are more directly visible and identifiable. Modern slavery itself can be quite hard to identify. That is why our approach is, first, to incorporate modern slavery as a regular offence via clause 52(4)(d) and, secondly, to specify as priority offences those things that are often identifiable symptoms of it and that are feasibly identified. Those include many of the offences listed in schedule 7, such as causing, inciting or controlling prostitution for gain, as in paragraph 16 on sexual exploitation, which is often the manifestation of modern slavery; money laundering, which is often involved where modern slavery takes place; and assisting illegal immigration, because modern slavery often involves moving somebody across a border, which is covered in paragraph 15 on assisting illegal immigration, as per section 25 of the Immigration Act 1971.

Modern slavery comes into scope directly via clause 52(4)(d) and because the practicably identifiable consequences of modern slavery are listed as priority offences, I think we do have this important area covered.

Kirsty Blackman

I appreciate that the Minister thinks that there are other measures that cover this offence, but will he keep it under consideration going forward? I do not think that that is too much to ask. Part of the logic behind that is that some of the other issues, where the reasons behind them must be proved, are much more difficult to define or prove than the modern slavery offences that we are asking to be added here. Whether he accepts the amendment or not, will he commit to considering the matter and not just saying, “Absolutely no”? That would be helpful for us and the many organisations that are keen for such things to be included.

Chris Philp

I am happy to give that further consideration, but please do not interpret that as a firm commitment. I repeat that the Modern Slavery Act is brought into the scope of this Bill via clause 52(4)(d).

--- Later in debate ---
Kirsty Blackman

I have a couple of questions for the Minister. The first is about the interaction of subsection (4)(c) and subsection (5). I am slightly confused about how those interact, because subsection (4)(c) states that anything that is not within the terms of primary priority content or primary content but is harmful to

“an appreciable number of children”

is included as

“content that is harmful to children”.

That is completely reasonable. However, subsection (5) excludes illegal content and content with a “potential financial impact”. I appreciate that these provisions are drafted in quite a complicated way, but it would be useful to have an understanding of what that means. If it means that content cannot be treated as harmful because it is financial in nature, that is a problem, because it would explicitly exclude gambling-type sites, loot boxes and anything of that sort, which by their nature are intentionally addictive and try to get children or adults to part with significant amounts of cash. If they are excluded, that is a problem.

How will clause 53 be future-proofed? I am not suggesting that there is no future proofing, but it would be helpful to me and fellow Committee members if the Minister explained how the clause will deal with new emerging harms and things that may not necessarily fall within the definitions that we set initially. How will those definitions evolve and change as the internet evolves and changes, and as the harms with which children are presented evolve and change?

And finally—I know that the Minister mentioned earlier that saying, “And finally”, in a speech is always a concern, but I am saying it—I am slightly concerned about the wording in subsection (4)(c), which refers to

“material risk of significant harm to an appreciable number of children”,

because I am not clear what an “appreciable number” is. If there is significant harm to one child from content, and content that is incredibly harmful to children is stumbled upon by a child, is it okay for that provider to have such content? It is not likely to be accessed by an “appreciable number of children” and might be accessed by only a small number, but if the Minister could give us an understanding of what the word “appreciable” means in that instance, that would be greatly appreciated.

Chris Philp

There are one or two points to pick up on. A question was raised about algorithms, and it is worth saying that the risk assessments that platforms must undertake will include consideration of the operation of algorithms. It is important to make it absolutely clear that that is the case.

The shadow Minister asked about the definition of harm, and whether all the harms that might concern Parliament, and many of us as parents, will be covered. It may be helpful to refer to definition of harm provided in clause 187, at the top of page 153. Committee members will note that the definition is very wide and that subsection (2) defines it as “physical or psychological harm”, so I hope that partly answers the shadow Minister’s question.

--- Later in debate ---
Kirsty Blackman

However, I do not think that loot boxes even existed in 2005 when that Act was published. Loot boxes are gambling. They may not be covered by that legislation, but they are gambling. Will the Minister consider whether those harms are unintentionally excluded by clause 53?

Chris Philp

We are getting into some detail here. In the unlikely event that any member of the Committee does not know what a loot box is, it is where someone playing a game can buy extra lives or enhance the game’s functionality somehow by paying some money. There have been some cases where children have stolen their parent’s credit card and bought these things in large numbers.

Kirsty Blackman

Having played lots of games, I can clarify that people do not know what they are getting with a loot box, so they are putting money forward but do not know whether they will get a really good piece of armour or a really crap piece of armour. It is literally gambling, because children do not know what will come out of the box, as opposed to just buying a really good piece of armour with £2.99 from their parent’s credit card.

Chris Philp

However, the reward is non-monetary in nature. For that reason, the Government’s view—if I can test your patience momentarily, Sir Roger, as we are straying somewhat outside this particular debate—is that loot boxes will not be covered by the gambling review, because we do not see them as gambling. However, we do see them as an issue that needs to be addressed, and that will happen via the online advertising programme, which will be overseen by the Minister for Media, Data and Digital Infrastructure, my hon. Friend the Member for Hornchurch and Upminster (Julia Lopez). That will happen shortly and advertising legislation will follow, so loot boxes will be addressed in the online advertising programme and the subsequent legislation.

The other question raised by the hon. Member for Aberdeen North was about the definition of “an appreciable number”. I have a couple of points to make. By definition, anything that is illegal is covered already in schedule 7 or through clause 52(4)(d), which we have mentioned a few times. Content that is

“primary priority content that is harmful to children”

or

“priority content that is harmful to children”

is covered in clause 53(4)(a) and (b), so we are now left with the residue of content that is neither illegal nor primary priority content; it is anything left over that might be harmful. By definition, we have excluded all the serious harms already, because they would be either illegal or in the priority categories. We are left with the other material. The reason for the qualifier "appreciable" is to make sure that we are dealing only with the residual, non-priority harmful matters, and that the duty is reasonable. What constitutes "appreciable" will ultimately be set out in Ofcom guidance, but if content affected only a tiny handful of users and was not a priority harm—and was therefore not considered by Parliament to be of the utmost priority—the duty would be unlikely to apply to such a very small number. Because it is just the residual category, that is a proportionate and reasonable approach to take.

--- Later in debate ---
Kirsty Blackman:

Given the Government’s ability to designate priority content and primary priority content through secondary legislation, the Minister is telling me that if they decided that loot boxes were not adequately covered by the future legislation coming through, and they were to discover that something like this was a big issue, they could add that to one of the two priority content designations.

Chris Philp:

The hon. Member is asking me a somewhat technical question, and I hesitate to answer without taking full advice, but I think the answer is yes. The reason that loot boxes are not considered gambling in our view is that they do not have a monetary value, so the exclusion in clause 53(5)(b)(i) does not apply. On a quick off-the-cuff reading, it does not strike me immediately that the exclusions in (5)(b)(ii) or (iii) would apply to loot boxes either, so I believe—and officials who know more about this than I do are nodding—that the hon. Lady is right to say that it would be possible for loot boxes to become primary priority content or priority content by way of a statutory instrument. Yes, my belief is that that would be possible.

Question put and agreed to.

Clause 53 accordingly ordered to stand part of the Bill.

Clause 54

“Content that is harmful to children” etc

--- Later in debate ---
Kirsty Blackman:

I appreciate that clarification. I just wanted to make it absolutely clear that I strongly believe anonymity is a very good protection, not just for people who intend to do harm on the internet, but particularly for people who are seeking out community. I think that is important.

If you will allow me to say a couple of things about the next clause, Sir Roger, Mencap raised the issue of vulnerable users, specifically vulnerable adult users, in relation to the form of identity verification. If the Minister or Ofcom could give consideration to perhaps including travel passes or adult passes, it might make the internet a much easier place to navigate for people who do not have control of their own documentation—they may not have access to their passport, birth certificate, or any of that sort of thing—but who would be able to provide a travel pass, because that is within their ownership.

Chris Philp:

We have heard quite a lot about the merits of clause 57, and I am grateful to colleagues on both sides for pointing those out. The hon. Member for Pontypridd asked about the effectiveness of the user identity verification processes and how they might operate—whether verification would be done individually by each company for its own users, or whether a third-party industry would develop even further, providing verification that could then be used across a whole number of companies.

Some of those processes exist already in relation to age verification, and I think that some companies are already active in this area. I do not think that it would be appropriate for us, in Parliament, to specify those sorts of details. It is ultimately for Ofcom to issue that guidance under clause 58, and it is, in a sense, up to the market and to users to develop their own preferences. If individual users prefer to verify their identity once and then have that used across multiple platforms, that will itself drive the market. I think that there is every possibility that that will happen. [Interruption.]

--- Later in debate ---
Chris Philp:

Clause 62 creates an offence, as we discussed earlier, of knowingly or recklessly providing inaccurate information to the NCA in relation to CSEA reporting, the penalty for which is imprisonment, a fine or both. Where a company seeks to evade its responsibility, or disregards the importance of the requirement to report CSEA by providing inaccurate information, it will be liable for prosecution. We are backing the requirement to report CSEA with significant criminal powers.

Clause 63 provides definitions for the terms used in chapter 2 of part 4, in relation to the requirement to report CSEA. In summary, a UK provider of a regulated service is defined as a provider that is

“incorporated or formed under the law of any part of the United Kingdom”

or where the provider is

“individuals who are habitually resident in the United Kingdom”.

The shadow Minister asked about the test and what counts, and I hope that provides the answer. We are defining CSEA content as content that a company becomes aware of containing CSEA. A company can become aware of that by any means, including through the use of automated systems and processes, human moderation or user reporting.

With regard to the definition of UK-linked CSEA, which the shadow Minister also asked about, that refers to content that may have been published and shared in the UK, or where the nationality or location of a suspected offender or victim is in the UK. The definition of what counts as a UK link is quite wide, because it includes not only the location of the offender or victim but where the content is shared. That is a wide definition.

Kirsty Blackman:

I have a specific question—the Minister answered a similar question from me earlier. The Bill says that the location of the child “is” in the UK. Would it be reasonable to expect that if a company suspected the child “was” in the UK, although not currently, that would be in scope as something required to be reported? I know that is technical, but if the “was” is included in the “is” then that is much wider and more helpful than just including the current location.

Chris Philp:

If the child had been in the UK when the offence was committed, that would ordinarily be subject to UK criminal law, because the crime would have been committed in the UK. The test is: where was the child or victim at the time the offence was committed? As I said a moment ago, however, the definition of “UK-linked” is particularly wide and includes

“the place where the content was published, generated, uploaded or shared.”

The word “generated”—I am reading from clause 63(6)(a), at the top of page 56—is clearly in the past tense and would include the circumstance that the hon. Lady described.

--- Later in debate ---
Kirsty Blackman:

I associate myself with the comments made by the hon. Member for Pontypridd and apologise on behalf of my hon. Friend the Member for Ochil and South Perthshire, who is currently in the Chamber dealing with the Channel 4 privatisation. I am sure that, given his position on the Joint Committee, he would have liked to comment on the clause and would have welcomed its inclusion in the Bill, but, unfortunately, he cannot currently do so.

Chris Philp:

It is a great shame that the hon. Member for Ochil and South Perthshire is occupied in the main Chamber, because I could have pointed to this change as one of the examples of the Government listening to the Joint Committee, on which he and many others served. However, I hope that the hon. Member for Aberdeen North will communicate my observation to him, which I am sure he will appreciate.

In seriousness, this is an example of the Government moving the Bill on in response to widespread parliamentary and public commentary. It is right that we extend the duties to cover commercial pornographic content as well as the user-to-user pornography covered previously. I thank the Opposition parties for their support for the inclusion of those measures.

Online Safety Bill (Seventh sitting)

Debate between Kirsty Blackman and Chris Philp
Kirsty Blackman:

I want to raise an additional point about content reporting and complaints procedures. I met with representatives of Mencap yesterday, who raised the issue of the accessibility of the procedures that are in place. I appreciate that the Bill talks about procedures being accessible, but will the Minister give us some comfort about Ofcom looking at the reporting procedures that are in place, to ensure that adults with learning disabilities in particular can access those content reporting and complaints procedures, understand them and easily find them on sites?

That is a specific concern that Mencap raised on behalf of its members. A number of its members will be users of sites such as Facebook, but may find it more difficult than others to access and understand the procedures that are in place. I appreciate that, through the Bill, the Minister is making an attempt to ensure that those procedures are accessible, but I want to make sure they are accessible not just for the general public but for children, who may need jargon-free access to content reporting and complaints procedures, and for people with learning disabilities, who may similarly need jargon-free, easy-to-understand and easy-to-find access to those procedures.

Chris Philp:

Let me try to address some of the questions that have been raised in this short debate, starting with the question that the hon. Member for Aberdeen North quite rightly asked at the beginning. She posed the question, “What if somebody who is not an affected person encountered some content and wanted to report it?” For example, she might encounter some racist content on Twitter or elsewhere and would want to be able to report it, even though she is not herself the target of it or necessarily a member of the group affected. I can also offer the reassurance that my hon. Friend the Member for Wolverhampton North East asked for.

The answer is to be found in clause 17(2), which refers to

“A duty to operate a service using systems and processes that allow users and”—

I stress “and”—“affected persons”. As such, the duty to offer content reporting is to users and affected persons, so if the hon. Member for Aberdeen North was a user of Twitter but was not herself an affected person, she would still be able to report content in her capacity as a user. I hope that provides clarification.

Kirsty Blackman:

I appreciate that. That is key, and I am glad that this is wider than just users of the site. However, taking Reddit as an example, I am not signed up to that site, but I could easily stumble across content on it that was racist in nature. This clause would mean that I could not report that content unless I signed up to Reddit, because I would not be an affected person or a user of that site.

Chris Philp:

I thank the hon. Lady for her clarificatory question. I can confirm that in order to be a user of a service, she would not necessarily have to sign up to it. The simple act of browsing that service, of looking at Reddit—not, I confess, an activity that I participate in regularly—regardless of whether or not the hon. Lady has an account with it, makes her a user of that service, and in that capacity she would be able to make a content report under clause 17(2) even if she were not an affected person. I hope that clears up the question in a definitive manner.

The hon. Lady asked in her second speech about the accessibility of the complaints procedure for children. That is strictly a matter for clause 18, which is the next clause, but I will quickly answer her question. Clause 18 contains provisions that explicitly require the complaints process to be accessible. Subsection (2)(c) states that the complaints procedure has to be

“easy to access, easy to use (including by children) and transparent”,

so the statutory obligation that she requested is there in clause 18.

Kirsty Blackman:

Can the Minister explain the logic in having that phrasing for the complaints procedure but not for the content-reporting procedure? Surely it would also make sense for the content reporting procedure to use the phrasing

“easy to access, easy to use (including by children) and transparent.”

Chris Philp:

There is in clause 17(2)

“a duty to operate a service that allows users and affected persons to easily report content which they consider to be content of a…kind specified below”,

which, of course, includes services likely to be accessed by children, under subsection (4). The words “easily report” are present in clause 17(2).

I will move on to the question of children reporting more generally, which the shadow Minister raised as well. Clearly, a parent or anyone with responsibility for a child has the ability to make a report, but it is also worth mentioning the power in clauses 140 to 142 to make super-complaints, which the NSPCC strongly welcomed in its evidence. An organisation that represents a particular group—an obvious example is the NSPCC representing children, but it would apply to many other groups—has the ability to make super-complaints to Ofcom on behalf of those users, if it feels they are not being well treated by a platform. A combination of the parent or carer being able to make individual complaints, and the super-complaint facility, means that the points raised by Members are catered for. I commend the clause to the Committee.

Question put and agreed to.

Clause 17 accordingly ordered to stand part of the Bill.

Clause 18

Duties about complaints procedures

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Chris Philp:

Let me address some of the issues raised in the debate. First, everyone in the House recognises the enormous problem at the moment with large social media firms receiving reports about harmful and even illegal content that they just flagrantly ignore. The purpose of the clause, and indeed of the whole Bill and its enforcement architecture, is to ensure that those large social media firms no longer ignore illegal and harmful content when they are notified about it. We agree unanimously on the importance of doing that.

The requirement for those firms to take the proper steps is set out in clause 18(2)(b), at the very top of page 18—it is rather depressing that we are only on the 18th of a couple of hundred pages. That paragraph creates a statutory duty for a social media platform to take "appropriate action"—those are the key words. If the platform is notified of a piece of illegal content, content that is harmful to children, or content that it should take down under its own terms and conditions as harmful to adults, then it must act. If it fails to do so, Ofcom will have the enforcement powers available to compel compliance—ultimately escalating to a fine of up to 10% of global revenue or even service disconnection.

Kirsty Blackman:

Will the Minister give way?

--- Later in debate ---
Chris Philp:

I should give way to the hon. Member for Aberdeen North first, and then I will come to the shadow Minister.

Kirsty Blackman:

I wanted to ask specifically about the resourcing of Ofcom, given the abilities that it will have under this clause. Will Ofcom have enough resource to be able to be that secondary line of defence?

Chris Philp:

A later clause gives Ofcom the ability to levy the fees and charges it sees as necessary and appropriate to ensure that it can deliver the duties. Ofcom will have the power to set those fees at a level to enable it to do its job properly, as Parliament would wish it to do.

--- Later in debate ---
Chris Philp:

I thank the hon. Lady for her thoughtful intervention. There are two separate questions here. One is about user advocacy groups helping individuals to make complaints to the companies. That is a fair point, and no doubt we will debate it later. The ombudsman question is different; it is about whether to have a right of appeal against decisions by social media companies. Our answer is that, rather than having a third-party body—an ombudsman—effectively acting as a court of appeal against individual decisions by the social media firms, because of the scale of the matter, the solution is to compel the firms, using the force of law, to get this right on a systemic and comprehensive basis.

Chris Philp:

I give way first to the hon. Member for Aberdeen North—I think she was first on her feet—and then I will come to the hon. Member for Pontypridd.

Kirsty Blackman:

Does the Minister not think this is going to work? He is creating this systems and processes approach, which he suggests will reduce the thousands of complaints—complaints will be made and complaints procedures will be followed. Surely, if it is going to work, in 10 years’ time we are going to need an ombudsman to adjudicate on the individual complaints that go wrong. If this works in the way he suggests, we will not have tens of millions of complaints, as we do now, but an ombudsman would provide individual redress. I get what he is arguing, but I do not know why he is not arguing for both things, because having both would provide the very best level of support.

Chris Philp:

I will address the review clause now, since it is relevant. If, in due course, as I hope and expect, the Bill has the desired effect, perhaps that would be the moment to consider the case for an ombudsman. The critical step is to take a systemic approach, which the Bill is doing. That engages the question of new clause 1, which would create a mechanism, probably for the reason the hon. Lady just set out, to review how things are going and to see if, in due course, there is a case for an ombudsman, once we see how the Bill unfolds in practice.

--- Later in debate ---
Kirsty Blackman:

I am pleased that the clause is in the Bill, and I think it is a good one to include. Can the Minister reaffirm what he said on Tuesday about child sexual abuse, and the fact that the right to privacy does not trump the ability—particularly with artificial intelligence—to search for child sexual abuse images?

Chris Philp:

I confirm what the hon. Lady has just said. In response to the hon. Member for Worsley and Eccles South, it is important to say that the duty in clause 19 is “to have regard”, which simply means that a balancing exercise must be performed. It is not determinative; it is not as if the rights in the clause trump everything else. They simply have to be taken into account when making decisions.

To repeat what we discussed on Tuesday, I can explicitly and absolutely confirm to the hon. Member for Aberdeen North that in my view and the Government’s, concerns about freedom of expression or privacy should not trump platforms’ ability to scan for child sexual exploitation and abuse images or protect children. It is our view that there is nothing more important than protecting children from exploitation and sexual abuse.

We may discuss this further when we come to clause 103, which develops the theme a little. It is also worth saying that Ofcom will be able to look at the risk assessments and, if it feels that they are not of an adequate standard, take that up with the companies concerned. We should recognise that the duty to have regard to freedom of expression is not something that currently exists. It is a significant step forward, in my view, and I commend clauses 19 and 29 to the Committee.

--- Later in debate ---
Kirsty Blackman:

I absolutely agree. In fact, I have tabled an amendment to widen category 1 to include sites with the highest risk of harm. The Minister has not said that he agrees with my amendment specifically, but he seems fairly amenable to increasing and widening some duties to include the sites of highest risk. I have also tabled another new clause on similar issues.

I am glad that these clauses are in the Bill—a specific duty in relation to children is important and should happen—but as the shadow Minister said, clause 31(3) is causing difficulty. It is causing difficulty for me and for organisations such as the NSPCC, which is unsure how the provisions will operate and whether they will do so in the way that the Government would like.

I hope the Minister will answer some of our questions when he responds. If he is not willing to accept the amendment, will he give consideration to how the subsection could be amended in future—we have more stages, including Report and scrutiny in the other place—to ensure that there is clarity and that the intention is followed through, rather than remaining an intention that is not actually translated into law?

Chris Philp:

Colleagues have spoken eloquently to the purpose and effect of the various clauses and schedule 3 —the stand part component of this group. On schedule 3, the shadow Minister, the hon. Member for Worsley and Eccles South, asked about timing. The Government share her desire to get this done as quickly as possible. In its evidence a couple of weeks ago, Ofcom said it would be publishing its road map before the summer, which would set out the timetable for moving all this forward. We agree that that is extremely important.

I turn to one or two questions that arose on amendment 22. As always, the hon. Member for Aberdeen North asked a number of very good questions. The first was whether the concept of a "significant number" applies to a number in absolute terms or to a percentage of the people using a particular service. The answer is that it can be either: a large number in absolute terms, by reference to the population of the whole United Kingdom, or a percentage of those using the service. That is expressed in clause 31(4)(a)—Members will note the "or" there. I hope that answers the hon. Member's very good question.

Kirsty Blackman:

My concern is about services that meet neither of those criteria: they do not meet the "significant number" criterion in percentage terms because, say, only 0.05% of their users are children, and they do not meet it in absolute terms because they are a pretty small platform with, say, only 1,000 child users. The children who do use such a platform may nevertheless be at very high risk because of the nature of the platform or the service provided. My concern is for those at highest risk, where neither criterion is met and the service does not have to bother with any sort of age verification or access requirements.

Chris Philp:

I am concerned to ensure that children are appropriately protected, as the hon. Lady sets out. Let me make a couple of points in that area before I address that point.

The hon. Lady asked another question earlier, about video content. She gave the example of TikTok videos being viewed or accessed not directly on TikTok but via some third-party means, such as a WhatsApp message. First, it is worth emphasising again that in order to count as a user, a person does not have to be registered and can simply be viewing the content. Secondly, if someone is viewing something through another service, such as WhatsApp—the hon. Lady used the example of browsing the internet on another site—the duty will bite at the level of WhatsApp, and it will have to consider the content that it is providing access to. As I said, someone does not have to be registered with a service in order to count as a user of that service.

On amendment 22, there is a drafting deficiency, if I may put it politely—this is a point of drafting rather than of principle. The amendment would simply delete subsection (3), but there would still be references to the “child user condition”—for example, the one that appears on the same page of the Bill at line 11. If the amendment were adopted as drafted, it would end up leaving references to “child user condition” in the Bill without defining what it meant, because we would have deleted the definition.

Online Safety Bill (Eighth sitting)

Debate between Kirsty Blackman and Chris Philp
Committee stage
Thursday 9th June 2022

Public Bill Committees
Kirsty Blackman:

I thank my hon. Friend for his public service announcement. His constituent is incredibly lucky that my hon. Friend managed to act in that way and get the money back to her, because there are so many stories of people not managing to get their money back and losing their entire life savings as a result of scams. It is the case that not all those scams take place online—people can find scams in many other places—but we have the opportunity with the Bill to take action on scams that are found on the internet.

The other group I want to mention, and for whom highlighting advertising could make a positive difference, is people with learning disabilities. People with learning disabilities who use the internet may not understand the difference between adverts and search results, as the hon. Member for Worsley and Eccles South mentioned. They are a group who I would suggest are particularly susceptible to fraudulent advertising.

We are speaking a lot about search engines, but a lot of fraudulent advertising takes place on Facebook and so on. Compared with the majority of internet users, there is generally an older population on such sites, and the ability to tackle fraudulent advertising there is incredibly useful. We know that the sites can do it, because there are rules in place now around political advertising on Facebook, for example. We know that it is possible for them to take action; it is just that they have not yet taken proper action.

I am happy to support the amendments, but I am also glad that the Minister has put these measures in the Bill, because they will make a difference to so many of our constituents.

Chris Philp:

I thank the hon. Member for Aberdeen North for her latter remarks. We made an important addition to the Bill after listening to parliamentarians across the House and to the Joint Committee, which many people served on with distinction. I am delighted that we have been able to make that significant move. We have heard a lot about how fraudulent advertising can affect people terribly, particularly more vulnerable people, so that is an important addition.

Amendments 23 and 24 seek to make it clear that where the target is in the UK, people are covered. I am happy to assure the Committee that that is already covered, because the definitions at the beginning of the Bill—going back to clause 3(5)(b), on page 3—make it clear that companies are in scope, both user-to-user and search, if there is a significant number of UK users or where UK users form one of the target markets, or the only target market. Given the reference to "target markets" in the definitions, I hope that the shadow Minister will withdraw the amendment, because the matter is already covered in the Bill.

New clause 5 raises important points about the regulation of online advertising, but that is outside the purview of what the Bill is trying to achieve. The Government are going to work through the online advertising programme to tackle these sorts of issues, which are important. The shadow Minister is right to raise them, but they will be tackled holistically by the online advertising programme, and of course there are already codes of practice that apply and are overseen by the Advertising Standards Authority. Although these matters are very important and I agree with the points that she makes, there are other places where those are best addressed.

New clause 6 is about the verification process. Given that the Bill is primary legislation, we want to have the core duty to prevent fraudulent advertising in the Bill. How that is implemented in this area, as in many others, is best left to Ofcom and its codes of practice. When Ofcom publishes the codes of practice, it might consider such a duty, but we would rather leave Ofcom, as the expert regulator, with the flexibility to implement that via the codes of practice and leave the hard-edged duty in the Bill as drafted.

--- Later in debate ---
Kirsty Blackman:

I agree 100%. The case that the shadow Minister, the hon. Member for Pontypridd, made and the stories she highlighted about the shame that is felt show that we are not just talking about a one-off impact on people’s lives, but potentially years of going through those awful situations and then many years to recover, if they ever do, from the situations they have been through.

I do not think there is too much that we could do, too many codes of practice we could require or too many compliance measures we could put in place. I also agree that girls are the most vulnerable group when considering this issue, and we need to ensure that the Bill is as fit for purpose as it can be and meets the Government's aim of making the internet a safe place for children and young people. Because of the additional risks for girls in particular, we need additional protections in place for them. That is why a number of us in this room are making that case.

Chris Philp:

This has been an important debate. I think there is unanimity on the objectives we are seeking to achieve, particularly protecting children from the risk of child sexual exploitation and abuse. As we have discussed two or three times already, we cannot allow end-to-end encryption to frustrate or prevent the protection of children.

I will talk about two or three of the issues that have arisen in the course of the debate. The first is new clause 20, a proposal requiring Ofcom to put together a report. I do not think that is strictly necessary, because the Bill already imposes a requirement to identify, assess and mitigate CSEA. There is no optionality here and no need to think about it; there is already a demand to prevent CSEA content, and Ofcom has to produce codes of practice explaining how it will do that. I think what is requested in new clause 20 is required already.

The hon. Member for Pontypridd mentioned the concern that Ofcom would first have to prove that a CSEA risk existed. I think that might be a hangover from the previous draft of the Bill, where there was a requirement for the evidence to be “persistent and prevalent”—I think that was the phrase—which implied that Ofcom had to prove the risk existed before it could take action against it. For exactly the reason she mentioned, namely that it imposed a requirement to prove CSEA is there, we have changed the wording in the new version. Clause 103(1), at the top of page 87, now states “necessary and proportionate” instead of “persistent and prevalent”. Therefore, if Ofcom simply considers something necessary, without needing to prove that it is persistent and prevalent, it can take the actions set out in that clause. For the reason that she mentioned, the change has already been made.

Online Safety Bill (Fifth sitting)

Debate between Kirsty Blackman and Chris Philp
Kirsty Blackman:

Will the Minister give way?

Chris Philp:

In a moment.

For those reasons, I think we have drawn the line in the right place. There is personal criminal liability for failures of information provision, with fines of 10% of global revenue and service disruption—unplugging powers—as well. Having thought about it quite carefully, I think we have struck the balance in the right place. We do not want to deter people from offering services in the UK; if they worried that they might too readily go to prison, it might deter them from locating here. I fully recognise that there is a balance to strike, and I feel that it is being struck in the right place.

I will go on to comment on a couple of examples we heard about Carillion and the financial crisis, but before I do so, I will give way as promised.

Kirsty Blackman:

I appreciate that the Minister says he has been swithering on this point—he has been trying to work out the correct place to draw the line. Given that we do not yet have a commitment for a standing committee—again, that is potentially being considered—we do not know how the legislation is going to work. Will the Minister, rather than accepting the amendment, give consideration to including the ability to make changes via secondary legislation so that there is individual criminal liability for different breaches? That would allow him the flexibility in the future, if the regime is not working appropriately, to add through secondary legislation individual criminal liability for breaches beyond those that are currently covered.

Chris Philp:

I have not heard that idea suggested. I will think about it. I do not want to respond off the cuff, but I will give consideration to the proposal. Henry VIII powers, which are essentially what the hon. Lady is describing—an ability through secondary legislation effectively to change primary legislation—are obviously viewed askance by some colleagues if too wide in scope. We do use them, of course, but normally in relatively limited circumstances. Creating a brand new criminal offence via what amounts to a Henry VIII power would be quite a wide application of the power, but it is an idea that I am perfectly happy to go away and reflect on. I thank her for mentioning the idea.

A couple of examples were given about companies that have failed in the past. Carillion was not a financial services company and there was no regulatory oversight of the company at all. In relation to financial services regulation, despite the much stricter regulation that existed in the run-up to the 2008 financial crisis, that crisis occurred none the less. [Interruption.] We were not in government at the time. We should be clear-eyed about the limits of what regulation alone can deliver, but that does not deter us from taking the steps we are taking here, which I think are extremely potent, for all the reasons that I mentioned and will not repeat.

Question put, That the amendment be made.

--- Later in debate ---
Chris Philp:

All the companies have to do the risk assessment, for example for the “illegal” duties, where they are required to by the Bill. For the “illegal” duties, that is all of them; they have to do those risk assessments. The question is whether they have to send them to Ofcom—all of them—even if they are very low risk or have very low user numbers, and whether Ofcom, by implication, then has to consider them, because it would be pointless to require them to be sent if they were not then looked at. We want to ensure that Ofcom’s resources are pointed at the areas where the risks arise. Ofcom can request any of these. If Ofcom is concerned—even a bit concerned—it can request them.

Hon. Members are then making a slightly adjacent point about transparency—about whether the risk assessments should be made, essentially, publicly available. In relation to comprehensive public disclosure, there are legitimate questions about public disclosure and about getting to the heart of what is going on in these companies in the way in which Frances Haugen’s whistleblower disclosures did. But we also need to be mindful of what we might call malign actors—people who are trying to circumvent the provisions of the Bill—in relation to some of the “illegal” provisions, for example. We do not want to give them so much information that they know how they can circumvent the rules. Again, there is a balance to strike between ensuring that the rules are properly enforced and having such a high level of disclosure that people seeking to circumvent the rules are able to work out how to do so.

Kirsty Blackman:

If the rules are so bad that people can circumvent them, they are not good enough anyway and they need to be updated, but I have a specific question on this. The Minister says that Ofcom will be taking in the biggest risk assessments, looking at them and ensuring that they are adequate. Will he please give consideration to asking Ofcom to publish the risk assessments from the very biggest platforms? Then they will all be in one place. They will be easy for people to find and people will not have to rake about in the bottom sections of a website. And it will apply only in the case of the very biggest, most at risk platforms, which should be regularly updating their risk assessments and changing their processes on a very regular basis in order to ensure that people are kept safe.

Chris Philp:

I thank the hon. Lady for her intervention and for the—

Online Safety Bill (Sixth sitting)

Debate between Kirsty Blackman and Chris Philp
Chris Philp:

I thank my right hon. Friend for raising that. The risk assessments and, indeed, the duties arising under this Bill all apply to systems and processes—setting up systems and processes that are designed to protect people and to prevent harmful and illegal content from being encountered. We cannot specify in legislation every type of harmful content that might be encountered. This is about systems and processes. We heard the Chairman of the Joint Committee on the draft Online Safety Bill, our hon. Friend the Member for Folkestone and Hythe (Damian Collins), confirm to the House on Second Reading his belief—his accurate belief—that the Bill takes a systems-and-processes approach. We heard some witnesses saying that as well. The whole point of this Bill is that it is tech-agnostic—to future-proof it, as hon. Members mentioned this morning—and it is based on systems and processes. That is the core architecture of the legislation that we are debating.

Amendments 25 and 26 seek to ensure that user-to-user services assess and mitigate the risk of illegal content being produced via functions of the service. That is covered, as it should be—the Opposition are quite right to raise the point—by the illegal content risk assessment and safety duties in clauses 8 and 9. Specifically, clause 8(5)(d), on page 7 of the Bill—goodness, we are only on page 7 and we have been going for over half a day already—requires services to risk-assess functionalities of their service being used to facilitate the presence of illegal content. I stress the word “presence” in clause 8(5)(d). Where illegal content is produced by a functionality of the service—for example, by being livestreamed—that content will be present on the service and companies must mitigate that risk. The objective that the Opposition are seeking to achieve, and with which we completely agree, is covered in clause 8(5)(d) by the word “presence”. If the content is present, it is covered by that section.

Kirsty Blackman:

Specifically on that, I understand the point the hon. Gentleman is making and appreciate his clarification. However, on something such as Snapchat, if somebody takes a photo, it is sent to somebody else, then disappears immediately, because that is what Snapchat does—the photo is no longer present. It has been produced and created there, but it is not present on the platform. Can the Minister consider whether the Bill adequately covers all the instances he hopes are covered?

Chris Philp:

The hon. Lady raises an interesting point about time. However, clause 8(5)(d) uses the wording,

“the level of risk of functionalities of the service facilitating the presence or dissemination of illegal content”

and so on. That presence can happen at any time, even fleetingly, as with Snapchat. Even when the image self-deletes after a certain period—so I am told; I have not actually used Snapchat—the presence has occurred. Therefore, that would be covered by clause 8(5)(d).

Chris Philp:

The question of proof is a separate one, and that would apply however we drafted the clause. The point is that the clause provides that any presence of a prohibited image would fall foul of the clause. There are also duties on the platforms to take reasonable steps. In the case of matters such as child sexual exploitation and abuse images, there are extra-onerous duties that we have discussed before, for obvious and quite correct reasons.

Kirsty Blackman:

Will the Minister stress again that in this clause specifically he is talking about facilitating any presence? That is the wording that he has just used. Can he clarify exactly what he means? If the Minister were to do so, it would be an important point for the Bill as it proceeds.

--- Later in debate ---
Chris Philp:

The Government support the spirit of amendments 17 and 28, which seek to achieve critical objectives, but the Bill as drafted delivers those objectives. In relation to amendment 17 and cross-platform risk, clause 8 already sets out harms and risks—including CSEA risks—that arise by means of the service. That means through the service to other services, as well as on the service itself, so that is covered.

Amendment 28 calls for the risk assessments expressly to cover illegal child sexual exploitation content, but clause 8 already requires that to happen. Clause 8(5) states that the risk assessment must cover the

“risk of individuals who are users of the service encountering…each kind of priority illegal content”.

If we follow through the definition of priority illegal content, we find all those CSEA offences listed in schedule 6. The objective of amendment 28 is categorically delivered by clause 8(5)(b), referencing onwards to schedule 6.

Kirsty Blackman:

The amendment specifically mentions the level and rates of those images. I did not quite manage to follow through all the things that the Minister just spoke about, but does the clause specifically talk about the level of those things, rather than individual incidents, the possibility of incidents or some sort of threshold for incidents, as in some parts of the Bill?

Chris Philp:

The risk assessments that clause 8 requires have to be suitable and sufficient; they cannot be perfunctory and inadequate in nature. I would say that suitable and sufficient means they must go into the kind of detail that the hon. Lady requests. More details, most of which relate to timing, are set out in schedule 3. Ofcom will be making sure that these risk assessments are not perfunctory.

Importantly, in relation to CSEA reporting, clause 59, which we will come to, places a mandatory requirement on in-scope companies to report to the National Crime Agency all CSEA content that they detect on their platforms, if it has not already been reported. Not only is that covered by the risk assessments, but there is a criminal reporting requirement here. Although the objectives of amendments 17 and 28 are very important, I submit to the Committee that the Bill delivers the intention behind them already, so I ask the shadow Minister to withdraw them.

Question put, That the amendment be made.

--- Later in debate ---
Chris Philp:

My right hon. Friend, as always, makes a very good point. The codes of practice will be important, particularly to enable Ofcom to levy fines where appropriate and then successfully defend them. This is an area that may get litigated. I hope that, should lawyers litigating these cases look at our transcripts in the future, they will see how strongly those on both sides of the House feel about this point. I know that Ofcom will ensure that the codes of practice are properly drafted. We touched this morning on the point about timing; we will follow up with Ofcom to make sure that the promise it made us during the evidence session about the road map is followed through and that those get published in good time.

On the point about the Joint Committee, I commend my right hon. Friend for her persistence—[Interruption.] Her tenacity—that is the right word. I commend her for her tenacity in raising that point. I mentioned it to the Secretary of State when I saw her at lunchtime, so the point that my right hon. Friend made this morning has been conveyed to the highest levels in the Department.

I must move on to the final two amendments, 11 and 13, which relate to transparency. Again, we had a debate about transparency earlier, when I made the point about the duties in clause 64, which I think cover the issue. Obviously, we are not debating clause 64 now but it is relevant because it requires Ofcom—it is not an option but an obligation; Ofcom must do so—to require providers to produce a transparency report every year. Ofcom can say what is supposed to be in the report, but the relevant schedule lists all the things that can be in it, and covers absolutely everything that the shadow Minister and the hon. Member for Worsley and Eccles South want to see in there.

That requirement to publish transparently and publicly is in the Bill, but it is to be found in clause 64. While I agree with the Opposition’s objectives on this point, I respectfully say that those objectives are delivered by the Bill as drafted, so I politely and gently request that the amendments be withdrawn.

Kirsty Blackman:

I have a couple of comments, particularly about amendments 15 and 16, which the Minister has just spoken about at some length. I do not agree with the Government’s assessment that the governance subsection is adequate. It states that the risk assessment must take into account

“how the design and operation of the service (including the business model, governance, use of proactive technology…) may reduce or increase the risks identified”.

It is actually an assessment of whether the governance structure has an impact on the risk assessment. It has no impact whatever on the level at which the risk assessment is approved or not approved; it is about the risks that the governance structure poses to children or adults, depending on which section of the Bill we are looking at.

The Minister should consider what is being asked in the amendment, which is about the decision-making level at which the risk assessments are approved. I know the Minister has spoken already, but some clarification would be welcome. Does he expect a junior tech support member of staff, or a junior member of the legal team, to write the risk assessment and then put it in a cupboard? Or perhaps they approve it themselves and then nothing happens with it until Ofcom asks for it. Does he think that Ofcom would look unfavourably on behaviour like that? If he was very clear with us about that, it might put our minds at rest. Does he think that someone in a managerial position or a board member, or the board itself, should take decisions, rather than a very junior member of staff? There is a big spread of people who could be taking decisions. If he could give us an indication of what Ofcom might look favourably on, it would be incredibly helpful for our deliberations.

Chris Philp:

I am anxious about time, but I will respond to that point because it is an important one. The hon. Lady is right to say that clause 10(6)(h) looks to identify the risks associated with governance. That is correct—it is a risk assessment. However, in clause 11(2)(a), there is a duty to mitigate those risks, having identified what the risks are. If, as she hypothesised, a very junior person was looking at these matters from a governance point of view, that would be identified as a risk. If it was not, Ofcom would find that that was not sufficient or suitable. That would breach clause 10(2), and the service would then be required to mitigate. If it did not mitigate the risks by having a more senior person taking the decision, Ofcom would take enforcement action for its failure under clause 11(2)(a).

For the record, should Ofcom or lawyers consult the transcript to ascertain Parliament’s intention in the course of future litigation, it is absolutely the Government’s view, as I think it is the hon. Lady’s, that a suitable level of decision making for a children’s risk assessment would be a very senior level. The official Opposition clearly think that, because they have put it in their amendment. I am happy to confirm that, as a Minister, I think that. Obviously the hon. Lady, who speaks for the SNP, does too. If the transcripts of the Committee’s proceedings are examined in the future to ascertain Parliament’s intention, Parliament’s intention will be very clear.

--- Later in debate ---
Kirsty Blackman:

I will first speak to our amendment 85, which, like the Labour amendment, seeks to ensure that the Bill is crystal clear in addressing intersectionality. We need only consider the abuse faced by groups of MPs to understand why that is necessary. Female MPs are attacked online much more regularly than male MPs, and the situation is compounded if they have another minority characteristic. For instance, if they are gay or black, they are even more likely to be attacked. In fact, the MP who is most likely to be attacked is black and female. There are very few black female MPs, so it is not because of sheer numbers that they are at such increased risk of attack. Those with a minority characteristic are at higher risk of online harm, but the risk facing those with more than one minority characteristic is substantially higher, and that is what the amendment seeks to address.

I have spoken specifically about people being attacked on Twitter, Facebook and other social media platforms, but people in certain groups face an additional significant risk. If a young gay woman does not have a community around her, or if a young trans person does not know anybody else who is trans, they are much more likely to use the internet to reach out, to try to find people who are like them, to try to understand. If they are not accepted by their family, school or workplace, they are much more likely to go online to find a community and support—to find what is out there in terms of assistance—but using the internet as a vulnerable, at-risk person puts them at much more significant risk. This goes back to my earlier arguments about people requiring anonymity to protect themselves when using the internet to find their way through a difficult situation in which they have no role models.

It should not be difficult for the Government to accept this amendment. They should consider it carefully and understand that all of us on the Opposition Benches are making a really reasonable proposal. This is not about saying that someone with only one protected characteristic is not at risk; it is about recognising the intersectionality of risk and the fact that the risk faced by those who fit into more than one minority group is much higher than that faced by those who fit into just one. This is not about taking anything away from the Bill; it is about strengthening it and ensuring that organisations listen.

We have heard that a number of companies are not providing the protection that Members across the House would like them to provide against child sexual abuse. The governing structures, risk assessments, rules and moderation at those sites are better at ensuring that the providers make money than they are at providing protection. When regulated providers assess risk, it is not too much to ask them to consider not just people with one protected characteristic but those with multiple protected characteristics.

As MPs, we work on that basis every day. Across Scotland and the UK, we support our constituents as individuals and as groups. When protected characteristics intersect, we find ourselves standing in Parliament, shouting strongly on behalf of those affected and giving them our strongest backing, because we know that that intersection of harms is the point at which people are most vulnerable, in both the real and the online world. Will the Minister consider widening the provision so that it takes intersectionality into account and not only covers people with one protected characteristic but includes an over and above duty? I genuinely do not think it is too much for us to ask providers, particularly the biggest ones, to make this change.

Chris Philp:

Once again, the Government recognise the intent behind these amendments and support the concept that people with multiple intersecting characteristics, or those who are members of multiple groups, may experience—or probably do experience—elevated levels of harm and abuse online compared with others. We completely understand and accept that point, as clearly laid out by the hon. Member for Aberdeen North.

There is a technical legal reason why the use of the singular characteristic and group singular is adopted here. Section 6(c) of the Interpretation Act 1978 sets out how words in Bills and Acts are interpreted, namely that such words in the singular also cover the plural. That means that references in the singular, such as

“individuals with a certain characteristic”

in clause 10(6)(d), also cover characteristics in the plural. A reference to the singular implies a reference to the plural.

Will those compounded risks, where they exist, be taken into account? The answer is yes, because the assessments must assess the risk in front of them. Where there is evidence that multiple protected characteristics or the membership of multiple groups produce compounded risks, as the hon. Lady set out, the risk assessment has to reflect that. That includes the general sectoral risk assessment carried out by Ofcom, which is detailed in clause 83, and Ofcom will then produce guidance under clause 84.

The critical point is that, because there is evidence of high levels of compounded risk when people have more than one characteristic, that must be reflected in the risk assessment, otherwise it is inadequate. I accept the point behind the amendments, but I hope that that explains, with particular reference to the 1978 Act, why the Bill as drafted covers that valid point.

--- Later in debate ---
Chris Philp:

I would be delighted to speak to the amendment, which would change the existing user empowerment duty in clause 14 to require category 1 services to enable adult users to see whether other users are verified. In effect, however, that objective already follows as a natural consequence of the duty in clause 14(6). When a user decides to filter out non-verified users, by definition such users will be able to see content only from verified users, so they could see from that who was verified and who was not. The effect intended by the amendment, therefore, is already achieved through clause 14(6).

Kirsty Blackman:

I am sorry to disagree with the Minister so vigorously, but that is a rubbish argument. It does not make any sense. There is a difference between wanting to filter out everybody who is not verified and wanting to actually see if someone who is threatening someone else online is a verified or a non-verified user. Those are two very different things. I can understand why a politician, for example, might not want to filter out unverified users but would want to check whether a person was verified before going to the police to report a threat.

Chris Philp:

When it comes to police investigations, if something is illegal and merits a report to the police, users should report it, regardless of whether someone is verified or not—whatever the circumstances. I would encourage any internet user to do that. That effectively applies on Twitter already; some people have blue ticks and some people do not, and people should report others to the police if they do something illegal, whether or not they happen to have a blue tick.

Amendment 47 seeks to create a definition of identity verification in clause 189. In addition, it would compel the person’s real name to be displayed. I understand the spirit of the amendment, but there are two reasons why I would not want to accept it and would ask hon. Members not to press it. First, the words “identity verification” are ordinary English words with a clear meaning and we do not normally define in legislation ordinary English words with a clear meaning. Secondly, the amendment would add the new requirement that, if somebody is verified, their real name has to be displayed, but I do not think that that is the effect of the drafting as it stands. Somebody may be verified, and the company knows who they are—if the police go to the company, they will have the verified information—but there is no obligation, as the amendment is drafted, for that information to be displayed publicly. The effect of that part of the amendment would be to force users to choose between disclosing their identity to everyone or having no control over who they interact with. That may not have been the intention, but I am not sure that this would necessarily make sense.

New clause 8 would place requirements on Ofcom about how to produce guidance on user identity verification and what that guidance must contain. We already have provisions on that in clause 58, which we will no doubt come to, although probably not later on today—maybe on Thursday. Clause 58 allows Ofcom to include in its regulatory guidance the principles and standards referenced in the new clause, which can then assist service providers in complying with their duties. Of course, if they choose to ignore the guidelines and do not comply with their duties, they will be subject to enforcement action, but we want to ensure that there is flexibility for Ofcom, in writing those guidelines, and for companies, in following those guidelines or taking alternative steps to meet their duty.

This morning, a couple of Members talked about the importance of remaining flexible and being open to future changes in technology and a wide range of user needs. We want to make sure that flexibility is retained. As drafted, new clause 8 potentially undermines that flexibility. We think that the powers set out in clause 58 give Ofcom the ability to set the relevant regulatory guidance.

Clause 14 implements the proposals made by my hon. Friend the Member for Stroud in her ten-minute rule Bill and the proposals made, as the shadow Minister has said, by a number of third-party stakeholders. We should all welcome the fact that these new user empowerment duties have now been included in the Bill in response to such widespread parliamentary lobbying.

--- Later in debate ---
Kirsty Blackman:

Sometimes we miss the fact that although MPs face abuse, we have a level of protection as currently elected Members. Even if there were an election coming up, we would have a level of security protection and access that is much higher than for anybody else challenging us as candidates or standing in a council or Scottish Parliament election. As sitting MPs, we already have an additional level of protection because of the security services we have in place. We need to remember, and I assume this is why the amendment is drawn in a pretty broad way, that everybody standing for any sort of elected office faces a significant risk of harm—again, whether or not that meets the threshold for illegality.

Specific harms have been mentioned; as has been said, epilepsy is one area where specific harm occurs. Given the importance of democracy, which is absolutely vital, we need a democratic system in which people are able to stand in elections and make their case. That is why we have election addresses and a system in which the election address gets delivered through every single person’s door. There is an understanding and acceptance by the people designing democratic processes that the message of all candidates needs to get out there. If the message of all candidates cannot get out there because some people face significant levels of abuse online, then democracy is not working in the way that it should. These amendments are fair and make a huge amount of sense. They protect the most important tenets of democracy and democratic engagement.

I want to say something about my own specific experiences. We have reported people to the police and have had people in court over the messages they have sent, largely by email, which would not be included in the Bill, but there have also been some pretty creepy ones on social media that have not necessarily met the threshold. As has been said, it is my staff who have had to go to court and stand in the witness box to explain the shock and terror they have felt on seeing the email or the communication that has come in, so I think any provision should include that.

Finally, we have seen situations where people working in elections—this is not an airy-fairy notion, but something that genuinely happened—have been photographed and those pictures have been shared on social media, and they have then been abused as a result. They are just doing their job, handing out ballot papers or standing up and announcing the results on the stage, and they have to abide by the processes that are in place now. In order for us to have free and fair elections that are run properly and that people want to work at and support, we need to have that additional level of protection. The hon. Member for Batley and Spen made a very reasonable argument and I hope the Minister listened to it carefully.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I have listened very carefully to both the hon. Member for Batley and Spen and the hon. Member for Aberdeen North. I agree with both of them that abuse and illegal activity directed at anyone, including people running for elected office, is unacceptable. I endorse and echo the comments they made in their very powerful and moving speeches.

In relation to the technicality of these amendments, what they are asking for is in the Bill already but in different places. This clause is about protecting content of “democratic importance” and concerns stopping online social media firms deleting content through over-zealous takedown. What the hon. Members are talking about is different. They are talking about abuse and illegal activities, such as rape threats, that people get on social media, particularly female MPs, as they both pointed out. I can point to two other places in the Bill where what they are asking for is delivered.

First, there are the duties around illegal content that we debated this morning. If there is content online that is illegal—some of the stuff that the shadow Minister referred to earlier sounds as if it would meet that threshold—then in the Bill there is a duty on social media firms to remove that content and to proactively prevent it if it is on the priority list. The route to prosecution will exist in future, as it does now, and the user-verification measures, if a user is verified, make it more likely for the police to identify the person responsible. In the context of identifying people carrying out abuse, I know the Home Office is looking at the Investigatory Powers Act 2016 as a separate piece of work that speaks to that issue.

So illegal content is dealt with in the illegal content provisions in the Bill, but later we will come to clause 150, which updates the Malicious Communications Act 1988 and creates a new harmful communications offence. Some of the communications that have been described may not count as a criminal offence under other parts of criminal law, but if they meet the test of harmful communication in clause 150, they will be criminalised and will therefore have to be taken down, and prosecution will be possible. In meeting the very reasonable requests that the hon. Members for Batley and Spen and for Aberdeen North have made, I would point to those two parts of the Bill.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

But clause 150(5) says that if a message

“is, or is intended to be, a contribution to a matter of public interest”,

people are allowed to send it, which basically gives everybody a get-out clause in relation to anything to do with elections.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

No, it does not.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I know we are not discussing that part of the Bill, and if the Minister wants to come back to this when we get to clause 150, I have no problem with that.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I will answer the point now, as it has been raised. Clause 150 categorically does not give a get-out-of-jail-free card or provide an automatic excuse. Clearly, there is no way that abusing a candidate for elected office with rape threats and so on could possibly be considered a matter of public interest. In fact, even if the abuse somehow could be considered as possibly contributing to public debate, clause 150(5) says explicitly in line 32 on page 127:

“but that does not determine the point”.

Even where there is some potentially tenuous argument about a contribution to a matter of public interest, which most definitely would not be the case for the rape threats that have been described, that is not determinative. It is a balancing exercise that gets performed, and I hope that puts the hon. Lady’s mind at rest.

Online Safety Bill (First sitting)

Debate between Kirsty Blackman and Chris Philp
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Q I have a quick question on parental digital literacy. You mentioned the panel that you put together of 16 to 21-year-olds. Do you think that today’s parents have the experience, understanding, skills and tools to keep their children properly safe online? Even if they are pretty hands-on and want to do that, do you think that they have all the tools they need to be able to do that?

Dame Rachel de Souza: It is a massive concern to parents. Parents talk to me all the time about their worries: “Do we know enough?” They have that anxiety, especially as their children turn nine or 10; they are thinking, “I don’t even know what this world out there is.” I think that our conversations with 16 to 21-year-olds were really reassuring, and we have produced a pamphlet for parents. It has had a massive number of downloads, because parents absolutely want to be educated in this subject.

What did young people tell us? They told us, “Use the age controls; talk to us about how much time we are spending online; keep communication open; and talk to us.” Talk to children when they’re young, particularly boys, who are likely to be shown pornography for the first time, even if there are parental controls, around the age of nine or 10. So have age-appropriate conversations. There was some very good advice about online experiences, such as, “Don’t worry; you’re not an expert but you can talk to us.” I mean, I did not grow up with the internet, but I managed parenting relatively well—my son is 27 now. I think this is a constant concern for parents.

I do think that the tech companies could be doing so much more to assist parents in digital media literacy, and in supporting them in how to keep their child safe. We are doing it as the Office of the Children’s Commissioner. I know that we are all trying to do it, but we want to see everyone step up on this, particularly the tech companies, to support parents on this issue.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Can I start by thanking the NSPCC and you, Dame Rachel, and your office for the huge contribution that you have made to the Bill as it has developed? A number of changes have been made as a result of your interventions, so I would just like to start by putting on the record my thanks to both of you and both your organisations for the work that you have done so far.

Could you outline for the Committee the areas where you think the Bill, as currently drafted, contains the most important provisions to protect children?

Dame Rachel de Souza: I was really glad to see, in the rewrite of the Online Safety Bill, a specific reference to the role of age assurance to prevent children from accessing harmful content. That has come across strongly from children and young people, so I was very pleased to see that. It is not a silver bullet, but for too long children have been using entirely inappropriate services. The No. 1 recommendation from the 16 to 21-year-olds, when asked what they wish their parents had known and what we should do, was age assurance, if you are trying to protect a younger sibling or are looking at children, so I was pleased to see that. Companies cannot hope to protect children if they do not know who the children are on their platforms, so I was extremely pleased to see that.

Finance (No. 3) Bill

Debate between Kirsty Blackman and Chris Philp
2nd reading: House of Commons & Programme motion: House of Commons
Monday 12th November 2018

Commons Chamber
Read Full debate Finance Act 2019 View all Finance Act 2019 Debates
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Not just now. In terms of the economic growth forecasts that the OBR has apparently made—

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Will the hon. Lady give way?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I am not taking any more interventions.

The OBR has made economic growth forecasts on the basis of a smooth and orderly Brexit. It has not made economic growth forecasts on the basis of us crashing out in a no-deal scenario, so its forecasts are only worth anything if the Government can strike a deal, as the Chancellor knows, which is why he has spoken about another fiscal event coming.

Frictionless trade is not frictionless just because the Government call it frictionless. If a good has to be stopped at the border, if somebody has to fill in an additional form or if there is any delay, that is not frictionless trade. Just because the Government say, “This is frictionless trade,” it does not mean that it is actually frictionless trade.

The Government need to improve their processes around the Finance Bill. This year has been the worst in terms of those processes, and they have to improve. The Government could do that by ensuring that we take evidence at the Public Bill Committee.

The Government have to actually do the things they say they are doing. If they say they are going to give Scotland the Barnett consequentials for health, they should give it the Barnett consequentials for health. If they say they are ending austerity, they should end austerity. If they say they are putting in place a living wage, they should put in place a living wage.

Lastly, if the Government are talking about tax cuts, they need to look at the situation in Scotland. The figures I have from the Library say that around half of taxpayers in England pay more than they would if they lived in Scotland, and that half of taxpayers are the people who earn the least, not the most. The UK Government should look at what the Scottish Government are doing and learn some lessons.