Media Bill (First sitting)
Public Bill Committees, Tuesday 5 December 2023

Kirsty Blackman Excerpts
The Minister for Media, Tourism and Creative Industries (Sir John Whittingdale)

I beg to move,

That—

1. the Committee shall (in addition to its first meeting at 9.25 am on Tuesday 5 December) meet—

(a) at 2.00 pm on Tuesday 5 December;

(b) at 11.30 am and 2.00 pm on Thursday 7 December;

(c) at 9.25 am and 2.00 pm on Tuesday 12 December;

(d) at 11.30 am and 2.00 pm on Thursday 14 December;

2. the proceedings shall be taken in the following order: Clauses 1 to 17; Schedule 1; Clauses 18 to 27; Schedule 2; Clause 28; Schedule 3; Clauses 29 to 36; Schedule 4; Clause 37; Schedules 5 to 7; Clauses 38 to 40; Schedule 8; Clauses 41 to 48; Schedule 9; Clause 49; Schedules 10 and 11; Clauses 50 and 51; Schedule 12; new Clauses; new Schedules; remaining proceedings on the Bill;

3. the proceedings shall (so far as not previously concluded) be brought to a conclusion at 5.00 pm on Thursday 14 December.

It is a great pleasure to serve under your chairmanship, Mr Vickers, and to debate with the hon. Member for Barnsley East, reprising the enjoyable time we had in the Data Protection and Digital Information (No. 2) Bill Committee not long ago. This Bill is important for the future of our public service broadcasters and the media in this country. It has been some time in preparation. It has been through pre-legislative scrutiny, and has been amended considerably to reflect the views put forward to the Government. As a result, I hope that it is generally non-controversial, but it is obviously important that we scrutinise it in detail.

The Programming Sub-Committee met yesterday evening to debate the programme for consideration of the Bill. It was agreed that we should meet today at 9.25 am and 2 pm, again on Thursday, and then again on Tuesday and Thursday next week. That was the unanimous view of the Committee. On that basis, I commend the programme motion to the Committee.

Kirsty Blackman (Aberdeen North) (SNP)

Thank you for chairing the Committee today, Mr Vickers. It is a pleasure to stand opposite the Minister. The last work I did with the Department for Culture, Media and Sport was on the Online Safety Bill, which took a significant time—significantly more than I expect this Bill will. I will talk more generally about the Bill later, when we have moved off the programme motion.

I have questions for the Minister about the lack of oral evidence for the Bill. There is no programme for taking oral evidence. That generally happens when the beginning of a Bill’s Committee stage is taken on the Floor of the House; for example, we have the first part of the Finance Bill Committee on the Floor of the House. The Government have been keen not to take oral evidence on the Finance Bill. It also happens when a Bill originates in the Lords; then no oral evidence is taken in the House of Commons.

I understand what the Minister said about there having been pre-legislative scrutiny. However, I spoke to an external organisation that is often called to give evidence on things related to media, and it assumed that it would be giving evidence this morning when it first saw the draft timetable for Committee during Second Reading. It did not expect that there would be no oral evidence sessions. Let me make it clear how useful oral evidence is. We are able to ask so many experts for their views on specific parts of the Bill. The Minister said that there is a large amount of agreement on much of the Bill, and I do not disagree, but there are significant points of contention, such as the use of the word “appropriate” as opposed to “significant” in relation to prominence. It would be helpful to have experts here who could explain why they believe that “appropriate” is not the appropriate word in the circumstances.

We have had a tight turnaround from Second Reading. I very much appreciate all the organisations that have worked hard to put together their written evidence in such a short time, but I guarantee that not everybody in the room will have read all the written evidence, given the tight timescales.

I have two questions. First, why did the Minister decide not to schedule oral evidence sessions when programming the Bill? Will he be slightly ashamed if we do not need to meet on Thursday 14 December, meaning that we would have had time for an oral evidence session? My second question relates to the timing of the Bill. It is fairly unusual for Committee to begin this quickly after Second Reading. There were two days after Second Reading to table amendments before the deadline. That is a fairly tight turnaround, especially given that we will probably discuss most of the Bill over a few days. I would appreciate it if the Minister let us know the Government’s thinking on the programming.

Sir John Whittingdale

I hear what the hon. Lady says and understand her points. However, as I indicated, the Bill has been in gestation for a long time. I chaired the Culture, Media and Sport Committee until 2015, and it called for a number of the measures in the Bill, so certain parts have taken at least seven or eight years. As she rightly points out, the Government published the Bill in draft form, and that led to lengthy Select Committee hearings, in which a large range of stakeholders gave evidence. Indeed, there was the Select Committee’s report, and the Scottish Affairs and Welsh Affairs Committees also made recommendations. All those were taken into account by the Government, and published evidence was available.

Since that time, we have held a number of roundtables to hear from stakeholders. I obviously recognise that those were private meetings, so there is not a public record of them, but nevertheless, as the hon. Lady points out, there has been an opportunity for all stakeholders to submit written evidence. I am shocked at her suggestion that there could be members of the Committee who have not read all the written evidence submitted, but it is publicly available. Given the time spent consulting on the Bill, it was felt that a public oral evidence session in the Committee was not necessary. If anybody wishes to make further representations, we would gratefully receive them.

The Programming Sub-Committee felt yesterday that the timetable gave sufficient time, given the Bill’s non-controversial nature. Fewer amendments have been tabled than were tabled to the Data Protection and Digital Information Bill, which the hon. Member for Barnsley East and I took through Committee not that long ago. I hope that we will give the amendments proper scrutiny. I view the timetable with a certain amount of schadenfreude, in that I shall be stepping down from my position at the end of the year so that my hon. Friend the Member for Hornchurch and Upminster (Julia Lopez) can return to her role. I am pleased that I shall have the opportunity to take the Bill through the whole of Committee, because it is one that I have spent quite a lot of time on. For those reasons, I think the programme motion and the amount of time allocated for consideration of the Bill are correct, although I join the hon. Member for Aberdeen North in hoping that anybody with further representations to make does make them, even if we are not having oral evidence sessions.

Kirsty Blackman

I will not vote against the programme motion, but I echo the Minister’s call to stakeholders on written evidence, and say to any stakeholders who are watching: “You have been wrong-footed by the very short timescales we were given for amendments, but there is the opportunity to make amendments on Report.” If they get in touch with us about any amendments they want before the deadline for Report, they could be debated then, even though we may not necessarily have had time to craft them before Committee proceedings.

Question put and agreed to.

The Chair

The Committee will therefore meet again at 2 pm this afternoon, and on every sitting Tuesday and Thursday until 14 December, unless we complete consideration of the Bill before then.

Ordered,

That the Bill be considered in the following order, namely, Clauses 1 to 17, Schedule 1, Clauses 18 to 27, Schedule 2, Clause 28, Schedule 3, Clauses 29 to 36, Schedule 4, Clause 37, Schedules 5 to 7, Clauses 38 to 40, Schedule 8, Clauses 41 to 48, Schedule 9, Clause 49, Schedules 10 and 11, Clauses 50 and 51, Schedule 12, new Clauses, new Schedules, remaining proceedings on the Bill.—(Sir John Whittingdale.)

Resolved,

That, subject to the discretion of the Chair, any written evidence received by the Committee shall be reported to the House for publication.—(Sir John Whittingdale.)

The Chair

Copies of any written evidence received by the Committee will be circulated to Members by email and published on the Bill webpage. We now proceed to line-by-line consideration of the Bill.

Clause 1

Reports on the fulfilment of the public service remit

Kirsty Blackman

I beg to move amendment 39, in clause 1, page 2, line 38, at end insert—

“(iii) at least ten hours’ transmission time per week in the Gaelic language as spoken in Scotland.”

This amendment would add a similar requirement for broadcast of programming in Scottish Gaelic as there is for Welsh language broadcasting.

The Chair

With this it will be convenient to discuss new clause 5—Gaelic language service—

“The Secretary of State must, within six months of the passage of this Act, review whether a Gaelic language service should be given a public service broadcast remit.”

Kirsty Blackman

It is a pleasure to take part in the Bill Committee, Mr Vickers. I am glad to see everybody here early on a Tuesday morning, either with or without coffee—I mean, definitely without coffee, as that is not allowed in Bill Committees.

Amendment 39 to clause 1 relates to Gaelic language programming. I hold my hand up: I am sorry that this is not a very good amendment. I have been pretty clear about the fact that there was an incredibly quick turnaround, and I could have done a significantly better job on this amendment. In fact, I am quite happy to support new clause 5 on this issue, which was put forward by Labour.

The Gaelic language and its preservation through public service broadcasting was debated at significant length on Second Reading. The subject is incredibly important. It exercises people in Scotland and across the rest of these islands. There is massive concern about the lack of a requirement for Gaelic language public service broadcasting. There is no requirement for a minimum amount, and no requirements relating to new content. There could, for example, have been a requirement in the Bill for the BBC to produce new Gaelic language content. The Minister is aware that MG ALBA and BBC Alba are involved in producing Gaelic language TV in Scotland, which is absolutely excellent and makes a massive difference to the use of the Gaelic language.

On Second Reading, we heard about the issues that there have historically been with Gaelic. Over a significant number of years, the authorities intended to reduce the amount of Gaelic spoken in Scotland and to stamp it out, and Gaelic is still slowly making a comeback. In Aberdeen, we have Gaelic-medium education; that provision is massively oversubscribed at the moment, despite Aberdeen, on the east rather than the west coast, not being known as a centre for Gaelic. When I visited a Gaelic nursery in my constituency, I asked people whether they found it difficult to ensure that their children were brought up with enough Gaelic language in Aberdeen, where it is not nearly so prominent as it is in, say, the Western Isles. They talked incredibly positively about the impact of children’s TV in Gaelic. Children can watch that TV and learn Gaelic as a native language. Given that there is less Gaelic spoken by the population, public service broadcasting is even more important. Free-to-air public service broadcasting in Gaelic is vital to ensure the continuation of the language, particularly when many adults in the area are not speaking Gaelic regularly.

I would very much like the Minister to consider the lines about Gaelic in the Bill and whether they are sufficient, because I do not believe that they are. I do not believe that Gaelic is given enough of a footing in the Bill. It talks about having an “appropriate” level of provision in the indigenous languages of the UK, but it does not put Gaelic on the same footing as, for example, Welsh; it talks significantly more about quotas and minimum levels of new content for Welsh. That is incredibly important, and I do not at all want to take away from what is happening with Welsh, because that should be happening.

I am asking for parity for Gaelic, or an increase of it—or even an acknowledgement from the UK Government that Gaelic is important. It should not be mentioned as a small aside, and simply be included in a list of other languages. I would very much appreciate it if the Minister considered augmenting the provisions relating to Gaelic, to make it clear how important it is to people in Scotland and across these islands, as one of our indigenous languages. I will not push amendment 39 to a vote—I will return to the issue on Report—but I am happy to support new clause 5, put forward by Labour.

Hywel Williams (Arfon) (PC)

I am delighted to be on this Committee. I support amendments 39 and 40 from my hon. Friend the Member for Aberdeen North. The one thing in clause 1 that I baulked at slightly was the term “regional language”. I would not say that Welsh is a regional language, though there are regions in Wales where the language is used slightly differently; there is Welsh and Welsh English, if I may use that term.

I suppose I should confess that I was a participant in a campaign during the 1970s to establish S4C, the Welsh language channel. It was a very long time ago—40 years ago—and perhaps it would be better to draw a veil over my activities then. If hon. Members are interested in the lessons from the last 40 years on how to build, sustain and improve a channel such as S4C, I refer them to the Department for Culture, Media and Sport document of 2018, “Building an S4C for the future”, by Euryn Ogwen Williams. It is a very interesting document that chronicles, to some extent, what has happened with Welsh in respect of the channel, and it has useful lessons for similar channels, and for Gaelic provision.

One of the outstanding successes of our campaign a very long time ago was ensuring minimum hours in Welsh, to refer to a point that my hon. Friend the Member for Aberdeen North made, and ensuring that programmes in Welsh on a specific channel should be broadcast at peak hours. That was a great success. It is now entirely unremarkable to have programmes in Welsh mid-afternoon, or late in the evening. The very fact that that is unremarkable is a measure of success.

The two sorts of lessons I will briefly refer to from our experience in Wales are, first, what one might call the economic and diversity arguments and, secondly, the cultural arguments. Certainly initially, the arguments for a Welsh channel, and perhaps for a Gaelic channel or Gaelic provision, are essentially cultural, but to point to some of the economic features of the argument, an increase in hours in Gaelic would have the same sort of effect as we saw in Wales.

From the start, in Wales at least, there has been a greater diversity of providers. As with Channel 4, the intention—and the achievement—was to have a larger independent sector and to locate it outside Cardiff, Swansea and Bangor. In my area of Caernarfon, and in Arfon in general, that has led to a huge economic benefit in terms of not only the people employed in television production, but all the other work that has come our way because we have Welsh language television production in the north-west. Those independent producers have also diversified and now participate in international production that has nothing to do with the Welsh channel itself. As a result, we have greater growth in television production skills, and some people have graduated to working in other parts of the world. So there is that argument.

--- Later in debate ---
Sir John Whittingdale

I thank the hon. Members for Barnsley East and for Aberdeen North for speaking to their amendments and allowing us to debate the importance of the Gaelic language. It is something we spent a little bit of time on at Second Reading, but it is an important issue.

The Government absolutely share the view that it is vital to support the continuation and future of Gaelic, and recognise the important contribution that the Gaelic media service MG ALBA makes to the lives and wellbeing of Gaelic speakers across Scotland and the rest of the UK. It is for that reason that the Government embedded a duty to support regional and minority languages within the BBC’s general duties under the current charter arrangements, although I take the point made by the hon. Member for Arfon about Welsh not being a “regional” language in that sense; it is, nevertheless, a minority language, as is Gaelic. We want to help ensure that audiences are able to access this culturally important minority language content in the decades to come.

The Bill goes further than existing provisions. Clause 1 makes the importance of programmes broadcast in the UK’s indigenous languages, including the Gaelic language, clear in legislation, by including it in our new public service remit for television. That is a new addition, which puts on the face of the legislation the need to continue to support minority languages of this kind. We will debate later the way in which the public service broadcasters are required to contribute to the remit and are held accountable for doing so. The purpose of clause 1 is to place a requirement on Ofcom to consider how the public service remit has been fulfilled. It sets a high-level mission statement for public service broadcasters, and is underpinned by a more detailed system of quotas in later clauses. It is intended to be simpler and to provide PSBs with greater flexibility.

That point notwithstanding, I reassure the hon. Member for Barnsley East that the availability of Gaelic language content is provided for elsewhere. As she knows, the BBC has a specific responsibility in the framework to make arrangements to provide BBC Alba, which is a mixed-genre television channel for Gaelic speakers and those interested in the Gaelic language. Ofcom also places a number of more detailed responsibilities on BBC Alba in the BBC’s operating licence. For example, it must provide music of particular relevance to audiences in Scotland, live news programmes each weekday evening—including during peak viewing time—and a longer news review at the weekends.

It is for Ofcom to determine whether these requirements remain appropriate, including on the basis of feedback. In terms of the amount of Gaelic language broadcasting that takes place, however, BBC Alba currently broadcasts in Gaelic from 5 pm until midnight—seven hours each day, starting an hour later at weekends. When not broadcasting television programmes in Gaelic, it plays—forgive me if I pronounce this wrong—BBC Radio nan Gàidheal, the Gaelic language radio station, which is broadcast with static graphics during the periods when television programmes are not being aired. That amounted to a total of some 2,579 hours of Gaelic television content in the course of last year.

I think that the amount of Gaelic language content already being broadcast meets the ambition set out in the amendment from the hon. Member for Aberdeen North, and it is now contained in the public service remit, which covers all channels, and in the BBC charter agreement. For that reason, I think there is already considerable provision to ensure the continuation of the Gaelic language.

I want to turn to the issue raised by the hon. Member for Barnsley East in new clause 5, which refers specifically to the manner in which Gaelic is delivered. BBC Alba is a requirement as part of the charter, and we will again consider how it is delivered by the BBC when the charter renewal takes place. The charter review starts in 2025 and has to be completed by 2027, and we will set out further details in due course on precisely how it is to be carried out.

In the more immediate term, we have recently brought together BBC and Scottish Government officials to discuss the co-ordination of funding decisions for Gaelic language broadcasting between the two organisations. In that respect, I hope that the hon. Member for Aberdeen North and the hon. Member for Barnsley East will recognise that the intention behind their amendment and new clause is already delivered by the Bill and on that basis will be willing to withdraw their amendments.

Kirsty Blackman

I thank the Minister for his response and colleagues for their comments on the amendment and the new clause. I am pleased to hear the Minister talk about the co-ordination of funding decisions and the group that has been brought together to discuss future co-ordination on these decisions and how that may work.

There is a significant asymmetry between the funding settlements for the Welsh language and for Gaelic, particularly in the amount that comes from the licence fee, comparing, for example, Gaelic language broadcasting with Welsh language broadcasting. As I have acknowledged, there are significantly more Welsh speakers, and I am not trying to say that those two things should be directly comparable, but the percentage required from the Scottish Government, compared with the amount provided by the licence fee, differs significantly from what is provided for Welsh. I am glad to hear that the Minister has recognised that decisions are required about the future of funding, and is ensuring that discussions take place.

I am not a Gaelic speaker, but I think my pronunciation of nan Gàidheal would be more accurate than the Minister’s—it does sound like it has a lot more letters than that. I am, however, a native Scots speaker and grew up speaking Doric as my first language. In fact, I think I am the only MP ever to have sworn in to this place in Doric. I have done so twice.

I appreciate that Scots is also mentioned as one of the recognised regional minority languages, and I want to back the point made by my hon. Friend the Member for Arfon and the hon. Member for Barnsley East about the number of young speakers. There has been a significant increase in the number of young people speaking Scots. Even when I was at school, which is some time ago now, we were very much discouraged from speaking Scots, but anyone standing at a bus stop in Aberdeen nowadays will hear young people arguing and bantering with each other in the broad Doric. That just would not have happened in the same way 25 or 30 years ago, when I was at bus stops bantering with my pals.

It is good to see that increase, but we have not seen a commensurate increase in the amount of Scots language TV. There is some Scots language programming, but it is very unusual for us to hear somebody speaking in an Aberdeen accent, for example. A significant proportion of those in the north-east of Scotland would be able to speak Doric, or at least understand it were it on our TVs. Doric is a dialect of Scots, which is a recognised language, and it is spoken in the north-east.

The Minister talked about the BBC provision and the licence conditions in the charter. I appreciate all that, but safeguarding this in the legislation would show Gaelic speakers and people who care about the Gaelic language that it is important to have it at this level. It is important to have it not just as part of the BBC charter and of the potential BBC charter negotiations, but as a recognised part of public service broadcasting. Gaelic should not be playing second fiddle; it should not be down the list of priorities. It is important, and we should not just say, “It is included in the charter, so that’s okay.” That is not exactly what the Minister said, but he was angling in that direction. Such an approach does not provide the safeguarding that we need, and it does not provide the requirement for Ofcom to monitor this. He mentioned that Ofcom has to check whether or not there is an appropriate level of Gaelic programming because of the conditions in the Bill. However, what Ofcom has to check is whether there is a

“sufficient quantity of audiovisual content”,

and, as the shadow Minister said, no clear definition of “sufficient” is provided.

Sir John Whittingdale

The hon. Lady is absolutely right to say that Ofcom has a duty under the Bill to monitor the delivery of the public service remit, but she will be aware that in addition Ofcom has the duty to oversee the BBC’s delivery of its requirements under the charter and the agreement. To that extent, Ofcom will be monitoring whether or not the BBC is meeting its obligations.

Kirsty Blackman

I appreciate that Ofcom will be doing that right now, but, as the Minister says, the charter negotiations are about to open; 2025 possibly seems slightly further away to me than it does to him, but those negotiations are about to begin again and there is no guarantee that that duty will continue to be part of the charter. If the Media Bill provided that this were a required part of public service broadcasting, it would be easier for that to be included in the charter and to be part of the licensing conditions, and for Ofcom to ensure that the BBC or any other public service broadcaster was delivering it.

The last point I wish to make on this is about BBC Alba. Later, we will be discussing the appropriate placement of public service broadcasters on on-demand services, be it on Sky or wherever else one happens to watch TV. There is a requirement for public service broadcasters to be given an appropriate level of significance. If we ensure in the Bill that Gaelic language broadcasting is part of the public service remit, we increase the likelihood of these broadcasters being given that level of prominence on those on-demand services and digital viewing platforms. We have a requirement for them to be given prominence, but at the moment BBC Alba is not included in that, because it is considered part of the BBC, rather than a relevant service in its own right. I appreciate that the Minister is unlikely to accept amendment 39 and I am not going to press it to a vote, but if the shadow Minister does press new clause 5 to a vote, I fully intend to support it. I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Kirsty Blackman

I beg to move amendment 40, page 3, line 10, at end insert—

“(iv) an annual increase in the spend on content and combined content duration made in Scotland until they reach a population share.”

This amendment would add to Ofcom’s reporting requirements a requirement to report on the extent to which the public service broadcasters had made available audiovisual content including an annual increase in the spend on content and combined content duration made in Scotland until they reach a population share.

I promise that I do not have an amendment on every part of every clause—I am sure everyone will breathe a sigh of relief. Amendment 40 is about the proportion of content made in Scotland and the conditions in the Bill for content made outside the M25. It is important that more content is made outside the M25, and I am glad that the Government have recognised that and that there has been a move in public service broadcasting to ensure that that happens. I recognise the work that broadcasters have done to ensure that that continues to be the case, and that much more content is produced outside the M25 than previously. That is positive and I am glad to see it.

--- Later in debate ---
Hywel Williams

The rule here is that if there is facility for growth, growth will occur. There is an Irish saying that I like very much: “Live, horse, and ye shall have hay.” If it is there, it will grow. Perhaps the proof of that, in Wales at least, is that the Welsh-speaking population is equal to the size of Sheffield, but is able to sustain a full channel. I am sure that would happen in Scotland, as well.

Kirsty Blackman

Absolutely. If there were a requirement for more broadcasting, not just outside the M25, and for looking at population share, even reporting on spend and population share, there would be clarity and transparency about that spend, and whether it is anywhere close to population share. I think that public sector broadcasters would have a look and think, “Actually, we could probably do better than this. We could produce more content that is more exciting and relevant to people across all of these islands, produced in places with incredibly diverse scenery and people taking part in it.”

As for the Government’s position on levelling up, a fairly general statement on content produced outside the M25 is not going to cut it. It will not bring about levelling up or an increase in broadcasting in places that do not currently see significant amounts. As I said, I appreciate that the Minister and his Government are trying with the outside-the-M25 quota, but it could be done better in order to encourage more content, or at least transparent reporting on the level of broadcasting, spend and content creation in various parts of the UK. As expected from an SNP MP, I have highlighted Scotland, but many parts of these islands could make a pitch for more content to be made in their area, or at least reporting on the level of spend and content created in each region.

Stephanie Peacock

Not too long ago, just after the Scottish Affairs Committee concluded its important inquiry into the topic, I was joined by colleagues in Westminster Hall to talk about Scottish broadcasting. One of the biggest takeaways from the debate was just how important the sector is to people.

Scottish broadcasting brings communities together. It promotes pride in place and strengthens local economies. For those reasons, and many more, I strongly believe that Scottish broadcasting can and must continue to form a vital piece of the puzzle in the UK’s creative sectors. Indeed, Scotland is already a popular destination for broadcasters. Not only is it home to Amazon, but the BBC and Channel 4 operate there alongside STV, which in 2021 reached 80% of Scottish people through its main channel. Content made in Scotland often represents Scottish people’s lives and the diversity within them. That sort of representation matters. I know, for example, that it was exciting for many when the first Scottish family finally appeared on “Gogglebox”.

I am very sympathetic towards the aspect of the amendment that looks to ensure that the level of content made in and for Scotland is proportionate to the number of people who live there. However, I have questions about the mechanism used to achieve that. For example, what are the implications of directly attaching spend to population? How would population be measured and how frequently, and how would that impact the legislative requirements to match it? I wonder whether this issue could be better addressed through individual channel remits. For example, both the BBC and Channel 4 have existing nation quotas. Perhaps it would be better to focus on that rather than insert a strict spend requirement, tied to population, on the wider remit.

I would like to show my support for Scottish broadcasting, but further investigation might be needed into how we can best ensure that there is a comprehensive and holistic package of regulation and legislation to secure its future.

--- Later in debate ---
Sir John Whittingdale

My hon. Friend is absolutely right. It is not just the public service broadcasters that are committing to spending money on production in Scotland; it is right across the range of broadcasters. That exemplifies the strength of Scottish independent production. Indeed, similar figures can be quoted for Wales; it is not unique to Scotland. Every part of the UK is benefiting. Of course, Scotland has its own broadcasting company in the form of STV, which has a production arm, STV Studios, which has an ambition to become a world-class content producer for global networks and streaming services.

The success of the production sector in Scotland and across the UK has been supported and underpinned by a regulatory system. The importance of programmes being made outside London is reflected in the new public service remit. In addition, all public service broadcasters, with the exception of S4C, are subject to regional programme-making quotas for spend and hours of production outside London. Channel 4 has its own out-of-England quota; the BBC also has a specific quota for content made in Scotland. Those quotas are set by Ofcom, which has the power to amend them, where appropriate. One example of the success of that regulatory system is the “Made outside London programme titles register”, published by Ofcom, which, in 2022, had 811 entries, including 543 from English regions outside London, 53 from Northern Ireland, 117 from Scotland and 72 from Wales. In each case, broadcasters are exceeding the production quotas quite comfortably. The Government will continue to support screen industries across the UK through a system of tax reliefs, investment in studio infrastructure and the UK global screen fund.

In line with the Government’s broader ambition to level up the UK, we want the production sector in all areas of the UK to continue to thrive, and we believe that PSBs play a very important role in our meeting that ambition. Returning to comments made by the hon. Member for Arfon, which I did not address earlier, S4C plays an extremely important part in that. I have not had the opportunity to visit production facilities in Scotland, but I have been to visit both BBC Wales in Cardiff and S4C, where I went on the set of “Pobol y Cwm”, and production in Wales is thriving. The position for S4C is slightly different from that for Scotland, in that there is, as the hon. Gentleman pointed out, a dedicated television channel for the Welsh language in the form of S4C. However, the Government are committed to supporting the production sector in all the nations of the UK.

I share the view of the hon. Member for Barnsley East that attempting to set quotas that are exactly in line with the population proportions would impose a constraint, which would be limiting and unnecessary. For that reason, I ask the hon. Member for Aberdeen North to withdraw her amendment.

Kirsty Blackman

I highlight that the focus on content made outside the M25 is not enough. There needs to be a focus on ensuring that the economic and cultural benefits, and the talent pool, are spread wider; “outside the M25” cannot just be Salford, for example. It is possible for “outside the M25” to mean “focused in a small place”, which means benefits are not spread as widely as they should be.

--- Later in debate ---
Stephanie Peacock

On the whole, I am pleased to welcome the clause, which looks to simplify the public service remit, and to allow broadcasters to contribute to the remit with programmes that are made available on a wider range of services, including their on-demand service.

Clause 1 makes an important attempt to simplify the public service remit. Currently, the remit consists of a set of purposes that public service television must fulfil in accordance with a different set of public service objectives. The Bill condenses those requirements, so that the PSB remit is considered fulfilled when providers together make available a wide range of audiovisual content that meets the needs and satisfies the interests of as many different audiences as possible. A list is then provided, setting out the types of content that can form part of such a contribution.

That simplification is, on the whole, a welcome idea, and the inclusion of minority language services and children’s programming in the remit is great to see. However, the Voice of the Listener & Viewer, the Media Reform Coalition, the International Broadcasting Trust and others have expressed concerns that the simplified format has been coupled with the removal of requirements for public service broadcasters to provide specific genres of content.

When the Government first released the “Up next” White Paper that preceded the Bill, they made no mention of removing references to genres such as entertainment, drama, science and religion from the remit, as has been done in the Bill. Content from those genres is important to people, and has huge societal and cultural value. If we remove explicit reference to them in the remit, there is a risk of less programming in those areas, particularly where they might be of less immediate commercial benefit. That is surely contradictory to the aim of having a public service broadcaster, which is fundamentally to ensure that public benefit is balanced against purely commercial interests.

The change is especially concerning at a time when, commercially, there is more choice than ever before in popular genres such as entertainment and drama, and less choice when it comes to dramas that provide diversity and difference for UK audiences. This would not be the first time that a reduction in requirements for PSB content led to a decline in culturally valuable content. As the Select Committee on Culture, Media and Sport highlighted in its report on the draft Bill, Ofcom identified how provision of non-animation programming for children became limited outside the BBC after the quota for children’s programming was removed.

I am pleased that the public service broadcasters have issued reassurances that the new remit will not significantly impact programming in the removed areas, and I am glad that, since its draft version, a small protection has been added in the Bill to secure

“an appropriate range of genres”.

However, the removal of references to specific genres is still a concern, even after these reassurances and amendments. Indeed, if there is no clear specification of what counts as a “range of genres”, there is no guarantee that Ofcom will monitor the amount of content in each of the removed genres. Without such monitoring, falls in provision will be difficult to identify and rectify.

It is with that in mind that I proposed amendment 19, which would ensure that public service content continues to be provided across a range of genres, including entertainment, drama, science, and religion and other beliefs. Further to that, in combination with the powers in clause 10, the amendment would enable Ofcom to properly monitor those genres and make proper suggestions, where content is lacking.

I want to be clear that this addition is not intended to change the nature of the remit, which would still be covered by the PSBs as a whole. I understand that it is not, and should not be, the responsibility of each and every individual public service broadcaster to hit each and every one of the remit requirements, and that is no different for the provision of genres. For example, ITV provides nations’ and regions’ news in a way that means it is not realistic for it to meet some of the other obligations; those are then covered by the likes of Channel 4 and Channel 5, which do not provide the same level of news coverage. That sort of balance works well, and I want to explicitly state that I do not propose that every genre would have to be addressed by every provider. I hope that, bearing that in mind, the Minister can take on board what amendment 19 proposes. Simplifying the remit is a good idea, but not if it is done at the cost of the kind of content that sets our public service broadcasters apart.

I move on to the other major consequences of clause 1: the changes that allow content provided through a wider range of services to contribute to the remit. This change makes sense as viewing habits start to shift in a digital age. As the Government know, last year, the weekly reach of broadcast TV fell to 79%, down from 83% in 2021. That is the sharpest fall on record. Meanwhile, on-demand viewing increased, reaching 53 minutes a day this year. Having the flexibility to meet the remit through an on-demand programme service is reasonable, given that this pattern is likely to continue for years to come.

In the meantime, online content can also help to deliver content to niche audiences. Indeed, ITV estimates that 3.8 million households in the UK are online only, meaning that they have no traditional broadcast signal. However, it is important to note that, while habits are shifting, a number of households still do not have internet access. Having previously served as shadow Minister for Digital Infrastructure, I have engaged extensively with telecoms providers and organisations such as the Digital Poverty Alliance, all of which have shared their concern and acknowledged that not everyone has access to or can afford a broadband connection. There is a movement to ensure that social tariffs and lower-cost options are available, as well as to improve the roll-out of gigabit-capable technology, so that as many people as possible can be connected.

Regardless of those efforts, there has been and will remain a section of the population for whom broadcast signal is their sole connection to media, news, entertainment and information. It is incredibly important that those people, who are likely to be older citizens, families in rural areas and those struggling with bills as a result of the cost of living crisis, are able to access public service content as usual on linear channels, delivered through a broadcast signal. That case has been argued extensively by the campaign group Broadcast 2040+, which is made up of a number of concerned organisations. We recognise that the direction of travel is that people are watching content online more than ever, but that does not need to mean diminishing content on broadcast linear services, especially where that content caters to a local audience. That belief goes beyond this Bill and ties into wider worries about the impact that a digital-first strategy will have on traditional means of broadcasting, and, as a result, on audiences.

It has been four months, for example, since the BBC decided to replace some of its vital and unique local radio programming with an increase in online journalism, which has been to the detriment of local communities up and down the country. That decision was made without consulting the communities that would be impacted, and it could easily be repeated in other areas, since there is nothing to stop many more services being axed in favour of online services. This is not to say that there will be no decline in audiences in the years to come as the rise in online content consumption continues, but no co-ordinated effort has been made to ensure that our infrastructure is ready for a mass movement toward online broadcasting. That effort must be made before such a transition takes place. The consequences for the internet capacity that will be needed to cater for spikes, and the implications for national security in a world where TV and radio are no longer methods of communication between the Government and the public, have not been thought through. As long as that remains the case, we must think of those for whom internet connection is not an option. That is why I tabled a new clause to protect the provision of high-quality content on linear services.

The new clause would introduce a safeguard, so that if Ofcom believes that the delivery of PSB content on broadcast linear services is less than satisfactory, it will have the powers needed to set a quota—to ensure that a certain proportion of public service content remains available to linear audiences through a broadcast signal. In short, quality content should remain available to those families up and down the country who rely on their TV rather than watch online content. The new clause makes no prescriptive requirements on how that should be achieved; nor does it set a specific figure for how many programmes must be available to a certain percentage of people. It simply allows Ofcom to monitor the effect of the Bill, which allows PSB content to be delivered online, and allows Ofcom to intervene with such measures as it sees fit if the new remit has unintended negative outcomes.

As well as encouraging him to accept the new clause, I urge the Minister to update us on whether the Government intend to support linear broadcasting beyond 2034. If they do not, what plans are they putting in place to manage a possible transition away from linear services? We have simply not heard enough about this from the Government, and I would be grateful to hear today what the Department’s position is and what work it is already doing on this.

Finally, I come to the rules that state that for on-demand content to count toward the remit, it must be available for at least 30 days. In the draft Bill, public service broadcasters including ITV and the BBC raised concerns that that minimum period was not appropriate for every type of content, because on-demand rights in certain areas, especially sport, news and music, often mean that such programmes are available for limited periods. It is welcome that those concerns are recognised in the Bill, and that an exemption is being introduced for news programmes and coverage of sporting events. Did the Department consider adding programmes covering music events to the list of exemptions? If it did, why was the decision made not to do so? Overall, I support a simplified remit, and the change in clause 1 that allows online content to count toward the remit, but further safeguards around certain genres of content and linear television are needed to protect against unintended or negative consequences.

I am broadly happy with clauses 2 and 7, which are consequential to clause 1. Clause 2 updates Ofcom’s reporting requirements to reflect the changes being made; likewise, clause 7 makes consequential changes to section 271 of the Communications Act 2003. On those issues, I refer Members to my remarks on clause 1 as a whole.

Kirsty Blackman

I want to pick up a couple of points relating to clause 1 that I have not mentioned yet, but that the shadow Minister has mentioned.

I am happy to support the provision in new clause 1 that would ensure that public service content is available on linear TV, but I do not think it goes far enough, and it does not add much to Ofcom’s requirements. The same concerns arise around matters such as “significant prominence”. The Minister said from the Dispatch Box on Second Reading that the move away from broadcast terrestrial television would not be made until the overwhelming majority of people in the UK were able to access television by other means. I hope that is a fairly accurate version of what he said. I am concerned that the phrase “overwhelming majority” is also not specific enough, although I appreciate the direction of travel that the Minister was indicating with that remark. My concern, like the shadow Minister’s, about the potential removal of terrestrial TV and non-digital output is for the groups who would be significantly disadvantaged by that loss.

--- Later in debate ---
Sir John Whittingdale

While I completely share the hon. Lady’s love of music and recognition of the important role that broadcasters play in the promotion of music, the purpose of the new remit is to remove the specific naming of individual genres and instead impose a requirement for a “broad range”. In my view, that would certainly include music. Ofcom will have a duty to ensure that the broad range of different aspects of public service broadcasting is delivered, and there is a backstop power: if it is felt that broadcasters are failing to deliver sufficient quantities of a specific genre, it is possible for us to pass additional regulation to include a named additional genre. While music is no longer specifically mentioned in the remit, I am confident that that will not lead to any reduction. Indeed, the broadcasters have made clear that they have no intention of reining back on specific genres just because they do not appear in the legislation.

On how content is delivered, the Bill updates the present system so that on-demand provision contributes to the fulfilment of the remit, but to count towards the remit, as has been mentioned, it has to be online for at least 30 days. The only exceptions to the requirement are news and the coverage of live sports, which are regarded as being of instantaneous value, but value that perhaps diminishes over a short space of time. We thought about including music, but I think the value of music lasts beyond 30 days—I am as keen to see a performance from Glastonbury today as I was at the time it was broadcast. It would therefore not be appropriate to include it as one of the exemptions to the requirement. The Government recognise that it is valuable for audiences to be able to access news and current affairs in a traditional format, and the Bill accounts for that by ensuring our public service broadcasters are still subject to quotas that require them to deliver news via traditional linear television. Taken together, these changes will help ensure that our regulatory regime keeps up with modern viewing methods.

Clause 2 updates section 264A of the Communications Act in the light of the new public service remit for television. Section 264A describes how Ofcom, when undertaking a review under section 264, should consider the contribution that other media services, including those provided by commercial broadcasters, make to the remit. The changes made by the clause are needed to implement the new public service remit.

Clause 7 makes changes consequential to clause 1. In particular, it amends section 271 of the Communications Act to apply the existing delegated powers in the section to the new public service remit, as opposed to the old purposes and objectives. That will ensure that, should there be a need, the Secretary of State can by regulation modify the public service remit in clause 1, as I was suggesting to the hon. Member for Luton North. I therefore commend the clauses to the Committee.

I understand the intention behind amendment 19, which is to ensure that the range of content shown is broad. We want that too, but we feel that no longer specifying a large number of individual genres simplifies the current system of public service broadcasting. We want to set a clear and simple vision for the industry that narrows in on what it means to be a public service broadcaster, but we do not see that that need comes at the expense of breadth. We continue to want to see a wide range of genres, and we believe the clause achieves that.

Kirsty Blackman

The Minister said it is possible by regulation to amend the list to add genres. Could he write to me with information about the process by which that could happen? How can amendments be made to add genres to the list, should that become necessary?

Sir John Whittingdale

Ofcom has a duty to monitor the delivery of the remit, and that includes satisfying itself that there is a sufficient range of genres and that there has not been a diminution of a particular genre that would be considered part of the public service remit. If, however, it becomes clear that broadcasters are failing in any area, there is a backstop power that allows the Secretary of State to add a specific genre to the remit. We believe that safeguard is sufficient to ensure continued delivery of the range of genres that the hon. Lady and I want to see.

--- Later in debate ---
Kirsty Blackman

On that point, for clarity in advance of the remaining stages of this Bill, it would be really helpful if the Minister wrote a letter explaining that. He has mentioned both that the Secretary of State would have the power to vary the remit and to initiate the backstop, and that there is a power to create regulations, and I am not entirely clear which it is. It would be great if he just laid that out for us in a letter.

--- Later in debate ---
Clause 3

Public service remits of licensed providers
Kirsty Blackman

I beg to move amendment 35, in clause 3, page 7, line 15, at end insert—

“(c) which is broadcast via UHF frequencies that can be received by a minimum of 98.5% of the population of the United Kingdom.”

This amendment would amend the definition of public service for Channel 3 services and Channel 5 to include an obligation to broadcast via digital terrestrial television.

The Chair

With this it will be convenient to discuss the following:

Amendment 36, in clause 3, page 7, line 32, at end insert—

“(d) which is broadcast via UHF frequencies that can be received by a minimum of 98.5% of the population of the United Kingdom.”

This amendment would amend the definition of public service for Channel 4 to include an obligation to broadcast via digital terrestrial television.

Amendment 37, in clause 15, page 17, line 35, at end insert—

“(c) after paragraph (c), insert—

“(d) provide for the broadcast of programmes for or on behalf of a Channel 3 licensee using the MPEG-2 or MPEG-4 digital video broadcasting standard via UHF frequencies that can be received by a minimum of 98.5% of the population of the United Kingdom.””

This amendment would amend the definition of public service for Channel 3 licensees to include an obligation to broadcast via digital terrestrial television.

Kirsty Blackman

We covered a little of this in the last debate, in relation to access to terrestrial television services. As I said, there is still significant digital exclusion in our society when it comes to those who access television services and public service broadcasts through non-digital means.

It is possible to do what I do, which is to access television entirely through digital means—I have not had an aerial for a significant time. We moved into our house in 2016 and I am not aware that we have ever watched terrestrial television there, but we are lucky enough to have and be able to pay for a fast broadband connection and to live in a city where we can access one; we are not in any of the excluded and more vulnerable groups that find it more difficult to access television through on-demand means. A significant number of people can still access TV only through terrestrial services.

The amendments are about trying to pin the Minister down on what he means by “an overwhelming majority”. This is about looking at the numbers: is 98.5% of the population the kind of figure that the Minister was thinking about when he said “overwhelming majority”, or did he mean 60% or 70%? I am indebted to my hon. Friend the Member for Paisley and Renfrewshire North (Gavin Newlands), who, like me, has met Broadcast 2040+, which crafted these amendments. My hon. Friend is significantly more of a football fan than I am, and has specifically mentioned the fact that football viewing figures are higher for terrestrial TV than they are for subscription services. Removing access to terrestrial TV, which may happen at some point in the future and may need to happen at some point in the very distant future, will reduce the number of people able to access Scottish football. Therefore, in addition to the comments I was making about the educational provision available on television, I make the point that the ability to view sport is also important.

Yesterday in the Chamber, there was a ministerial update on the risk and resilience framework, which was published by the Government last year. Ministers have been at pains to state how much more transparency the framework enables than was the case previously. I appreciate the work that the Government are trying to do to update the national risk register, to ensure that it is as public as possible and that people are able to access this information. However, an incredibly important part of local resilience is being able to access up-to-date news, up-to-date and on-the-spot weather, and information when something significant happens.

I will give an example. Recently, there were significant floods in Brechin, which is just down the road from Aberdeen—although I am not sure that people in Brechin would want to be described in relation to Aberdeen; Brechin is a very lovely place in its own right and not just a neighbour of Aberdeen. People in Brechin saw really significant flooding, and a number of properties were evacuated. Without the ability to access information on what was happening through terrestrial TV or radio services, people would have been much less aware that the river was about to break its banks. When there is really significant wind—as there was during that heavy rain—relying on mobile phone masts, for example, is much more difficult. Terrestrial TV masts, having been up for significantly longer, are far less likely to come down in the kinds of winds that we saw during Storm Arwen and Storm Babet, and such weather events are increasing. In terms of resilience, it is important for people to be able to access that information.

During the covid pandemic, people were glued to their television screens for updates about what was happening and the latest lockdown news. If some of our most vulnerable communities were struggling to access such content because, after the withdrawal of the terrestrial services, they did not have the broadband speeds necessary to watch television on demand, they would be less likely to be able to comply with and understand the law if another pandemic or national emergency happened.

It is important for the Government to know that they can reach the general population; that is how they could make the case for lockdown restrictions, or ensure that people were aware when the Queen sadly passed away last year. They can make those announcements and ensure that people know when significant national events have happened.

If people who are older, in poverty or otherwise digitally excluded are less likely to hear timeously about extreme weather or massive national events of incredible importance, then we further marginalise communities that are already struggling. As I said, I appreciate the Minister using the term “overwhelming majority” but I am just not confident enough that—

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

The hon. Lady should recognise that such switchovers are possible only when the technology supports it, which is a question of changing the distribution mechanism at some point. That can lead to more choice.

Take the village in Kent where I live. When we had to do the switchover in 2012, the consequence of turning off the analogue signal and replacing it with a digital one was that we could get Channel 5, which people would otherwise not have been able to get at all. With the improvement in infrastructure, some people may see a significant improvement in services, but only where that infrastructure is ready.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I appreciate that and think it is important, but my point is about those who cannot get access and do not have the financial ability to do so. If we have a commitment to continue to provide terrestrial services and the legacy infrastructure, the providers of that infrastructure—the public service broadcasters—can continue to invest in it and not just say, “Well, the Government are going to allow us to turn it off in 2040 so there is no point in investing in it now. It has only got 17 years left to run, so we are just going to run the network down.” I am concerned that that may be the direction of travel.

Without a very clear commitment from the Government, I am worried that there will be a lack of investment in terrestrial services and that people will lose out. I would not want anybody to lose out on Channel 5 and I am very glad that people have access to it, but they need to have the choice. I would rather people had access to some public service broadcasting than none, which would be entirely possible if the digitally excluded could no longer access terrestrial TV services.

If the Minister made some really clear commitments today, that would be incredibly helpful. He may not be able to do that, in which case I may press some of the amendments. I will certainly be supporting the Labour party’s new clause. If the Minister cannot make more commitments, will he make clear the Government’s position on people likely to be excluded from taking part in a switchover, in relation to current investment in the network and investment to ensure that the network can last the next 15, 20 or 30 years? Would the Minister be happy to see that network diminish and for there to be a lack of investment, so that services run down of their own accord, or would he prefer people to continue to be able to access them?

It would be great to have a little more clarity from the Government on the proposed direction of travel. I thank my hon. Friend the Member for Paisley and Renfrewshire North and also Broadcast 2040+ for all the work that they do to try to ensure that marginalised groups can continue to access public service broadcasting.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

We completely recognise that terrestrial TV is important to many in the country. I was in my second incarnation as a Minister at the time of the Bilsdale fire, and I talked to Arqiva about the importance of restoring services as rapidly as possible. A very large number of people were left without the ability to access information, entertainment and all the things that people rely on television to provide.

Looking forward, as hon. Members may be aware, the Secretary of State recently announced that the Department is going to carry out a new programme of work on the future of television distribution. That includes a six-month research project with a consortium led by the University of Exeter, looking at changing viewing habits and technologies. We have also asked Ofcom to undertake an early review of market changes that may affect the future of content distribution. That work will look at all the various factors that would need to be taken into account, and I am very happy to keep the House updated on it.

I make one final point, about amendment 37. It would place a requirement on Channel 3 licensees to use particular standards for compression technology. As with all technologies, the standards for television distribution will change over time. We want to ensure that there remains flexibility, so restricting Channel 3 to the use of one particular technology would be severely limiting and would actually be contrary to precisely what the Bill is designed to achieve.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

On what the Minister has just said about the report on the future of television distribution and the timeline for decision making, does he recognise my point that degradation of the technology is possible if the Government do not make fairly early decisions—I am not talking about the next three months—on whether to extend it beyond 2034? Does he understand the importance of making a decision in fairly short order, to ensure that infrastructure providers such as Arqiva keep the technology running so that it stays viable beyond 2034 if necessary?

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

As I say, we are committed to keeping the House updated about the research. I recognise the point, and my own expectation is that DTT will be around for quite some time to come. For the reasons I have explained, I am not able to accept the amendments. I hope that the Opposition will withdraw them.

Media Bill

Kirsty Blackman Excerpts
2nd reading
Tuesday 21st November 2023

Commons Chamber
Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- View Speech - Hansard - -

I appreciate having the opportunity to lead for the SNP on Second Reading. My hon. Friend the Member for Ochil and South Perthshire (John Nicolson), who usually leads on Digital, Culture, Media and Sport, has been unable to come along, so I have stepped into the breach, as it were, and agreed to manage the Media Bill for the SNP.

Although the Bill is welcome and takes a number of positive steps forward, I am concerned about how over-complicated some of it is. The Bill amends the Communications Act 2003, the Broadcasting Act 1996 and the Broadcasting Act 1990. Apart from amendments to corporation Acts and tax Acts, I have not seen anything quite this complicated. If I were a broadcaster or worked in this area, I would find it difficult, because of that complexity, to find all the information I needed even to comply with the legislation. The Media Bill mostly amends those three pieces of legislation, as well as a few others in smaller technical ways—smaller technical amendments are absolutely standard—but it has been done in a way that will make it difficult to find some of the definitions.

I was looking, for example, for the definition of “programme”. I was directed to the Communications Act 2003, which directed me to the Broadcasting Act 1990, which then told me what the definition was. I have yet to find out the definition of “person”. Perhaps the Minister could furnish me with information on where I could find that definition in those three pieces of legislation. I did, however, find out that when it comes to choosing programmes and organising programming, an algorithm can be counted as a “person” if someone is assisted by an algorithm. I would find it very helpful if the Minister pointed me in the direction of the definition of “person”, which is used a significant number of times in the Bill when it talks about a person who is in charge of programming. Does the word “person” also relate to an entity or a group of people if they are in charge of programming? It would be helpful to have more information on that.

I am slightly concerned about other definitions and uses of words. The requirement for Ofcom to work out that there is a sufficiency of something without there being any clarity on what “sufficiency” means is slightly concerning, because something that I see as sufficient may not be seen as sufficient by somebody else. If there were more information on what “sufficient” meant, there would be more clarity on the changes to Channel 4 as a proportion of expenditure, for example, as opposed to a proportion of programming. “Sufficiency” is not sufficiently defined in the Bill.

The shadow Secretary of State mentioned the word “appropriate” in respect of the availability of public service broadcasters through internet services, and raised concerns about whether it should be re-termed “significant”. That would probably give those broadcasters the level of prominence that we expect and want them to have, so that people can access their services in the way that they want and expect. I agree that there could be a different way of doing that.

I will come to a number of different issues, but let me touch on the requirement on the prominence of services. That is important, and I am glad that the Government have chosen to tackle it. The order in which public service broadcasters appear—for those who use Amazon Fire Sticks, for example—is important. As those broadcasters have responsibilities that other broadcasters do not, it is important that they are given a level of primacy.

However, I am concerned that the App Store and the Google Play Store are not included in the measures, given the way in which such organisations—particularly the App Store—have behaved. They have said, “We can carry things such as the BBC iPlayer or the STV player only if you give us a significant slice of your revenue.” That is not acceptable. If people look up the BBC iPlayer on the App Store, it should be the top result, rather than being placed further down because Apple has had an argument with the BBC about it. It is inappropriate for Apple to charge the BBC significant amounts of money for a level of prominence that the BBC should have by right as a public service broadcaster. That is important not just in relation to the software in the Fire Stick, for example—or however we choose to view our video-on-demand services—but in the prominence that public service broadcaster apps, such as Channel 4 on demand and BBC iPlayer, are given. The same applies to BBC Sounds in radio access. Those broadcasters should not be charged significant amounts for that prominence.

While I am on radio, I appreciate what has been said about ensuring that Alexa and Siri provide the correct radio station. I would really like Alexa or Siri to play Taylor Swift when I ask for her, rather than Rage Against the Machine. It is not that they are trying to provide me with something else; it is that they do not understand my Scottish accent. Improving the listening ability of those services so that they can play the song that I want would be incredibly helpful.

I like the provisions on advertising. In some cases, it is not Alexa or Siri making decisions on advertising; it is TuneIn Radio—or whichever programme Alexa or Siri is playing through—that is making those decisions. As long as that provision applies to how we hear advertising, rather than who deals with the background stuff, I am happy enough with the measures.

I agree with the right hon. Member for Preseli Pembrokeshire (Stephen Crabb), who has just headed out of the Chamber, on the importance of local radio. In my constituency, Station House Media Unit—known as shmu—does local magazines as well as a significant amount of local radio. It feels really rooted in our communities in a way that, as the right hon. Member said, larger stations that have been taken over by other companies do not.

I appreciate the level of children’s content we have had, particularly on the BBC, having watched CBeebies with my children. When I was younger, I went to a fancy dress party dressed as a Tweenie. I cannot remember whether I was Bella, Milo, Fizz or Jake, but I can tell the House that I did not have to look up those names, because I remembered them. They are ingrained in my soul, having watched the show with my little sisters. They are significantly younger than me, which is why I mention such a recent television programme.

Ofcom has had to scale up massively to service the provisions of the Online Safety Act 2023. I am appreciative of that, and I have a lot of time for the growth in capacity and the number of excellent people it has brought in to do the work. Can the Minister reassure us that Ofcom will have the people, capacity and resource to police this area, to write the regulations and guidance that this Bill will require, and to manage the different interactions it will have, in particular with video-on-demand services? I am aware that Ofcom is already doing significant portions of work around broadcasting, but I do not want it to have to stretch itself when it is already having to grow at pace. I am concerned that there may not even be enough qualified individuals to take on that work, given how specialised and important it is. Can the Minister reassure me that he is at least having conversations with Ofcom about its capacity for when this legislation comes in?

A number of my colleagues have mentioned the Gaelic language and the issues around it. Of course, those could all be solved by devolving broadcasting to the Scottish Government, but in lieu of that, I will highlight some of the disparities. The Secretary of State was perhaps getting a little confused between BBC Alba and MG Alba, which are two different organisations. [Interruption.] Alba—my pronunciation is nearly there; I am an east-coaster. The two organisations are different and operate differently. We appreciate the support being given to S4C, which is a good thing, but there is a disparity: £89 million of licence fee funding is going to S4C, whereas only £10 million is going to the Gaelic language. There is a requirement for a quota of at least 10 hours a week of Welsh language programming, but no requirement for a similar quota for Gaelic programming. I am concerned by that.

Jamie Stone Portrait Jamie Stone
- Hansard - - - Excerpts

The hon. Member is making a very good point about the Gaelic language. I absolutely hate to say this in this place, but my constituency has a few native Gaelic speakers—there are so few of them. I pray that in a few years’ time another generation will have the language. Gaelic is in a vulnerable situation, which reinforces her point.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I very much appreciate the hon. Member’s point. I went to visit a Gaelic nursery in Aberdeen a couple of years ago. Staff there were concerned about the reduction in Gaelic programming for children, because outside the nursery the children were not necessarily getting the exposure to Gaelic that they might have had if they had lived in Skye or the Western Isles. They were concerned that, just because those families had chosen not to live in those communities, the children’s exposure to the language and their ability to access TV programmes in their first language were significantly reduced. I am concerned by that disparity. I hope the Minister appreciates that we are coming from a good place in trying to ensure the protection of Gaelic, some level of parity, and that people across Scotland can access it.

I will highlight specifically what the Bill states. It states that there has to be

“a sufficient quantity of audiovisual content that is in, or mainly in, a recognised regional or minority language”.

Later, the Bill states that

“‘recognised regional or minority language’ means Welsh, the Gaelic language as spoken in Scotland, Irish, Scots, Ulster Scots or Cornish.”

The Bill does not define what “a sufficient quantity” is. It does not say whether it will be measured on the basis of the percentage of people who speak that language in each of the countries. That wording is concerning, and given that there is a quota for Welsh programming, it is disappointing that there is not a similarly recognised quota for any of the other languages.

Ian Blackford Portrait Ian Blackford
- Hansard - - - Excerpts

My hon. Friend is making some strong points, and all of us on the SNP Benches support full funding for S4C, but it is specifically worth saying that there is no index-linking of the funding available for MG Alba. In many respects, the situation that Gaelic broadcasting is now facing is even worse than people might consider, because in real terms the funding available for MG Alba will, by 2027, be 50% of what it was in 2008. We are facing an existential threat to the survival of Gaelic broadcasting. We can think about the breadth and depth of the programming. I have programme-making in Skye, including from Chris Young of Young Films, who is known for “The Inbetweeners”. He, for example, produced the excellent “Bannan”. We need to fund such broadcasting appropriately.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I agree. We do not regret or feel angry at the Welsh language programming that is provided and the support for it. As my right hon. Friend said, we are looking for parity, and the index-linking of funding is important. We also need to recognise that the Scottish Government are already providing significant funding for the Gaelic language and to MG Alba, but there is no parity in terms of the licence fee.

I have a few other things I wish to say. Sadly, the Bill says goodbye to teletext; it is the end of teletext as we know it. Teletext has not been in use since 2009, but the Bill finally removes it from legislation.

I also wish to talk about football games and how broadcasting and listing work. Listing is the particular concern. The Secretary of State said that the listing system is being revamped—I am not sure exactly what word she used, but that was the direction she intended. However, the listing system itself—the way in which category A and category B listings are chosen—is not being revamped. No change is being made to that.

My hon. Friend the Member for Paisley and Renfrewshire North (Gavin Newlands) is unwell and unable to take part in today’s debate, but he has done a huge amount of work on trying to ensure that we can access Scottish football games. It is incredibly important that we can see Scottish football games in Scotland. The Broadcasting Act 1996 says:

“’national interest’ includes interest within England, Scotland, Wales or Northern Ireland.”

It does not say, “England, Scotland, Wales and Northern Ireland”; it says “or Northern Ireland”. Given how popular Scotland’s football team is in Scotland, its games should be classed as being of national importance, especially as we have finally made it to the finals of a tournament. That is wonderful and we want to be able to see those games. It is not fair that viewers in Scotland have to pay to see their national team play, whereas viewers everywhere else in the UK do not have to pay for the same privilege. This issue is important. I note the point that the shadow Secretary of State made about the Culture, Media and Sport Committee’s digital rights enabling provision, and I agree that if enabling provisions could be made on digital rights for sports events, that would be an important move.

I have a couple more issues to raise. The first is on-demand services and the inclusion of the 30-day requirement. Unfortunately, the Bill does not make it clear whether that means 30 consecutive days. It is important that the word “consecutive” be added, unless precedent in other legislation suggests that “30 days” means 30 consecutive days. Why is news excluded from that provision? The right hon. Member for Ashford (Damian Green) spoke about the economic and cultural importance of our media, but we must also consider its democratic importance in ensuring that knowledge is spread. I do not understand why the Minister and the Secretary of State have chosen to exclude news from the 30-day requirement on digital provision.

The other thing that could have been clearer is ensuring that some of the provision is accessible. I know that the BBC has worked hard on this, but we are not there yet, as some of the local news that is provided is nearly impossible to find. If I want to watch Aberdeen-specific news, or even Scotland-specific news, it is hard to find it and disentangle it from more national news. Accessibility is required in that regard.

This legislation provides for quite a lot of delegated powers. I have not managed to make my way through all of them, but the draft affirmative procedure often strikes the right balance, and it is important that it be used for a significant number of the delegated powers in this Bill.

I am pleased that we have the Bill. I am concerned about the lack of futureproofing in some of it and about the overcomplication, as some of the definitions are difficult to follow and therefore may not achieve what the Government intend. The cultural sector is incredibly important to the entirety of the UK. It is incredibly important in Scotland, and we certainly will not oppose the Media Bill as it goes forward.

Rosie Winterton Portrait Madam Deputy Speaker (Dame Rosie Winterton)
- Hansard - - - Excerpts

I call the Chair of the Culture, Media and Sport Committee.

--- Later in debate ---
Jamie Stone Portrait Jamie Stone
- Hansard - - - Excerpts

Yes, the hon. Gentleman is absolutely correct.

Furthermore, as we know, local radio—and, as was expressed by the right hon. and learned Member for South Swindon (Sir Robert Buckland), who is no longer with us, the same is true of local television—is absolutely fundamental to the proper functioning of local democracy. I know this only too well, and in some ways I regret it. Let me give Members, for their lighter amusement, a cautionary tale. When I was first elected to be a member of Ross and Cromarty District Council a long time ago—I was once upon a time the youngest member of the council—my younger brother was a broadcaster on Moray Firth Radio, our local radio station, which is still alive and well today. He thought it would be kind to me to put me on his chat show on a Saturday morning called “The Chipboard Table” just days after I was first elected. He sat me down—this was live—and he said, “Jamie, last night we had a dram together, and you told me that you felt your fellow councillors were quite creative in the way they completed their expenses.” This led to an indifferent start to a career in local government, but that is one of the scars I bear. Luckily, it was a long time ago. For accountability and throwing a light on local democracy, local radio is absolutely crucial, and notwithstanding my experience, I would not have it any other way.

On the issue of quotas, the removal of Ofcom’s responsibility to monitor the delivery of content in education, science and culture may risk content in those areas declining. That would concern me because, as was eloquently expressed by the right hon. Member for Ashford (Damian Green), the soft power this country exerts is about being British, but it is also about reflecting the different facets of our nation that English-speaking countries find absolutely fascinating. As the Bill progresses, I will be looking to ensure that Ofcom retains a statutory requirement to measure the output of each of these genres—language, culture or whatever—against, let us say for now, the benchmark of what we have at the moment. I do not wish to see any decline from that whatsoever.

On accessibility, when it comes to linear television, there is a requirement for 90% of programmes to be provided with subtitles, as we know. It is right that there should be greater access to those things. Let me give the House another personal example. On a Sunday evening, a cousin of mine who is a little older than me comes and has a meal with my wife and me, and she watches the television. She is a great friend and much loved. She is also pretty deaf, and for some television programmes we can get the subtitles up, but for others we cannot. Perhaps I am not very intelligent with IT, but by gosh we have tried, and it is hugely frustrating that she cannot see the words that are being said. The same applies to people with visual impairment—we are talking about signing and other ways of helping. The Liberal Democrat party will look to require that at least 80% of on-demand TV content be subtitled, with 10% audio described and 5% signed. That is our position at this stage.

While I find it tricky to find the subtitles, another issue is also tricky to find. One of the most important aspects of the Bill is the call for public service broadcaster prominence, ensuring that the likes of the BBC, Channel 4 and ITV are not only easy to find on any smart TV but are also given due prominence. This is the existential issue for our public service broadcasters, and the question of how appropriate prominence will be defined is vital. The Liberal Democrats would like the current call for “appropriate” prominence to be strengthened to “significant” prominence, and I believe we will be tabling amendments to see whether we can achieve that.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

The hon. Member is talking about a range of different issues, which highlight the fact that there are a lot of disparate concerns about the Bill. Does he share my concern that the draft programme motion does not include taking oral evidence for the Bill, and does he understand why the Government have done that?

Jamie Stone Portrait Jamie Stone
- Hansard - - - Excerpts

I believe that is a wise point, and we would be wise to heed it.

When it comes to Channel 4, I believe I am not alone in having concerns about plans to relax the publisher-broadcaster status, and about the potential risk that that poses to the unique contribution that the channel makes to the diversity and sustainability of the independent production sector across the nations and regions. Again, that takes me back to my earlier point about the sheer diversity of the product being part of our soft power, which is important to this country. However, there is a caveat. With the increased independent production quota and Channel 4’s prediction that any changes will take at least five years to launch, that fundamental change might not lead to any market shock in the short term. But the proof of the pudding is in the eating, and we shall see.

Finally, let me turn to what is perhaps a core debating point today. Section 40 of the Crime and Courts Act 2013 requires news outlets to pay the costs—we know what that is all about. The Liberal Democrats stand firmly against that change. The 2013 Act followed the Leveson inquiry and the phone hacking scandal, and the proposed repeal will put at risk the balance between free speech and public safeguarding, all the while favouring news publishers. One could say that that is a standard political stance in this debate, and perhaps Conservative Members would take a different view. However, let us consider one final point, which is important in terms of the notion of British justice. This change would mean that anyone without substantial financial resources or deep pockets to match the might of the newspapers would find it impossible to pursue legitimate grievances through the legal system. We need to think about that very deeply. What can the small man possibly do against the publishing giants? That is hugely important, and I think there is a warning here. With that I will conclude my remarks. I sincerely hope that my career in this place will not include any more gaffes on live radio, but you can never tell, Madam Deputy Speaker, least of all from a highland Member of Parliament.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I did not even think about the TV schedule as something that people look at. I never look at a TV schedule. I do not know if my Fire Stick or my PlayStation has a TV schedule. On significant prominence, I was picturing the BBC iPlayer app being at the top of the apps list. Does the hon. Gentleman agree that Ofcom should look at both those things: how it appears on the screen and where the public service broadcasters are in any live schedule?

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

The hon. Lady makes an important point. It should be easier to find through app stores. Although they are not directly in scope of the legislation because they are not broadcast formats in their own right, that question should be asked—is it easy to find? It should be easy to find on a connected device when it is turned on, and it should be easy to locate the apps.

Ofcom also has to consider whether the business model that underpins connected devices is fair to public service broadcasters. There is no doubt that the business model for Amazon and Google is to try to create a connected device space where all the entertainment exists and is tailored to each person. They also want to build the ad tech into that so that they are the principal beneficiaries of the ad revenue, monetising the placement of that content and diverting it away from broadcasters, who have traditionally sold audiences to make money. That is the underlying problem that public service broadcasting faces today. The sale of audiences to generate advertising revenue to invest in programmes—the model that has fuelled independent public broadcasting for 50 years—is not broken, but it does not work in the way it used to; it is much more diffuse.

The revenue challenges that come from that are extremely real. That is why, on Channel 4, although I am pleased to see the Government’s changes to the remit, we need to keep a watching brief to see whether they go far enough. We have not gone as far as Channel 4 asked to go in its counter-offer to privatisation, which was the ability to go to the markets to raise money from private investors to create a programming fund that would invest £1 billion over two years in new programming. If we simply allow Channel 4 to acquire a stake in the making of programmes that it will broadcast, which will make revenue in the future, will that be enough now to meet the challenges that it will face? Given the ongoing pressures this year on declining ad revenue for TV broadcasting, we need to make sure that that will be enough. We should not assume that the measures in the Bill, which are welcome, will be the last word on that. There may be more challenges to come.

I would like to add two further points. It is right that we try to create more parity between the regulation of on-demand online services and broadcast television. If a viewer turns on their connected TV device, as far as they are concerned Netflix is as much television as the BBC, and there should be some parity in the way the platforms are regulated, the obligations they have to their users and the notifications they give about the suitability of the content. That should apply to advertising too. Often the debate we have is around advertising that targets children, but children are not watching live television; they are watching it on demand. The danger at the moment is that we have a highly regulated live broadcast television environment, but an almost completely unregulated online one. We should be far more worried about the ad rules that apply on YouTube than those on ITV, because that is where the children are. It is vital that the work on the Government’s online advertising review is completed at pace. The project has been worked on for a number of years. There needs to be proper enforceability of the advertising codes that have stood us in good stead in the broadcast world, but do not yet work in the same way online.

Finally, on media ownership and media freedom, which the Secretary of State mentioned in her opening remarks, we should give some consideration—maybe the Bill is not the right place—to the ownership of UK news companies and news assets, particularly if they are acquired by organisations based in jurisdictions overseas where maybe the regard for press freedom is not the same as it is in the UK. The Bill does not address that concern. If we have an ongoing concern about a vibrant news media landscape, there should be some concern about the companies that own media organisations—where they are based, what their interests are and what interest they have in the way the news is reported here. We do not want to see the press regulated in any way—we want to avoid that and in many ways the measures in the Bill are a nod to that as well—but we want certainty about safeguarding media freedom in the future.

--- Later in debate ---
Andy Carter Portrait Andy Carter (Warrington South) (Con)
- View Speech - Hansard - - - Excerpts

It is a pleasure to follow my hon. Friend the Member for Aylesbury (Rob Butler). I fear that I may repeat much of what he just said. I am pleased to be speaking in this debate, a week on from the King’s Speech debate in which I spent quite a bit of time calling on the Government to get on and introduce the Media Bill. For once, they listened to me—that’s nice.

The Media Bill we are debating is the first piece of media legislation for 20 years. The media landscape has changed beyond belief in the last two decades—it is vastly different from the world we lived in 20 years ago—so the Bill is vital to supporting broadcasters and audiences in the modern age. As the media landscape has changed, it is important that we support legislation without delay to give certainty to this important sector. We should recognise that the Bill will probably govern the media landscape for the next 20 years, so it must be forward-thinking, outward-looking and open, just as the previous legislation was.

I declare my interest as chair of the all-party parliamentary media group and the all-party parliamentary group on commercial radio. Let me start by saying that I welcome the Bill, which responds well to the needs of the sector. Because of time limitations, I will focus my remarks on three specific areas of the Bill. I will do something that I rarely do, and put television ahead of radio.

I welcome the Government’s commitment to simplifying the existing remit for public service broadcasters. PSBs are what make our television landscape renowned around the world, but they face unprecedented competition for viewers, programming content and talent in an era when global streaming services such as Netflix and Amazon Prime are producing original content and becoming increasingly dominant in the market. It is good that we have more content producers, but even better, they are choosing to make content here in the UK because of our regulatory framework.

TV prominence is about ensuring that UK viewers can easily find public service content that they value. We are living in an increasingly global marketplace, but there is still an appetite for programmes that reflect British values. In fact, around seven in 10 UK adults want UK life and culture to be represented on screen, and a similar number agree that PSBs make programmes designed for UK audiences. Why is it important that we introduce legislation to protect PSBs? Surely, viewers will want to watch the programmes that they make.

Until now, in return for providing public service content, the Government, through Ofcom, have allocated frequencies to broadcasters. In a relatively uncomplicated world, those channels have been easy to find on electronic programme guides: ITV, and STV in Scotland, on channel 3; Channel 4 and Channel 5 on their respective channels. Once someone has tuned in their TV to the nearest transmitter, they press the number on their remote control and the channel is there.

In a future world where the internet is used to deliver linear TV and video on demand, the tech companies and platforms will decide where products and programmes appear. In fact, at the moment, if Samsung or LG decided not to include the BBC iPlayer app on their TV screens, there would be nothing the BBC, UK viewers or the Government could do about it. If Amazon decided to double the charge for Channel 4’s on-demand service to appear on its Fire Stick, there would be little Channel 4 could do about it. From speaking to Channel 4, I know that when Amazon moved the location of the Channel 4 app on the Fire Stick, there was a significant alteration in the viewing of Channel 4. It matters where the apps are located on the relevant platform.

If we want to make sure that British viewers can easily find BBC, ITV, Channel 4 and Channel 5, and STV in Scotland and S4C in Wales, we need to agree the framework that will ensure that platforms carry those services. I fully support that. I also urge the Government to look carefully at using the word “significant” rather than “appropriate”. That will determine where the channels are found on those platforms.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I wholeheartedly agree that it is not just about the schedule. As I said earlier, I was not aware that we had a schedule. We do not use Freeview; we open the Fire Stick or PlayStation and look at the apps. The prominence of the apps is important. If someone does not have terrestrial TV or an aerial hooked up, that is the only way that they are able to consume the public service broadcast content.

Andy Carter Portrait Andy Carter
- Hansard - - - Excerpts

There may be an age divide that determines whether someone looks at an electronic programme guide or the Radio Times, or whether they just look for a tile. The notion that viewers want to continue to use linear TV is important. That is why it is so critical that we legislate in the right way to make sure that British viewers can find it.

The changes in the Bill will impact Channel 4 more than any other PSB, given its unique publisher-broadcaster licence. Channel 4’s status, introduced by the Conservative Government back in the 1980s, has significantly aided the development of the independent production sector in the UK over the last 40 years; that sector is now worth nearly £4 billion. The removal of the publisher-broadcaster restrictions will allow Channel 4 to produce its own content, as opposed to simply commissioning or acquiring all of its content from third parties. Why does that matter? For the first time, when Channel 4 produces content, it will own the rights to that content, which it can then sell around the world, creating another stream of revenue that will allow products and programmes on Channel 4 to be funded.

The Government have announced plans to increase Channel 4’s independent production quota as part of the changes. However, there are many small production companies in areas such as the north-west of England—which has seen rapid growth in independent production businesses—that are still unsure about the full impact the changes will have on them. Will the Minister, in his response, expand a little more on what the changes will mean for those businesses, and give some assurance that they will still be able to thrive once Channel 4 receives its new licence and the Bill receives Royal Assent?

Channel 4 has indicated that it will maintain its existing commitment to spend 50% of its budget for main channel commissions outside London. That is really important to regional production. Ofcom has announced that it will be consulting on whether changes need to be made to Channel 4’s regional programme-making quotas. Is the Minister able to provide a timeline for that consultation, so that we know when any changes will come into effect?

I want to touch on local TV and echo some of the comments from other hon. and right hon. Members. I have received representations from the local TV networks, which are concerned that the current Bill neither guarantees local TV services prominence in the new TV ecology nor grants them powers on a par with those of local radio services. At some point, the sector will start to provide streamed linear programme services. Will the Government give consideration to including local TV as part of the licensed public service channel designation in the Bill, to help ensure sustainability for the sector? It really is important that the sector has certainty going forward, because it is making decisions today about the future of its business plans.

Finally on TV, if we are looking to the next 20 years—because this is the only media Bill we are likely to see for some time—we should be conscious that the previous broadcasting legislation ran for 20 years. On the Government’s management of a digital terrestrial television switchover, I have been reassured in my conversations with the Minister that he wants terrestrial television to remain accessible for the foreseeable future. I very much agree with him on that. When he is summing up, could he give an indication of the criteria he might want to set before broadcast TV services on Freeview are considered for switch-off? Clear criteria were in place for DAB digital radio on when that might happen. Things have moved many, many times over the years, but it would be helpful for the digital terrestrial sector to understand what the Government might be thinking.

Before I turn to the provisions on radio, may I put on record my congratulations to all those who have worked in commercial radio over the past 50 years? Independent local radio, as we once knew it, celebrated its 50th anniversary just a few weeks ago. It was 50 years ago in October that LBC and Capital Radio arrived on our airwaves in the capital, that Radio Clyde launched in Glasgow and that BRMB launched in Birmingham. They were the four stations that appeared on our AM radios in 1973. Over the 50 years since, we have seen a plethora of local, regional and national stations arrive on AM, FM, DAB and now online via Radioplayer and smart speakers. Today, commercial radio is delivering record audiences. Back in the early 1980s, we were all convinced that video was going to kill the radio star; actually, radio is in rude health. We have regional brands, national stations and hyperlocal services focused on their own towns and cities that are doing remarkably well. We should all recognise in this House how strong commercial radio is today and how much we value the services that people who work in that sector provide for us.

There is unanimous agreement across the BBC, and across commercial and community radio, that the Bill, on the whole, works for radio. It contains crucial measures that will help to safeguard the future in the face of changing technology and shifts in listening habits. The radio sector continues to deliver significant public value, providing trusted news, entertainment and—particularly important—companionship for about 50 million listeners every week. UK radio broadcasters make a substantial contribution to the creative industries, and BBC and commercial radio combined generate more than £1.5 billion in gross value added for the UK economy.

I especially welcome the provisions to support the future of the UK radio industry on voice-activated smart speaker platforms, and the removal of outdated regulatory burdens such as music formats on analogue licences for commercial radio stations. When there was a limited number of stations in each market, it was right for the Government to regulate the number of stations that could provide each particular type of service, but today, when there are a great many services, it should be for the market to decide. If country music is not working, it is possible to switch to jazz without spending too much time bothering the regulator.

There are, however, a few parts of the Bill that I should like the Minister to clarify for the industry. Part 5 deals with the safeguarding of local news and information on DAB services, and it would be helpful if the Minister could explain how those powers will work in practice. For instance, how would a multiplex decide which services must carry local news? Would the multiplex owner be responsible for the enforcement against a digital sound service provider, or would that be the responsibility of Ofcom? What would happen if a service carrying local news stopped broadcasting? Would the obligation be transferred to another service holder, or to the multiplex owner? As for Ofcom’s new role in producing guidelines for the regular broadcast of local news, can the Minister tell us when and how Ofcom will be consulting on that process?

Part 6 contains clauses relating to futureproofing. Will there be scope for expansion of the provisions to cover on-demand and online-only radio content provided by UK broadcasters, as opposed to linear content? Finally, may I ask whether the Government will consider an amendment to protect access to radio in cars, which still accounts for about a quarter of all radio listening, by bringing non-voice activated infotainment systems within the scope of the Bill?

I want to touch briefly on the proposals

“for the repeal of section 40 of the Crime and Courts Act 2013”,

a decade-old provision that has never been brought into force. While I appreciated the opportunity to observe the perspective of my right hon. Friend the Member for Camborne and Redruth (George Eustice), whose knowledgeable account of the forming of that legislation was extremely insightful, I am afraid I disagree with the points that he made. It does not seem right to me that publishers who are taken to court could be forced to pay the legal costs of a judgment if they are not a member of an approved regulator, regardless of whether they win or lose the case. I am a firm believer in the freedom of the press. I have spent time working as a journalist, and there have been times when journalists have written about my activities. There are, occasionally, times when I do not like what the press have written, and there are, occasionally, times when I believe that the press have got it wrong. Healthy democracies, however, need objective journalism which is free from state involvement.

The reason I do not agree with my right hon. Friend is this. The Leveson report recommended a system of

“voluntary independent self-regulation”,

envisaging

“a body, established and organised by the industry”

which

“must be funded by its members”.

Lord Justice Leveson said that that body should include all the major players in the industry—national newspapers, and as many regional and local newspaper and magazine publishers as possible—

“although I am very anxious that it remain voluntary”.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Well, I would say to the hon. Gentleman that clause 1 makes clear that there should be a significant quantity of

“audiovisual content that is in, or mainly in, a recognised regional or minority language”.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Just to correct the Minister, it does not say “significant quantity”; it says “sufficient quantity”, but there is no definition of “sufficient”. We are concerned about the fact that that word has not been defined. We want a reasonable amount of Gaelic content to be available.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I apologise to the hon. Lady. She is absolutely right: it does say a

“sufficient quantity of audiovisual content”.

That will be a matter for Ofcom to rule on. MG Alba already gets support—

Oral Answers to Questions

Kirsty Blackman Excerpts
Thursday 16th November 2023

Commons Chamber
Lindsay Hoyle Portrait Mr Speaker
- Hansard - - - Excerpts

I call the SNP spokesperson.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- View Speech - Hansard - -

The post-Brexit tightening of immigration rules and the Brexit-caused cost of living crisis are having a disproportionate impact on the creative sector, as the UK Government continue to squeeze public services. In advance of the autumn statement next week, what representations are the Secretary of State and the Department making to the Chancellor to ensure that the creative sector is adequately funded and protected, so that Scotland can receive the Barnett consequentials from that in order to continue to support our wonderful and, as the Minister says, world-leading creative industry?

John Whittingdale Portrait Sir John Whittingdale
- View Speech - Hansard - - - Excerpts

The Chancellor has been very generous to the creative industries, and I hope that he will continue to be so. However, I would point out to the hon. Lady that Creative Scotland benefits from a grant-in-aid budget of around £63 million, and I would have thought that she might welcome the fact that in the last March Budget the UK Government announced £8.6 million in support for two of Edinburgh’s world-leading festivals.

Rosie Winterton Portrait Madam Deputy Speaker (Dame Rosie Winterton)
- Hansard - - - Excerpts

I call the SNP spokesperson, Kirsty Blackman.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- View Speech - Hansard - -

I congratulate the hon. Member for Gosport (Dame Caroline Dinenage) on what was one of the best speeches on this Bill—and we have heard quite a lot of them. It was excellent and very thoughtful. I will speak to a number of amendments. I will not cover the Labour amendments in any detail because, as ever, the Labour Front Benchers did an excellent job of that. The right hon. Member for Barking (Dame Margaret Hodge) nicely covered the amendment on liability, and brought up the issue of hate, particularly hate directed towards the Jewish community. I thank her for consistently bringing that up. It is important to hear her voice and those of others on this issue.

Amendment 43 was tabled by me and my hon. Friend the Member for Ochil and South Perthshire (John Nicolson), and it concerns a default toggle for material that we all agree is unsafe or harmful. The Labour party has said that it agrees with the amendment, and the SNP believes that the safest option should be the default option. We should start from the point of view that if anybody wants to see eating disorder content, or racist or incredibly harmful content that does not meet the bar of illegality, they should have to opt in to receive it. They should not see it by default; they should have to make that choice to see such content.

Freedom of speech is written into the Bill. People can say whatever they want as long as it is below that bar of illegality, but we should not have to read it. We should not have to read abuse that is pointed toward minority groups. We should start from the position of having the safest option on. We are trying to improve the permissive approach that the Government have arrived at, and this simple change is not controversial. It would require users to flip a switch if they want to opt in to some of the worst and most dangerous content available online, including pro-suicide, pro-anorexia or pro-bulimia content, rather than leaving that switch on by default.

If the Government want the terms and conditions to be the place where things are excluded or included, I think platforms should have to say, “We are happy to have pro-bulimia or pro-anorexia content.” They should have to make that clear and explicit in their terms of service, rather than having to say, “We do not allow x, y and z.” They should have to be clear, up front and honest with people, because then people would know what they are signing up to when they sign up to a website.

Amendment 44 is on habit-forming features; we have not spoken enough about the habit-forming nature of social media in particular. Sites such as TikTok, Instagram and Facebook are set up to encourage people to spend as much time on them as possible—that is how they make money, and that is the intention behind them. We know that 42% of respondents to a survey by YoungMinds reported displaying signs of addiction-like behaviour when questioned about their social media habits. Young people are worried about that, and they do not necessarily have the tools to avoid it. We therefore tabled amendment 44 to take that into account, and to require platforms to consider that important issue.

New clause 3, on child user empowerment, was mentioned earlier. There is a bizarre loophole in the Bill requiring user empowerment toggles for adults but not for children. It is really odd not to require them for children when we know that they will be able to see some of this content and access features that are much more inherently dangerous to them than to adults. That is why we tabled amendments on private messaging features and live streaming features.

Live streaming is a place where self-generated child sexual abuse has shot through the roof. With child user empowerment, children would have to opt in, and they would have empowerment tools to allow them opportunities to say, “No, I don’t want to be involved in live streaming,” or to allow their parents to say, “No, I don’t want my child to be able to do live streaming when they sign up to Instagram. I don’t want them able to share live photos and to speak to people they don’t know.” Amendment 46, on private messaging features, would allow children to say, “No, I don’t want to get any private messages from anyone I don’t know.” That is not written into terms of service or in the Bill as a potentially harmful thing, but children should be able to exclude themselves from having such conversations.

We have been talking about the relationship between real life and the online world. If a child is playing in a play park and some stranger comes up and talks to them, the child is perfectly within their rights to say, “No, I’m not speaking to strangers. My parents have told me that, and it is a good idea not to speak to strangers,” but they cannot do that in the online world. We are asking for that to be taken into account and for platforms to allow private messaging and live streaming features to be switched off for certain groups of people. If they were switched off for children under 13, that would make Roblox, for example, a far safer place than it currently is.

I turn to amendment 84, on conversion therapy. I am glad that the amendment was tabled and that the UK Government are moving to bring forward the conversion therapy ban. As far as I am aware—I have been in the Chamber all day—we have not yet seen that legislation, but I am told that it will be coming. I pay tribute to all those who have worked really hard to get us to the position where the Government have agreed to bring forward a Bill. They are to be commended on that. I am sorry that it has taken this long, but I am glad that we are in that position. The amendment was massively helpful in that regard.

Lastly, I turn to amendment 50, on the risk of harm. One of the biggest remaining issues with the Bill is the categorisation of platforms, which is done on the basis of their size and the risk of their features. The size of the platform—the number of users on it—is the key factor, but that fails to take into account very small but incredibly harmful platforms. The amendment would give Ofcom the power to categorise platforms that are incredibly harmful—incel forums, for example, and Kiwi Farms, which was set up entirely to dox trans people and put their lives at risk—as category 1 platforms, and to require them to meet all the rules and risk assessments that apply to those platforms.

We should be asking those platforms to answer for what they are doing, no matter how few members they have or how small their user base. One person being radicalised on such a platform is one person too many. Amendment 50 is not an extreme amendment saying that we should ban all those platforms, although we probably should. It would ask Ofcom to have a higher bar for them and require them to do more.

I cannot believe that we are here again and that the Bill has taken so long to get to this point. I agree that the Bill is far from perfect, but it is better than nothing. The SNP will therefore not be voting against its Third Reading, because it is marginally better than the situation that we have right now.

Jeremy Wright Portrait Sir Jeremy Wright (Kenilworth and Southam) (Con)
- View Speech - Hansard - - - Excerpts

I want to say in passing that I support amendments 52 and 53, which stand in the name of my hon. Friend the Member for Stroud (Siobhan Baillie) and others. She will explain them fully so I do not need to, but they seem to be sensible clarifications that I hope the Government will consider favourably.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

On that specific point, does the hon. Lady realise that the empowerment duties in respect of verified and non-verified users apply only to adult users? Children will not have the option to toggle off unverified users, because the user empowerment duties do not allow that to happen.

Siobhan Baillie Portrait Siobhan Baillie
- Hansard - - - Excerpts

The evidence we have received is that it is parents who need the powers. I want to normalise the ability to turn off anonymised accounts. I think we will see children do that very naturally. We should also try to persuade their parents to take those stances and to have those conversations in the home. I obviously need to take up the matter with the hon. Lady and think carefully about it as matters proceed through the other place.

We know that parents are very scared about what their children see online. I welcome what the Minister is trying to do with the Bill and I welcome the legislation and the openness to change it. These days, we are all called rebels whenever we do anything to improve legislation, but the reality is that that is our job. We are sending this legislation to the other House in a better shape.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- View Speech - Hansard - -

It has taken a while to get to this point; there have been hours and hours of scrutiny and so much time has been spent by campaigners and external organisations. I have received more correspondence on this Bill from people who really know what they are talking about than on any other I have worked on during my time in the House. I specifically thank the NSPCC and the Mental Health Foundation, which have provided me with a lot of information and advice about the amendments that we have tabled.

The internet is wonderful, exciting and incredibly useful, but it is also harmful, damaging and scary. The Bill is about trying to make it less harmful, damaging and scary while allowing people to still experience the wonderful, exciting and useful parts of it. The SNP will not vote against the Bill on Third Reading, but it would be remiss of me not to mention the issues that we still have with it.

I am concerned that the Government keep saying “the Children’s Commissioner” when there are a number of Children’s Commissioners, and it is the Children’s Commissioner for England who has been added as a consultee, not the others. That is the decision that they have made, but they need to be clear about it when they talk about it.

On protecting children, I am still concerned that the Bill is a little too social media-centric and does not necessarily take into account some of the ways that children actually interact with the internet, such as talking to their friends on Fortnite, talking to people they do not know on Fortnite, and talking to people on Roblox. Interactions that do not look like social media, and things that work differently, are not covered as well as I would like. I am also concerned that children have less ability to opt out of risky features—to switch off private messaging and livestreaming, for example—than they have simply to switch off types of content.

Lastly, on the changes that have been made, I do not know what people want to say that they felt they could not say under the previous version of the Bill. I do not know why the Government feel it is okay to say, in effect, “We are concerned about ‘legal but harmful’ because we want people to be able to promote eating disorder content or self-harm content.” I am sure they do not—I am sure no Minister wants people to be able to promote that—so why have they made this change? Not one person has been able to tell me what they believe they could not have said under the previous iteration of the Bill. Ministers can say “free speech” as much as they like, but it does not mean anything if they cannot provide examples of what exactly they believe somebody should be able to say that they could not have said under the previous iteration of the Bill.

I am glad that we have a Bill and I am glad to hear that a future Labour Government might change the legislation to make it better. I hope this will provide a safer environment for children online, and I hope we can get the Bill implemented as soon as it is through the Lords.

ONLINE SAFETY BILL (Third sitting)

Kirsty Blackman Excerpts
Committee stage (re-committed clauses and schedules)
Thursday 15th December 2022

Public Bill Committees

Amendment Paper: Public Bill Committee Amendments as at 15 December 2022
Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - - - Excerpts

It will come as no surprise to Members to hear that we have serious concerns about the system of categorisation and the threshold conditions for platforms and service providers, given our long-standing view that the approach taken is far too inflexible.

In previous sittings, we raised the concern that the Government have not provided enough clarity about what will happen if a service is required to shift from one category to another, and how long that will take. We remain unclear about that, about how shifting categories will work in practice, and about how long Ofcom will have to preside over such changes and decisions.

I have been following this Bill closely for just over a year, and I recognise that the online space is constantly changing and evolving. New technologies are popping up that will make this categorisation process even more difficult. The Government must know that their approach does not capture smaller, high-harm platforms, which we know—we have debated this several times—can be at the root of some of the most dangerous and harmful content out there. Will the Minister clarify whether the Government amendments will allow Ofcom to consider adding such small, high-harm platforms to category 1, given the risk of harm?

More broadly, we are pleased that the Government tabled new clause 7, which will require Ofcom to prepare and update a list of regulated user-to-user services that have 75% of the number of users of a category 1 service, and at least one functionality of a category 1 service or one required combination of a functionality and another characteristic or factor of a category 1 service. It is absolutely vital that Ofcom, as the regulator, is sufficiently prepared, and that there is monitoring of regulated user-to-user services, so that this regime is as flexible as possible and able to cope with the rapid changes in the online space. That is why the Opposition support new clause 7 and have not sought to amend it. We also support Government amendments 48 and 49, which are technical amendments to ensure that new clause 7 references user-to-user services and assessments of those services appropriately. I want to press the Minister on how he thinks these categories will work, and on Ofcom’s role in that.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - -

I agree with everything that the hon. Lady said. New clause 7 is important. It was missing from the earlier iterations of the Bill, and it makes sense to have it here, but it raises further concerns about the number of people who are required to use a service before it is classed as category 1. We will come later to our amendment 104 to schedule 11, which is about adding high-risk platforms to the categorisation.

I am still concerned that the numbers are a pretty blunt instrument for categorising something as category 1. The number may end up being particularly high. I think it would be very easy for the number to be wrong—for it to be too high or too low, and probably too high rather than too low.

If Twitter were to disappear, which, given the changing nature of the online world, is not outside the realms of possibility, we could see a significant number of other platforms picking up the slack. A lot of them might have fewer users, but the same level of risk as platforms such as Twitter and Facebook. I am still concerned that choosing a number is a very difficult thing to get right, and I am not totally convinced that the Government’s way of going about this is right.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Ofcom will assess services that are close to meeting the threshold conditions of category 1 services and will publish a publicly available list of those emerging high-risk services. A service would have to meet two conditions to be added to the emerging services list: it would need at least 75% of the number of users specified in any category 1 threshold condition, and at least one functionality of a category 1 threshold condition, or one specified combination of a functionality and a characteristic or factor of a category 1 threshold condition.

Ofcom will monitor the emergence of new services. If it becomes apparent that a service has grown sufficiently to meet the threshold of becoming a category 1 service, Ofcom will be required to add that service to the register. The new clause and the consequential amendments take into account the possibility of quick growth.

Following the removal of “legal but harmful” duties, category 1 services will be subject to new transparency, accountability and free speech duties, as well as duties relating to protection for journalists and democratic content. Requiring all companies to comply with that full range of category 1 duties would pose a disproportionate regulatory burden on smaller companies that do not exert the same influence on public discourse, and that would possibly divert those companies’ resources away from tackling vital tasks.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

My understanding is that only a very small number of platforms will reach the category 1 threshold. We are talking about the platforms that everybody has heard of—Facebook, Twitter and so on—and not about the slightly smaller platforms that lots of people have heard of and use. We are probably not talking about platforms such as Twitch, which has a much smaller user base than Facebook and Twitter but has a massive reach. My concern continues to be that the number threshold does not take into account the significant risks of harm from some of those platforms.

I have a specific question about amendment 76. I agree with my Labour Front-Bench colleague, the hon. Member for Pontypridd, that it shows that the Government are willing to take into account other factors. However, I am concerned that the Secretary of State is somehow being seen as the arbiter of knowledge—the person who is best placed to make the decisions—when much more flexibility could have been given to Ofcom instead. From all the evidence I have heard and all the people I have spoken to, Ofcom seems much more expert in dealing with what is happening today than any Secretary of State could ever hope to be. There is no suggestion about how the Secretary of State will consult, get information and make decisions on how to change the threshold conditions.

It is important that other characteristics that may not relate to functionalities are included if we discover that there is an issue with them. For example, I have mentioned livestreaming on a number of occasions in Committee, and we know that livestreaming is inherently incredibly risky. The Secretary of State could designate livestreaming as a high-risk functionality, and it could be included, for example, in category 1. I do not know whether it will be, but we know that there are risks there. How will the Secretary of State get that information?

There is no agreement to set up a user advocacy board. The requirement for Ofcom to consult the Children’s Commissioner will be brought in later, but organisations such as the National Society for the Prevention of Cruelty to Children, which deals with phone calls from children asking for help, are most aware of emerging threats. My concern is that the Secretary of State cannot possibly be close enough to the issue to make decisions, unless they are required to consult and listen to organisations that are at the coal face and that regularly support people. I shall go into more detail about high-harm platforms when we come to amendment 104.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The amendments give the Secretary of State the flexibility to consider other characteristics of services as well as other relevant factors, which include functionalities, user base, business model, governance, and other systems and processes. They effectively introduce greater flexibility into the designation process, so that category 1 services are designated only if they have significant influence over public discourse. Although the Secretary of State will make the regulations, Ofcom will carry out the objective and evidence-based process, which will be subject to parliamentary scrutiny via statutory instruments. The Secretary of State will have due consultation with Ofcom at every stage, but to ensure flexibility and the ability to move fast, it is important that the Secretary of State has those powers.

Amendment 76 agreed to.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I beg to move amendment 104, in schedule 11, page 213, line 11, at end insert—

“(1A) Regulations made under sub-paragraph (1) must provide for any regulated user-to-user service which OFCOM assesses as posing a very high risk of harm to be included within Category 1, regardless of the number of users.”

This amendment allows Ofcom to impose Category 1 duties on user-to-user services which pose a very high risk of harm.

I would say this, but I think that this is the most important amendment. The key area that the Government are getting wrong is the way in which platforms, providers or services will be categorised. The threshold is based on the number of users. It is the number of users “and” one of those other things, not the number of users “or” one of those other things; even that would make a significant difference.

The Secretary of State talked about the places that have a significant influence over public discourse. It is perfectly possible to have a significant influence over public discourse with a small number of users, or with a number of users that does not number into the millions. We have seen the spread of conspiracy theories that have originated and been perpetuated on very small platforms—very small, shady places on the internet that none of us has experienced or even heard of. Those are the places that have a massive impact and effect.

We know that one person can have a significant impact on the world and on people’s lives. We have heard about the physical harm that people can be incited to cause by the platforms they access, and the radicalisation and extremism they find themselves subject to. That can cause massive, damaging effects to anybody they choose to take physical action against, and to some of the most marginalised communities and groups in society. We are seeing an increase in the amount of hate crime and the number of people who believe conspiracy theories, and not all of that is because of the spread of those things on Facebook and Twitter. It is because of the breadcrumbing and the spread that there can be on smaller platforms.

The most extreme views do not necessarily tip over into “illegal” or “incitement”; they do not actually say, “Please go out and kill everybody in this particular group.” They say, “This particular group is responsible for all of the ills you feel and for every negative thing that is happening in your life”, and people are therefore driven to take extremist, terrorist action. That is a significant issue.

I want to talk about a couple of platforms. Kiwi Farms, which is no longer in existence and has been taken down, was a very small platform that dramatically damaged the lives of trans people in particular. It was a platform where people went to incite hatred and give out the addresses of folk who they knew were members of the trans community. Some of those people had to move to another continent to get away from the physical violence and attacks they faced as a result of the behaviour on that incredibly small platform, which very few people will have heard about.

Kiwi Farms has been taken down because the internet service providers decided that it was too extreme and they could not possibly host it any more. That was eventually recognised and change was made, but the influence that that small place had on lives—the difficulties and harm it caused—is untold. Some of that did tip over into illegality, but some did not.

I also want to talk about the places where there is a significant amount of pornography. I am not going to say that I have a problem with pornography online; the internet will always have pornography on it. It attracts a chunk of people to spend time online, and some of that pornography is on large mainstream sites. Searches for incest, underage girls, or black women being abused all get massive numbers of hits. There is a significant amount of pornography on these sites that is illegal, that pretends to be illegal or that acts against people with protected characteristics. Research has found that a significant proportion—significantly more than half—of pornography on mainstream sites that involves black women also involves violence. That is completely and totally unacceptable, and it has a massive negative impact on society, reinforcing negativity and discrimination against groups that are already struggling with being discriminated against and that do not experience the privilege of a cis white man.

It is really grim that we are requiring a number of users to be specified, when we know the harm caused by platforms that do not have 10 million or 20 million United Kingdom users. I do not know what the threshold will be, but I know it will be too high to include a lot of platforms that have a massive effect. The amendment is designed specifically to give Ofcom the power to designate as category 1 any service that it thinks has a very high risk of harm; I have not set the bar particularly low. Now that the Minister has increased the levels of transparency that will be required of category 1 platforms, it is even more important that we subject extremist sites and platforms—the radicalising ones, which are perpetuating discrimination—to a higher bar and require them to have the transparency that they need as a category 1 service. This is a place where the Bill could really make a difference and change lives, and I am really concerned that it is massively failing to do so.

The reason I have said that it should be Ofcom’s responsibility to designate category 1 services is that Ofcom has the experts, who will be looking at all the risk assessments, dealing with companies on a day-to-day basis, and seeing the harms and the transparency information that the rest of us will not be able to see. The reporting mechanisms will be public for only some of the category 1 platforms, and we will not be able to find out the level of information that Ofcom has, so it is right that Ofcom should be responsible for designating sites as having a very high risk of harm. That is why I tabled the amendment, which would make a massive difference to the people who are already the most discriminated against and the most at risk of harm from extremism. I urge the Minister to think again.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I rise briefly to support everything the hon. Member for Aberdeen North just said. We have long called for the Bill to take a harm-led approach; indeed, the Government initially agreed with us, as in its first iteration it was called the Online Harms Bill rather than the Online Safety Bill. Addressing harm must be a central focus of the Bill, as we know that extremist content is perpetuated on smaller, high-harm platforms; this is something that the Antisemitism Policy Trust and Hope not Hate have long called for in relation to the Bill.

I want to put on the record our huge support for the amendment. Should the hon. Lady be willing to push it to a vote—I recognise that we are small in number—we will absolutely support her.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

As debated earlier, we are removing the adult safety duties from the Bill, which means that no company will face any duties related to legal but harmful content. In their place, the Government are introducing new transparency, accountability and free speech duties on category 1 services. They have been discussed in detail earlier this session.

It would not be proportionate to apply those new duties to smaller services, but, as we have heard from my hon. Friend the Member for Folkestone and Hythe, they will still have to comply with the illegal content and child safety duties if they are accessed by children. Those services have limited resources, and blanket applying additional duties on them would divert those resources away from complying with the illegal content and child safety duties. That would likely weaken the duties’ impact on tackling criminal activity and protecting children.

The new duties are about user choice and accountability on the largest platforms—if users do not want to use smaller harmful sites, they can choose not to—but, in recognition of the rapid pace with which companies can grow, I introduced an amendment earlier to create a watchlist of companies that are approaching the category 1 threshold, which will ensure that Ofcom can monitor rapidly scaling companies, reduce any delay in designating companies as category 1 services, and apply additional obligations on them.

The hon. Member for Aberdeen North talked about ISPs acting with respect to Kiwi Farms. I talked on Tuesday about the need for a holistic approach. There is not one silver bullet. It is important to look at Government, the platforms, parenting and ISPs, because that makes up a holistic view of how the internet works. It is the multi-stakeholder framework of governing the internet in its entirety, rather than the Government trying to do absolutely everything. We have talked a lot about illegality, and I think that a lot of the areas in that case were illegal; the hon. Lady described some very distasteful things. None the less, with the introduction of the watchlist, I do not believe amendment 104 is required.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

The hon. Member for Folkestone and Hythe made a good point. I do not disagree that Ofcom will have a significant role in policing platforms that are below the category 1 threshold. I am sure it will be very hands-on, particularly with platforms that have the highest risk and are causing the most harm.

I still do not think that is enough. I do not think that the Minister’s change with regard to emerging platforms should be based on user numbers. It is reasonable for us to require platforms that encourage extremism, spread conspiracy theories and have the most horrific pornography on them to meet a higher bar of transparency. I do not really care if they only have a handful of people working there. I am not fussed if they say, “Sorry, we can’t do this.” If they cannot keep people safe on their platform, they should have to meet a higher transparency bar, provide more information on how they are meeting their terms of service and provide toggles—all those things. It does not matter how small these platforms are. What matters is that they have massive risks and cause massive amounts of harm. It is completely reasonable that we hold them to a higher regulatory bar. On that basis, I will push the amendment to a vote.

--- Later in debate ---
Question proposed, That the clause, as amended, stand part of the Bill.
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I am glad that there is a review function in the Bill. I have been a member of a lot of Bill Committees and Delegated Legislation Committees that have considered legislation that has no review function and that says, “This will be looked at in the normal course of departmental reviews.” We know that not all Departments always do such reviews. In fact, some Departments do under 50% of the reviews that they are supposed to do, and whether reviews take place is not checked. We therefore do not find out whether a piece of legislation has had the intended effect. I am sure some will have done, but some definitely will not.

If the Government do not internally review whether a Bill or piece of delegated legislation has had the effect it was supposed to have, they cannot say whether it has been a success and cannot make informed decisions about future legislation, so having a review function in this Bill is really good. However, that function is insufficient: it is not enough for the Secretary of State to do the review, and we will not see enough outputs from Ofcom.

The Bill has dominated the lives of a significant number of parliamentarians for the past year—longer, in some cases—because it is so important and because it has required so much scrutiny, thinking and information gathering to get to this stage. That work will not go away once the Bill is enacted. Things will not change or move at once, and parts of the legislation will not work as effectively as they could, as is the case for any legislation, whether moved by my Government or somebody else’s. In every piece of legislation there will be things that do not pan out as intended, but a review by the Secretary of State and information from Ofcom about how things are working do not seem to be enough.

Committee members, including those on the Government Benches, have suggested having a committee to undertake the review or adding that function to the responsibilities of the Digital, Culture, Media and Sport Committee. We know that the DCMS Committee is busy and will be looking into a significant number of wide-ranging topics, so it would be difficult for it to keep a watching brief on the Online Safety Bill.

The previous Minister said that there will be some sort of reviewing mechanism, but I would like a further commitment from the Government that the Bill will be kept under review and that the review process as set out will not be the only type of review that happens as things move and change and the internet develops. Many people talk about more widespread use of virtual reality, for example, but there could be other things that we have not even heard of yet. After the legislation is implemented, it will be years before every part of the Bill is in action and every requirement in the legislation is working. By the time we get to 2027-28—or whenever every part of the legislation is working—things could have changed again and be drastically different from today. Indeed, the legislation may not be fit for purpose when it first starts to work, so will the Minister provide more information about what the review process will look like on an ongoing basis? The Government say this is world-leading legislation, but how will we ensure that that is the case and that it makes a difference to the safety and experience of both children and adults online?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I am glad that we are all in agreement on the need for a review. It is important that we have a comprehensive and timely review of the regulatory regime and how it is built into legislation. It is important that we understand that the legislation has the impact that we intend.

The legislation clearly sets out what the review must consider: how Ofcom is carrying out its role, and whether the legislation is effective in dealing with child protection, which, as the hon. Lady rightly says, is its core purpose. We have struck the balance of specifying two to five years after the regime comes into force, because that provides a degree of flexibility for future Ministers to judge when the review should happen. None the less, I take the hon. Lady’s point that technology is developing. This legislation is a front-footed first move, with other countries looking at what we are doing; because it takes a less prescriptive approach to technologies, it can remain flexible and adapt to emerging technologies. Inevitably, this will not be the last word. Some of the things in the Digital Economy Act 2017, for example, are already out of date, as is some of the other legislation that was put in place in the early 2000s. We will inevitably come back to this, but I think we have the right balance at the moment in terms of the timing.

I do not think we need to bed in whom we consult, but wider consultation will none the less be necessary to ascertain the effectiveness of the legislation.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

We are setting out in schedule 17 how the existing video-sharing platform regime will be repealed and the transitional provisions that will apply to these providers as they transition to the online safety framework. My understanding is that it does include livestreaming, but I will obviously write to the hon. Lady if I have got that wrong. I am not sure there is a significant legal effect here. To protect children and treat services fairly while avoiding unnecessary burdens on business, we are maintaining the current user protections in the VSP regime while the online safety framework is being implemented. That approach to the transition avoids the duplication of regulation.

Question put and agreed to.

Schedule 17, as amended, accordingly agreed to.

Clause 203

Interpretation: general

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I beg to move amendment 105, in clause 203, page 167, line 8, after “including” insert “but not limited to”.

This amendment makes clear that the definition provided for content is not exhaustive.

I am delighted that we have a new Minister, because I can make exactly the same speech as I made previously in Committee—don’t worry, I won’t—and he will not know.

I still have concerns about the definition of “content”. I appreciate that the Government have tried to include a number of things in the definition. It currently states:

“‘content’ means anything communicated by means of an internet service, whether publicly or privately, including written material or messages, oral communications, photographs, videos, visual images, music and data of any description”.

That is pretty wide-ranging, but I do not think it takes everything into account. I know that it uses the word “including”; it does not say “only limited to” or anything like that. If there is to be a list of stuff, it should be exhaustive. That is my idea of how the Bill should be.

I have suggested in amendment 105 that we add “not limited to” after “including” in order to be absolutely clear that the content that we are talking about includes anything. It may or may not be on this list. Something that is missing from the list is VR technology. If someone is using VR or immersive technology and is a character on the screen, they can see what the character is doing and move their body around as that character, and whatever they do is user-generated content. It is not explicitly included in the Bill, even though there is a list of things. I do not even know how that would be written down in any way that would make sense.

I have suggested adding “not limited to” to make it absolutely clear that this is not an exhaustive list of the things that could be considered to be user-generated content or content for the purposes of the Bill. It could be absolutely anything that is user-generated. If the Minister is able to make it absolutely clear that this is not an exhaustive list and that “content” could be anything that is user-generated, I will not press the amendment to a vote. I would be happy enough with that commitment.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Indeed I can give that commitment. This is an indicative list, not an exhaustive list, for the reasons that the hon. Lady set out. Earlier, we discussed the fact that technology moves on, and she has come up with an interesting example. It is important to note that adding unnecessary words in legislation could lead to unforeseen outcomes when it is interpreted by courts, which is why we have taken this approach, but we think it does achieve the same thing.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

On that basis, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Amendment proposed: 58, in clause 203, page 167, leave out lines 26 to 31. —(Paul Scully.)

This amendment removes the definition of the “maximum summary term for either-way offences”, as that term has been replaced by references to the general limit in a magistrates’ court.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I would like to ask the Minister why this amendment has been tabled. I am not entirely clear. Could he give us some explanation of the intention behind the amendment? I am pretty sure it will be fine but, if he could just let us know what it is for, that would be helpful.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I am happy to do so. Clause 203 sets out the interpretation of the terms used throughout the Bill. Amendment 58 removes a definition that is no longer required because the term is no longer in the Bill. It is as simple as that. The definition of relevant crime penalties under the Bill now uses a definition that has been updated in the light of changes to sentencing powers in magistrates’ courts set out in the Judicial Review and Courts Act 2022. The new definition of

“general limit in a magistrates’ court”

is now included in the Interpretation Act 1978, so no definition is required in this Bill.

Question put and agreed to.

Amendment 58 accordingly agreed to.

Amendment made: 59, in clause 203, page 168, line 48, at end insert—

“and references to restrictions on access to a service or to content are to be read accordingly.” —(Paul Scully.)

NC2 states what is meant by restricting users’ access to content, and this amendment makes it clear that the propositions in clause 203 about access read across to references about restricting access.

Question proposed, That the clause, as amended, stand part of the Bill.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Once again, I will abuse the privilege of having a different Minister at the Dispatch Box and mention the fact that “oral communications” appears in line 9 of the definition of “content” that we have already discussed. It is “oral communications” in this part of the Bill but “aural communications” in an earlier part of the Bill. I am still baffled as to why there is a difference. Perhaps both should be included in both of these sections, or perhaps there should be some level of consistency throughout the Bill.

The “aural communications” exemption in clause 50, which I mentioned earlier, is one of the parts that I am particularly concerned about, because it could create a loophole—and it uses the other spelling of the word. I asked about this last time, and I am not convinced that the answer I got gave me any more clarity than I had previously. I would be keen to understand whether the difference is intentional and, if so, what the difference is between “oral” and “aural” communications for the purposes of the Bill. My understanding is that oral communications are ones that are said and aural communications are ones that are heard. But, for the purposes of the Bill, those two things are really the same—unless the Bill is meant to cover user-generated oral communication that no one can possibly hear, which surely does not fit the definitions, because user-generated content is only considered if it is user-to-user: something that other people can see. Surely oral communication would also be aural communication; in pretty much every instance that the Bill could possibly apply to, both definitions would mean the same thing. I understand that the Minister may not have the answer to this at his fingertips, and I would be happy to hear from him later if that would suit him better.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The clause provides legal certainty about the meaning of those terms as used in the Bill: things such as “content”, “encounter”, “taking down” and “terms of service”. That is what the clause is intended to do. It is intentional and is for the reasons the hon. Lady said. Oral means speech and speech only. Aural is speech and other sounds, which is what can be heard on voice calls. That includes music as well. One is speech. The other is the whole gamut.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

My knowledge is being tested, so I will write to the hon. Member for Aberdeen North and make that available to the Committee. Coming back to the point about oral and aural communications that she made on Tuesday, in relation to another clause on the exclusions: as I said, we have a narrow exemption to ensure that traditional phone calls are not subject to regulation. That does mean that if a service such as Fortnite, which she spoke about previously, enables adults and children to have one-to-one oral calls, companies will still need to address the functionality around how that happens, because enabling it might cause harm—for example, if an adult can contact an unknown child. That is still captured within the Bill.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Platforms will have to address, for example, the ways in which users can communicate with people who are not on their friends list. Things like that and other ways in which communication can be set up will have to be looked at in the risk assessment. With Discord, for instance, where two people can speak to each other, Discord will have to look at the way those people got into contact with each other and the risks associated with that, rather than the conversation itself, even though the conversation might be the only bit that involves illegality.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

It is the functionalities around it that enable the voice conversation to happen.

Question put and agreed to.

Clause 203, as amended, accordingly ordered to stand part of the Bill.

Clause 206

Extent

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Labour is pleased to see the introduction of the new clause, which clarifies the role of Ofcom in delivering guidance to providers about their duties. Specifically, the new clause will require Ofcom to give guidance to providers on the kind of content that Ofcom considers to be harmful to children, or relevant to the user empowerment duty in clause 14. That is a very welcome addition indeed.

Labour remains concerned about exactly how these so-called user empowerment tools will work in practice—we have discussed that at length—and let us face it: we have had little assurance from the Minister on that point. We welcome the new clause, as it clarifies what guidance providers can expect to receive from Ofcom once the Bill is finally enacted. We can all recognise that Ofcom has a colossal task ahead of it—the Minister said so himself—so it is particularly welcome that the guidance will be subject to consultation with those that it deems appropriate. I can hope only that that will include the experts, and the many groups that provided expertise, support and guidance on internet regulation long before the Bill even received its First Reading, a long time ago. There are far too many of those experts and groups to list, but it is fundamental that the experts who often spot online harms before they properly emerge be consulted and included in this process if we are to truly capture the priority harms to children, as the new clause intends.

We also welcome the clarification in subsection (2) that Ofcom will be required to provide “examples of content” that would be considered to be—or not be—harmful. These examples will be key to ensuring that the platforms have nowhere to hide when it comes to deciding what is harmful; there will be no grey area. Ofcom will have the power to show them exact examples of what could be deemed harmful.

We recognise, however, that there is subjectivity to the work that Ofcom will have to do once the Bill passes. On priority content, it is most important that providers are clear about what is and is not acceptable; that is why we welcome the new clause, but we do of course wish that the Government applied the same logic to harm pertaining to adults online.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I am also happy to support new clause 1, but I have a couple of questions. It mentions that “replacement guidance” may be provided, which is important because, as we have said a number of times, things will change, and we will end up with a different online experience; that can happen quickly. I am glad that Ofcom has the ability to refresh and update the guidance.

My question is about timelines. There do not seem to be any timelines in the new clause for when the guidance is required to be published. It is key that the guidance be published before companies and organisations have to comply with it. My preference would be for it to be published as early as possible. There may well need to be more work, and updated versions of the guidance may therefore need to be published, but I would rather companies had an idea of the direction of travel, and what they must comply with, as soon as possible, knowing that it might be tweaked. That would be better than waiting until the guidance was absolutely perfect and definitely the final version, but releasing it just before people had to start complying with it. I would like an assurance that Ofcom will make publishing the guidance a priority, so that there is enough time to ensure compliance. We want the Bill to work; it will not work if people do not know what they have to comply with. Assurance on that would be helpful.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I absolutely give that assurance to the hon. Lady; that is important. We all want the measures to be implemented, and the guidance to be out there, as soon as possible. Just now I talked about the platforms bringing in measures as soon as possible, without waiting for the implementation period. They can do that far better if they have the guidance. We are already working with Ofcom to ensure that the implementation period is as short as possible, and we will continue to do so.

Question put and agreed to.

New clause 1 accordingly read a Second time, and added to the Bill.

New Clause 2

Restricting users’ access to content

“(1) This section applies for the purposes of this Part.

(2) References to restricting users’ access to content, and related references, include any case where a provider takes or uses a measure which has the effect that—

(a) a user is unable to access content without taking a prior step (whether or not taking that step might result in access being denied), or

(b) content is temporarily hidden from a user.

(3) But such references do not include any case where—

(a) the effect mentioned in subsection (2) results from the use or application by a user of features, functionalities or settings which a provider includes in a service in compliance with the duty set out in section 14(2) (user empowerment), or

(b) access to content is controlled by another user, rather than the provider.

(4) See also section 203(5).”—(Paul Scully.)

This new clause deals with the meaning of references to restricting users’ access to content, in particular by excluding restrictions resulting from the use of user empowerment tools as described in clause 14.

Brought up, read the First and Second time, and added to the Bill.

New Clause 3

Duty not to act against users except in accordance with terms of service

“(1) A provider of a Category 1 service must operate the service using proportionate systems and processes designed to ensure that the provider does not—

(a) take down regulated user-generated content from the service,

(b) restrict users’ access to regulated user-generated content, or

(c) suspend or ban users from using the service,

except in accordance with the terms of service.

(2) Nothing in subsection (1) is to be read as preventing a provider from taking down content from a service or restricting users’ access to it, or suspending or banning a user, if such an action is taken—

(a) to comply with the duties set out in—

(i) section 9(2) or (3) (protecting individuals from illegal content), or

(ii) section 11(2) or (3) (protecting children from content that is harmful to children), or

(b) to avoid criminal or civil liability on the part of the provider that might reasonably be expected to arise if such an action were not taken.

(3) In addition, nothing in subsection (1) is to be read as preventing a provider from—

(a) taking down content from a service or restricting users’ access to it on the basis that a user has committed an offence in generating, uploading or sharing it on the service, or

(b) suspending or banning a user on the basis that—

(i) the user has committed an offence in generating, uploading or sharing content on the service, or

(ii) the user is responsible for, or has facilitated, the presence or attempted placement of a fraudulent advertisement on the service.

(4) The duty set out in subsection (1) does not apply in relation to—

(a) consumer content (see section (Interpretation of this Chapter));

(b) terms of service which deal with the treatment of consumer content.

(5) If a person is the provider of more than one Category 1 service, the duty set out in subsection (1) applies in relation to each such service.

(6) The duty set out in subsection (1) extends only to the design, operation and use of a service in the United Kingdom, and references in this section to users are to United Kingdom users of a service.

(7) In this section—

‘criminal or civil liability’ includes such a liability under the law of a country outside the United Kingdom;

‘fraudulent advertisement’ has the meaning given by section 35;

‘offence’ includes an offence under the law of a country outside the United Kingdom.

(8) See also section 16 (duties to protect news publisher content).”—(Paul Scully.)

This new clause imposes a duty on providers of Category 1 services to ensure that they do not take down content or restrict users’ access to it, or suspend or ban users, except in accordance with the terms of service.

Brought up, read the First and Second time, and added to the Bill.

New Clause 4

Further duties about terms of service

All services

“(1) A provider of a regulated user-to-user service must include clear and accessible provisions in the terms of service informing users about their right to bring a claim for breach of contract if—

(a) regulated user-generated content which they generate, upload or share is taken down, or access to it is restricted, in breach of the terms of service, or

(b) they are suspended or banned from using the service in breach of the terms of service.

Category 1 services

(2) The duties set out in subsections (3) to (7) apply in relation to a Category 1 service, and references in subsections (3) to (9) to ‘provider’ and ‘service’ are to be read accordingly.

(3) A provider must operate a service using proportionate systems and processes designed to ensure that—

(a) if the terms of service state that the provider will take down a particular kind of regulated user-generated content from the service, the provider does take down such content;

(b) if the terms of service state that the provider will restrict users’ access to a particular kind of regulated user-generated content in a specified way, the provider does restrict users’ access to such content in that way;

(c) if the terms of service state cases in which the provider will suspend or ban a user from using the service, the provider does suspend or ban the user in those cases.

(4) A provider must ensure that—

(a) terms of service which make provision about the provider taking down regulated user-generated content from the service or restricting users’ access to such content, or suspending or banning a user from using the service, are—

(i) clear and accessible, and

(ii) written in sufficient detail to enable users to be reasonably certain whether the provider would be justified in taking the specified action in a particular case, and

(b) those terms of service are applied consistently.

(5) A provider must operate a service using systems and processes that allow users and affected persons to easily report—

(a) content which they consider to be relevant content (see section (Interpretation of this Chapter));

(b) a user who they consider should be suspended or banned from using the service in accordance with the terms of service.

(6) A provider must operate a complaints procedure in relation to a service that—

(a) allows for complaints of a kind mentioned in subsection (8) to be made,

(b) provides for appropriate action to be taken by the provider of the service in response to complaints of those kinds, and

(c) is easy to access, easy to use (including by children) and transparent.

(7) A provider must include in the terms of service provisions which are easily accessible (including to children) specifying the policies and processes that govern the handling and resolution of complaints of a kind mentioned in subsection (8).

(8) The kinds of complaints referred to in subsections (6) and (7) are—

(a) complaints by users and affected persons about content present on a service which they consider to be relevant content;

(b) complaints by users and affected persons if they consider that the provider is not complying with a duty set out in any of subsections (1) or (3) to (5);

(c) complaints by a user who has generated, uploaded or shared content on a service if that content is taken down, or access to it is restricted, on the basis that it is relevant content;

(d) complaints by users who have been suspended or banned from using a service.

(9) The duties set out in subsections (3) and (4) do not apply in relation to terms of service which—

(a) make provision of the kind mentioned in section 9(5) (protecting individuals from illegal content) or 11(5) (protecting children from content that is harmful to children), or

(b) deal with the treatment of consumer content.

Further provision

(10) If a person is the provider of more than one regulated user-to-user service or Category 1 service, the duties set out in this section apply in relation to each such service.

(11) The duties set out in this section extend only to the design, operation and use of a service in the United Kingdom, and references to users are to United Kingdom users of a service.

(12) See also section 16 (duties to protect news publisher content).”—(Paul Scully.)

Subsections (3) to (8) of this new clause impose new duties on providers of Category 1 services in relation to terms of service that allow a provider to take down content or restrict users’ access to it, or to suspend or ban users. Such terms of service must be clear and applied consistently. Subsection (1) of the clause contains a duty which, in part, was previously in clause 20 of the Bill.

Brought up, read the First and Second time, and added to the Bill.

New Clause 5

OFCOM’s guidance about duties set out in sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service)

“(1) OFCOM must produce guidance for providers of Category 1 services to assist them in complying with their duties set out in sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service)(3) to (7).

(2) OFCOM must publish the guidance (and any revised or replacement guidance).”—(Paul Scully.)

This new clause requires OFCOM to give guidance to providers about complying with the duties imposed by NC3 and NC4.

Brought up, read the First and Second time, and added to the Bill.

New Clause 6

Interpretation of this Chapter

“(1) This section applies for the purposes of this Chapter.

(2) “Regulated user-generated content” has the same meaning as in Part 3 (see section 50), and references to such content are to content that is regulated user-generated content in relation to the service in question.

(3) “Consumer content” means—

(a) regulated user-generated content that constitutes, or is directly connected with content that constitutes, an offer to sell goods or to supply services,

(b) regulated user-generated content that amounts to an offence under the Consumer Protection from Unfair Trading Regulations 2008 (S.I. 2008/1277) (construed in accordance with section 53: see subsections (3), (11) and (12) of that section), or

(c) any other regulated user-generated content in relation to which an enforcement authority has functions under those Regulations (see regulation 19 of those Regulations).

(4) References to restricting users’ access to content, and related references, are to be construed in accordance with sections (Restricting users’ access to content) and 203(5).

(5) Content of a particular kind is “relevant content” if—

(a) a term of service, other than a term of service mentioned in section (Further duties about terms of service)(9), states that a provider may or will take down content of that kind from the service or restrict users’ access to content of that kind, and

(b) it is regulated user-generated content.

References to relevant content are to content that is relevant content in relation to the service in question.

(6) “Affected person” means a person, other than a user of the service in question, who is in the United Kingdom and who is—

(a) the subject of the content,

(b) a member of a class or group of people with a certain characteristic targeted by the content,

(c) a parent of, or other adult with responsibility for, a child who is a user of the service or is the subject of the content, or

(d) an adult providing assistance in using the service to another adult who requires such assistance, where that other adult is a user of the service or is the subject of the content.

(7) In determining what is proportionate for the purposes of sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service), the size and capacity of the provider of a service is, in particular, relevant.

(8) For the meaning of “Category 1 service”, see section 83 (register of categories of services).”—(Paul Scully.)

This new clause gives the meaning of terms used in NC3 and NC4.

Brought up, read the First and Second time, and added to the Bill.

New Clause 7

List of emerging Category 1 services

“(1) As soon as reasonably practicable after the first regulations under paragraph 1(1) of Schedule 11 come into force (regulations specifying Category 1 threshold conditions), OFCOM must comply with subsections (2) and (3).

(2) OFCOM must assess each regulated user-to-user service which they consider is likely to meet each of the following conditions, to determine whether the service does, or does not, meet them—

(a) the first condition is that the number of United Kingdom users of the user-to-user part of the service is at least 75% of the figure specified in any of the Category 1 threshold conditions relating to number of users (calculating the number of users in accordance with the threshold condition in question);

(b) the second condition is that—

(i) at least one of the Category 1 threshold conditions relating to functionalities of the user-to-user part of the service is met, or

(ii) if the regulations under paragraph 1(1) of Schedule 11 specify that a Category 1 threshold condition relating to a functionality of the user-to-user part of the service must be met in combination with a Category 1 threshold condition relating to another characteristic of that part of the service or a factor relating to that part of the service (see paragraph 1(4) of Schedule 11), at least one of those combinations of conditions is met.

(3) OFCOM must prepare a list of regulated user-to-user services which meet the conditions in subsection (2).

(4) The list must contain the following details about a service included in it—

(a) the name of the service,

(b) a description of the service,

(c) the name of the provider of the service, and

(d) a description of the Category 1 threshold conditions by reference to which the conditions in subsection (2) are met.

(5) OFCOM must take appropriate steps to keep the list up to date, including by carrying out further assessments of regulated user-to-user services.

(6) OFCOM must publish the list when it is first prepared and each time it is revised.

(7) When assessing whether a service does, or does not, meet the conditions in subsection (2), OFCOM must take such steps as are reasonably practicable to obtain or generate information or evidence for the purposes of the assessment.

(8) An assessment for the purposes of this section may be included in an assessment under section 83 or 84 (as the case may be) or carried out separately.”—(Paul Scully.)

This new clause requires OFCOM to prepare and keep up to date a list of regulated user-to-user services that have 75% of the number of users of a Category 1 service, and at least one functionality of a Category 1 service or one required combination of a functionality and another characteristic or factor of a Category 1 service.

Brought up, read the First and Second time, and added to the Bill.

New Clause 8

Child user empowerment duties

“(1) This section sets out the duties to empower child users which apply in relation to Category 1 services.

(2) A duty to include in a service, to the extent that it is proportionate to do so, features which child users may use or apply if they wish to increase their control over harmful content.

(3) The features referred to in subsection (2) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) reduce the likelihood of the user encountering priority content that is harmful, or particular kinds of such content, by means of the service, or

(b) alert the user to the harmful nature of priority content that is harmful that the user may encounter by means of the service.

(4) A duty to ensure that all features included in a service in compliance with the duty set out in subsection (2) are made available to all child users.

(5) A duty to include clear and accessible provisions in the terms of service specifying which features are offered in compliance with the duty set out in subsection (2), and how users may take advantage of them.

(6) A duty to include in a service features which child users may use or apply if they wish to filter out non-verified users.

(7) The features referred to in subsection (6) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) prevent non-verified users from interacting with content which that user generates, uploads or shares on the service, and

(b) reduce the likelihood of that user encountering content which non-verified users generate, upload or share on the service.

(8) A duty to include in a service features which child users may use or apply if they wish to only encounter content by users they have approved.

(9) A duty to include in a service features which child users may use or apply if they wish to filter out private messages from—

(a) non-verified users, or

(b) adult users, or

(c) any user other than those on a list approved by the child user.

(10) In determining what is proportionate for the purposes of subsection (2), the following factors, in particular, are relevant—

(a) all the findings of the most recent child risk assessment (including as to levels of risk and as to nature, and severity, of potential harm), and

(b) the size and capacity of the provider of a service.

(11) In this section “non-verified user” means a user who has not verified their identity to the provider of a service (see section 58(1)).

(12) In this section references to features include references to functionalities and settings.”—(Kirsty Blackman.)

Brought up, and read the First time.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I beg to move, That the clause be read a Second time.

That was some stretch of procedure, Dame Angela, but we got there in the end. This new clause is about child user empowerment duties. I am really pleased that the Government have user empowerment duties in the Bill—they are a good thing—but I am confused as to why they apply only to adult users, and why children do not deserve the same empowerment rights over what they access online.

In writing the new clause, I pretty much copied clause 14, before there were any amendments to it, and added a couple of extra bits: subsections (8) and (9). In subsection (8), I have included:

“A duty to include in a service features which child users may use or apply if they wish to only encounter content by users they have approved.”

That would go a step further than the verification process and allow users to approve only people who are in their class at school, people with whom they are friends, or even certain people in their class at school, and to not have others on that list. I know that young people playing Fortnite—I have mentioned Fortnite a lot because people play it a lot—or Roblox are contacted by users whom they do not know, and there is no ability for young people to switch off some of the features while still being able to contact their friends. Users can either have no contact from anyone, or they can have a free-for-all. That is not the case for all platforms, but a chunk of them do not let users speak only to people on their friends list, or receive messages only from people on the list.

My proposed subsection (8) would ensure that children could have a “white list” of people who they believe are acceptable, and who they want to be contacted by, and could leave others off the list. That would help tackle not just online child exploitation, but the significant online bullying that teachers and children report. Children have spoken of the harms they experience as a result of people bullying them and causing trouble online; the perpetrators are mainly other children. Children would be able to remove such people from the list and so would not receive any content, messages or comments from those who make their lives more negative.

Subsection (9) is related to subsection (8); it would require a service to include

“features which child users may use or apply if they wish to filter out private messages from—

(a) non-verified users, or

(b) adult users, or

(c) any user other than those on a list approved by the child user.”

Adults looking to exploit children will use private messaging on platforms such as Instagram. Instagram has to know how old its users are, so anybody who is signed up to it will have had to provide it with their date of birth. It is completely reasonable for a child to say, “I want to filter out everything from an adult.” When we talk about children online, we are talking about anybody from zero to 18, which is a very wide age range. Some of those people will be working and paying bills, but will not have access to the empowerment features that adults have access to, because they have not yet reached that magical threshold. Some services may decide to give children access to user empowerment tools, but there is no requirement to. The only requirement in the Bill on user empowerment tools is for adults. That is not fair.

Children should have more control over the online environment. We know how many children feel sad as a result of their interactions online, and how many encounter content online that they wish they had never seen and cannot unsee. We should give them more power over that, and more power to say, “No, I don’t want to see that. I don’t want people I don’t know contacting me. I don’t want to get unsolicited messages. I don’t want somebody messaging me, pretending that they are my friend or that they go to another school, when they are in fact an adult, and I won’t realise until it is far too late.”

The Bill applies to people of all ages. All of us make pretty crappy decisions sometimes. That includes teenagers, but they also make great decisions. If there were a requirement for them to have these tools, they could choose to make their online experience better. I do not think this omission was intentional, or that the Government set out to disadvantage children when they wrote the adult user empowerment clauses. I think they thought that it would be really good to have those clauses in the Bill, in order to give users a measure of autonomy over their time and interactions online. However, they have failed to include the same thing for children. It is a gap.

I appreciate that there are child safety duties, and that there is a much higher bar for platforms that have child users, but children are allowed a level of autonomy; look at the UN convention on the rights of the child. We give children choices and flexibilities; we do not force them to do every single thing they do, all day every day. We recognise that children should be empowered to make decisions where they can.

I know the Government will not accept the provision—I am not an idiot. I have never moved a new clause in Committee that has been accepted, and I am pretty sure that it will not happen today. However, if the Government were to say that they would consider, or even look at the possibility of, adding child user empowerment duties to the Bill, the internet would be a more pleasant place for children. They are going to use it anyway; let us try to improve their online experience even more than the Bill does already.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The hon. Member for Aberdeen North has outlined the case for the new clause eloquently and powerfully. She may not press it to a Division, if the Minister can give her assurances, but if she did, she would have the wholehearted support of the Opposition.

We see new clause 8 as complementing the child safety duties in the legislation. We fully welcome provisions that provide children with greater power and autonomy in choosing to avoid exposure to certain types of content. We have concerns about how the provisions would work in practice, but that issue has more to do with the Government’s triple-shield protections than the new clause.

The Opposition support new clause 8 because it aims to provide further protections, in addition to the child safety duties, to fully protect children from harmful content and to empower them. It would empower and enable them to filter out private messages from adults or non-verified users. We also welcome the measures in the new clause that require platforms and service providers to design accessible terms of service. That is absolutely vital to best protect children online, which is why we are all here, and what the legislation was designed for.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The aim of the user empowerment duty is to give adults more control over certain categories of legal content that some users will welcome greater choice over. Those duties also give adult users greater control over who they interact with online, but these provisions are not appropriate for children. As the hon. Member for Aberdeen North acknowledged, there are already separate duties on services likely to be accessed by children, in scope of part 3, to undertake comprehensive risk assessments and to comply with safety duties to protect children from harm. That includes requirements to assess how specific functionalities may facilitate the spread of harmful content, as outlined in clause 10(6)(e), and to protect children from harmful content, including content that has been designated as priority harmful content, by putting in place age-appropriate protections.

As such, children will not need to be provided with tools to control any harmful content they see, as the platform will need to put in place age-appropriate protections. We do not want to give children an option to choose to see content that is harmful to them. The Bill also outlines in clause 11(4)(f) that, where it is proportionate to do so, service providers will be required to take measures in certain areas to meet the child-safety duties. That includes functionalities allowing for control over content that is encountered. It would not be appropriate to require providers to offer children the option to verify their identity, due to the safeguarding and data protection risks that that would pose. Although we expect companies to use technologies such as age assurance to protect children on their service, they would only be used to establish age, not identity.

The new clause would create provisions to enable children to filter out private messages from adults and users who are not on an approved list, but the Bill already contains provisions that address the risks of adults contacting children. There are also requirements on service providers to consider how their service could be used for grooming or child sexual exploitation and abuse, and to apply proportionate measures to mitigate those risks. The service providers already have to assess and mitigate the risks. They have to provide the risk assessment, and within it they could choose to mitigate risk by putting in place measures that prevent unknown users from contacting children.

For the reasons I have set out, the Bill already provides strong protections for children on services that they are likely to access. I am therefore not able to accept the new clause, and I hope that the hon. Member for Aberdeen North will withdraw it.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

That was one of the more disappointing responses from the Minister, I am afraid. I would appreciate it if he could write to me to explain which part of the Bill provides protection to children from private messaging. I would be interested to have another look at that, so it would be helpful if he could provide details.

We do not want children to choose to see unsafe stuff, but the Bill is not strong enough on stuff like private messaging or the ability of unsolicited users to contact children, because it relies on the providers noticing that in their risk assessment, and putting in place mitigations after recognising the problem. It relies on the providers being willing to act to keep children safe in a way that they have not yet done.

When I am assisting my children online, and making rules about how they behave online, the thing I worry most about is unsolicited contact: what people might say to them online, and what they might hear from adults online. I am happy enough for them to talk to their friends online—I think that is grand—but I worry about what adults will say to them online, whether by private messaging through text or voice messages, or when they are playing a game online with the ability for a group of people working as a team together to broadcast their voices to the others and say whatever they want to say.

Lastly, one issue we have seen on Roblox, which is marketed as a children’s platform, is people creating games within it—people creating sex dungeons within a child’s game, or having conversations with children and asking the child to have their character take off their clothes. Those things have happened on that platform, and I am concerned that there is not enough protection in place, particularly to address that unsolicited contact. Given the disappointing response from the Minister, I am keen to push this clause to a vote.

Question put, That the clause be read a Second time.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The Government recognise that the intent behind the new clause is to create new criminal offences of non-compliance with selected duties. It would establish a framework for personal criminal offences punishable through fines or imprisonment. It would mean that providers committed a criminal offence if they did not comply with certain duties.

We all want this Bill to be effective. We want it to be on the statute book. It is a question of getting that fine balance right, so that we can properly hold companies to account for the safety of their users. The existing approach to enforcement and senior manager liability strikes the right balance between robust enforcement and deterrence, and ensuring that the UK remains an attractive place to do business. We are confident that the Bill as a whole will bring about the change necessary to ensure that users, especially younger users, are kept safe online.

This new clause tries to criminalise not complying with the Bill’s duties. Exactly what activity would be criminalised is not obvious from the new clause, so it could be difficult for individuals to foresee exactly what type of conduct would constitute an offence. That could lead to unintended consequences, with tech executives driving an over-zealous approach to content take-down for fear of imprisonment, and potentially removing large volumes of innocuous content and so affecting the ability for open debate to take place.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Does the Minister not think that the freedom of speech stuff and the requirement to stick to terms of service that he has put in as safeguards for that are strong enough, then?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I come back to this point: I think that if people were threatened with personal legal liability, that would stifle innovation and make them over-cautious in their approach. That would disturb the balance that we have tried to achieve in this iteration of the Bill. Trying to keep internet users, particularly children, safe has to be achieved alongside free speech and not at its expense.

Further, the threat of criminal prosecution for failing to comply with numerous duties also runs a real risk of damaging the attractiveness of the UK as a place to start up and grow a digital business. I want internet users in the future to be able to access all the benefits of the internet safely, but we cannot achieve that if businesses avoid the UK because our enforcement regime is so far out of kilter with international comparators. Instead, the most effective way to ensure that services act to protect people online is through the existing framework and the civil enforcement options that are already provided for in the Bill, overseen by an expert regulator.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I thank you, too, Dame Angela. I echo the Minister’s sentiments, and thank all the Clerks, the Doorkeepers, the team, and all the stakeholders who have massively contributed, with very short turnarounds, to the scrutiny of this legislation. I have so appreciated all that assistance and expertise, which has helped me, as shadow Minister, to compile our comments on the Bill following the Government’s recommittal of it to Committee, which is an unusual step. Huge thanks to my colleagues who joined us today and in previous sittings, and to colleagues from across the House, and particularly from the SNP, a number of whose amendments we have supported. We look forward to scrutinising the Bill further when it comes back to the House in the new year.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I thank you, Dame Angela, as well as Sir Roger for chairing our debates. Recommittal has been a very odd and unusual process; it has been a bit like groundhog day, discussing things we have discussed previously. I very much appreciate the hard work of departmental and Ofcom staff that went into making this happen, as well as the work of the Clerks, the Doorkeepers, and the team who ensured that we have a room that is not freezing—that has been really helpful.

I thank colleagues from across the House, particularly the Labour Front-Bench spokespeople, who have been incredibly helpful in supporting our amendments. This has been a pretty good-tempered Committee and we have all got on fairly well, even though we have disagreed on a significant number of issues. I am sure we will have those arguments again on Report.

None Portrait The Chair
- Hansard -

There being no more obvious niceties, I add my thanks to everybody. I wish everybody season’s greetings and a happy Christmas.

Question put and agreed to.

Bill, as amended, accordingly to be reported.

ONLINE SAFETY BILL (Second sitting)

Kirsty Blackman Excerpts
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022


Public Bill Committees
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The onus on adults is very much a safety net—very much a catch-all, after we have put the onus on the social media companies and the platforms to adhere to their own terms and conditions.

We have heard a lot about Twitter and the changes to Twitter. We can see the commercial imperative for mainstream platforms, certainly the category 1 platforms, to have a wide enough catch-all in their terms of service—anything that an advertiser, for example, would see as reasonably sensible—to be able to remain a viable platform in the first place. When Elon Musk first started making changes at Twitter, a comment did the rounds: “How do you build a multimillion-dollar company? You sell it to Elon Musk for £44 billion.” He made that change. He has seen the bottom falling out of his market and has lost a lot of the cash he put into Twitter. That is the commercial impetus that underpins a lot of the changes we are making.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - -

Is the Minister really suggesting that it is reasonable for people to say, “Right, I am going to have to walk away from Facebook because I don’t agree with their terms of service,” to hold the platform to account? How does he expect people to keep in touch with each other if they have to walk away from social media platforms in order to try to hold them to account?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I do not think the hon. Lady is seriously suggesting that people can communicate only via Facebook—via one platform. The point is that there are a variety of methods of communication, of which Facebook has been a major one, although it is not one of the biggest now, with its share value having dropped 71% in the last year. That is, again, another commercial impetus for changing its platform in other, usability-related ways.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The hon. Lady makes a good point. I talked about the offline world rather than the real world, but clearly that can happen. That is where the balance has to be struck, as we heard from my hon. Friend the Member for Don Valley. It is not black and white; it is a spectrum of greys. Any sensible person can soon see when they stray into areas that we have talked about, such as holocaust denial and extremism, but we do not want to penalise people who are invariably testing their freedom of expression.

It is a fine balance, but I think that we have reached the right balance between protecting freedom of expression and protecting vulnerable adults by having three layers of checks. The first is illegality. The second is enforcing the terms of service, which provide a higher bar than we had in the original Bill for the vast majority of platforms, so that we can see right at the beginning how they will be enforced by the platforms. If they change them and do not adhere to them, Ofcom can step in. Ofcom can step in at any point to ensure that they are being enforced. The third is a safety net.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

On illegal content, is the Minister proposing that the Government will introduce new legislation to make, for example, holocaust denial and eating disorder content illegal, whether it is online or offline? If he is saying that the bar in the online and offline worlds should be the same, will the Government introduce more hate crime legislation?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Hate crime legislation will always be considered by the Ministry of Justice, but I am not committing to any changes. That is beyond my reach, but the two shields that we talked about are underpinned by a safety net.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The Government recognise the importance of giving adult users greater choice about what they see online and who they interact with, while upholding users’ rights to free expression online. That is why we have removed the “legal but harmful” provisions from the Bill in relation to adults and replaced them with a fairer, simpler approach: the triple shield.

As I said earlier, the first shield will require all companies in scope to take preventive measures to tackle illegal content or activity. The second shield will place new duties on category 1 services to improve transparency and accountability, and protect free speech, by requiring them to adhere to their terms of service when restricting access to content or suspending or banning users. As I said earlier, user empowerment is the key third shield, empowering adults with greater control over their exposure to legal forms of abuse or hatred, or content that encourages, promotes or provides instructions for suicide, self-harm or eating disorders. That has been done while upholding and protecting freedom of expression.

Amendments 9 and 12 will strengthen the user empowerment duty, so that the largest companies are required to ensure that those tools are effective in reducing the likelihood of encountering the listed content or alerting users to it, and are easy for users to access. That will provide adult users with greater control over their online experience.

We are also setting out the categories of content that those user empowerment tools apply to in the Bill, through amendment 15. Adult users will be given the choice of whether they want to take advantage of those tools to have greater control over content that encourages, promotes or provides instructions for suicide, self-harm and eating disorders, and content that targets abuse or incites hate against people on the basis of race, religion, sex, sexual orientation, disability, or gender reassignment. This is a targeted approach, focused on areas where we know that adult users—particularly those who are vulnerable or disproportionately targeted by online hate and abuse—would benefit from having greater choice.

As I said, the Government remain committed to free speech, which is why we have made changes to the adult safety duties. By establishing high thresholds for inclusion in those content categories, we have ensured that legitimate debate online will not be affected by the user empowerment duties.

I want to emphasise that the user empowerment duties do not require companies to remove legal content from their services; they are about giving individual adult users the option to increase their control over those kinds of content. Platforms will still be required to provide users with the ability to filter out unverified users, if they so wish. That duty remains unchanged. For the reasons that I have set out, I hope that Members can support Government amendments 8 to 17.

I turn to the amendments in the name of the hon. Member for Pontypridd to Government amendments 15 and 16. As I have set out in relation to Government amendments 8 to 17, the Government recognise the intent behind the amendments—to apply the user empowerment tools in clause 14(2) to a greater range of content categories. As I have already set out, it is crucial that a tailored approach is taken, so that the user empowerment tools stay in balance with users’ rights to free expression online. I am sympathetic to the amendments, but they propose categories of content that risk being either unworkable for companies or duplicative to the approach already set out in amendment 15.

The category of

“content that is harmful to health”

sets an extremely broad scope. That risks requiring companies to apply the tools in clause 14(2) to an unfeasibly large volume of content. It is not a proportionate approach and would place an unreasonable burden on companies. It might also have concerning implications for freedom of expression, as it may capture important health advice. That risks, ultimately, undermining the intention behind the user empowerment tools in clause 14(2) by preventing users from accessing helpful content, and disincentivising users from using the features.

In addition, the category

“provides false information about climate change”

places a requirement on private companies to be the arbiters of truth on subjective and evolving issues. Those companies would be responsible for determining what types of legal content were considered false information, which poses a risk to freedom of expression and risks silencing genuine debate.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Did the Minister just say that climate change is subjective?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

No, not about whether climate change is happening, but we are talking about a wide range. “Provides false information”—how do the companies determine what is false? I am not talking about the binary question of whether climate change is happening, but climate change is a wide-ranging debate. “Provides false information” means that someone has to determine what is false and what is not. Basically, the amendment outsources that to the social media platforms. That is not appropriate.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

It is a pleasure to serve under your chairship, Dame Angela. With your permission, I will take this opportunity to make some broad reflections on the Government’s approach to the new so-called triple-shield protection that we have heard so much about, before coming on to the amendment tabled in my name in the group.

Broadly, Labour is disappointed that the system-level approach to content that is harmful to adults is being stripped from the Bill and replaced with a duty that puts the onus on the user to keep themselves safe. As the Antisemitism Policy Trust among others has argued, the two should be able to work in tandem. The clause allows a user to manage what harmful material they see by requiring the largest or most risky service providers to provide tools to allow a person in effect to reduce their likelihood of encountering, or to alert them to, certain types of material. We have concerns about the overall approach of the Government, but Labour believes that important additions can be made to the list of content where user-empowerment tools must be in place, hence our amendment (a) to Government amendment 15.

In July, in a little-noticed written ministerial statement, the Government produced a prototype list of content that would be harmful to adults. The list included priority content that category 1 services need to address in their terms and conditions: online abuse and harassment—mere disagreement with another’s point of view would not reach the threshold for harmful content, and so would not be covered; circulation of real or manufactured intimate images without the subject’s consent; content promoting self-harm; content promoting eating disorders; legal suicide content; and harmful health content that is demonstrably false, such as urging people to drink bleach to cure cancer.

We have concerns about whether listing those harms in the Bill is the most effective mechanism, mostly because we feel that the list should be more flexible and able to change according to the issues of the day, but it is clear that the Government will continue to pursue this avenue despite some very worrying gaps. With that in mind, will the Minister clarify what exactly underpins that list if there have been no risk assessments? What was the basis for drawing up that specific list? Surely the Government should be implored to publish the research that determined the list, at the very least.

I recognise that the false communications offence has remained in the Bill, but the list in Government amendment 15 is not exhaustive. Without the additions outlined in our amendment (a) to amendment 15, the list will do little to tackle some of the most pressing harms of our time, some of which we have already heard about today.

I am pleased that the list from the written ministerial statement has more or less been reproduced in amendment 15, under subsection (2), but there is a key and unexplained omission that our amendment (a) to it seeks to correct: the absence of the last point, on harmful health content. Amendment (a) seeks to reinsert such important content into the Bill directly. It seems implausible that the Government failed to consider the dangerous harm that health misinformation can have online, especially given that back in July they seemed to have a grasp of its importance by including it in the original list.

We all know that health-related misinformation and disinformation can significantly undermine public health, as we have heard. We only have to cast our minds back to the height of the coronavirus pandemic to remind ourselves of how dangerous the online space was, with anti-vax scepticism being rife. Many groups were impacted, including pregnant women, who received mixed messages about the safety of covid vaccination, causing widespread confusion, fear and inaction. By tabling amendment (a) to amendment 15, we wanted to understand why the Government have dropped that from the list and on what exact grounds.

In addition to harmful health content, our amendment (a) to amendment 15 would also add to the list content that incites hateful extremism and provides false information about climate change, as we have heard. In early written evidence from Carnegie, it outlined how serious the threat of climate change disinformation is to the UK. Malicious actors spreading false information on social media could undermine collective action to combat the threats. At present, the Online Safety Bill is not designed to tackle those threats head on.

We all recognise that social media is an important source of news and information for many people, and evidence is emerging of its role in climate change disinformation. The Centre for Countering Digital Hate published a report in 2021 called “The Toxic Ten: How ten fringe publishers fuel 69% of digital climate change denial”, which explores the issue further. Further analysis of activity on Facebook around COP26 undertaken by the Institute for Strategic Dialogue demonstrates the scale of the challenge in dealing with climate change misinformation and disinformation. The research compared the levels of engagement generated by reliable, scientific organisations and climate-sceptic actors, and found that posts from the latter frequently received more traction and reach than the former, which is shocking. For example, in the fortnight in which COP26 took place, sceptic content garnered 12 times the level of engagement that authoritative sources did on the platform, and 60% of the sceptic posts analysed could be classified as actively and explicitly attacking efforts to curb climate change, which just goes to show the importance of ensuring that climate change disinformation is also included in the list in Government amendment 15.

Our two amendments—amendment (a) to amendment 15, and amendment (a) to amendment 16—seek to ensure that the long-standing omission of hateful extremism from the Bill is put right here as a priority. There is increasing concern about extremism leading to violence and death that does not meet the definition of terrorism. The internet and user-to-user services play a central role in the radicalisation process, yet the Online Safety Bill does not cover extremism.

Colleagues may be aware that Sara Khan, the former lead commissioner for countering extremism, provided a definition of extremism for the Government in February 2021, but there has been no response. The issue has been raised repeatedly by Members across the House, including by my hon. Friend the Member for Plymouth, Sutton and Devonport (Luke Pollard), following the tragic murders carried out by a radicalised incel in his constituency.

Amendment (a) to amendment 16 seeks to bring a formal definition of hateful extremism into the Bill and supports amendment (a) to amendment 15. The definition, as proposed by Sara Khan, who was appointed as Britain’s first countering extremism commissioner in 2018, is an important first step in addressing the gaps that social media platforms and providers have left open for harm and radicalisation.

Social media platforms have often been ineffective in removing other hateful extremist content. In November 2020, The Guardian reported that research from the Centre for Countering Digital Hate had uncovered how extremist merchandise had been sold on Facebook and Instagram to help fund neo-Nazi groups. That is just one of a huge number of instances, and it goes some way to suggest that a repeatedly inconsistent and ineffective approach to regulating extremist content is the one favoured by some social media platforms.

I hope that the Minister will seriously consider the amendments and will see the merits in expanding the list in Government amendment 15 to include these additional important harms.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Thank you for chairing the meeting this afternoon, Dame Angela. I agree wholeheartedly with the amendments tabled by the Labour Front-Bench team. It is important that we talk about climate change denial and what we can do to ensure people are not exposed to that harmful conspiracy theory through content. It is also important that we do what we can to ensure that pregnant women, for example, are not told not to take the covid vaccine or that parents are not told not to vaccinate their children against measles, mumps and rubella. We need to do what we can to ensure measures are in place.

I appreciate the list in Government amendment 15, but I have real issues with this idea of a toggle system—of being able to switch off this stuff. Why do the Government think people should have to switch off the promotion of suicide content or content that promotes eating disorders? Why is it acceptable that people should have to make an active choice to switch that content off in order to not see it? People have to make an active choice to tick a box that says, “No, I don’t want to see content that is abusing me because of my religion,” or “No, I don’t want to see content that is abusing me because of my membership of the LGBT community.” We do not want people to have to look through the abuse they are receiving in order to press the right buttons to switch it off. As the hon. Member for Don Valley said, people should be allowed to say what they want online, but the reality is that the extremist content that we have seen published online is radicalising people and bringing them to the point that they are taking physical action against people in the real, offline world as well as taking action online.

--- Later in debate ---
Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I rise briefly to say that the introduction of the shields is a significant additional safety measure in the Bill and shows that the Government have thought about how to improve certain safety features as the Bill has progressed.

In the previous version of the Bill, as we have discussed at length, there were the priority legal offences that companies had to proactively identify and mitigate, and there were the measures on transparency and accountability on the terms of service. However, if pieces of content fell below the threshold for the priority legal offences or were not covered, or if they were not addressed in the terms of service, the previous version of the Bill never required the companies to act in any particular way. Reports might be done by Ofcom raising concerns, but there was no requirement for further action to be taken if the content was not a breach of platform policies or the priority safety duties.

The additional measure before us says that there may be content where there is no legal basis for removal, but users nevertheless have the right to have that content blocked. Many platforms offer ad tools already—they are not perfect, but people can opt in to say that they do not want to see ads for particular types of content—but there was nothing for the types of content covered by the Online Safety Bill, where someone could say, “I want to make sure I protect myself from seeing this at all,” and then, for the more serious content, “I expect the platforms to take action to mitigate it.” So this measure is an important additional level of protection for adult users, which allows them to give themselves the certainty that they will not see certain types of content and puts an important, additional duty on the companies themselves.

Briefly, on the point about gambling, the hon. Member for Aberdeen North is quite right to say that someone can self-exclude from gambling at the betting shop, but the advertising code already requires that companies do not target people who have self-excluded with advertising messages. As the Government complete their online advertising review, which is a separate piece of work, it is important that that is effectively enforced on big platforms, such as Facebook and Google, to ensure that they do not allow companies to advertise to vulnerable users in breach of the code. However, that can be done outside the Bill.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

My concern is not just about advertising content or stuff that is specifically considered as an advert. If someone put up a TikTok video about how to cheat an online poker system, that would not be classed as an advert and therefore would not be caught. People would still be able to see it, and could not opt out.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I totally appreciate the point that the hon. Lady makes, which is a different one. For gambling, the inducement to act straightaway often comes in the form of advertising. It usually comes in the form of free bets and immediate inducements to act. People who have self-excluded should not be targeted in that way. We need to ensure that that is rigorously enforced on online platforms too.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The amendments relate to the tools proposed in clause 14, which as we know will be available for individuals to use on platforms to protect themselves from harm. As the Minister knows, Labour fundamentally disagrees with that approach, which will place the onus on the user, rather than the platform, to protect themselves from harmful content. It is widely recognised that the purpose of this week’s Committee proceedings is to allow the Government to remove the so-called “legal but harmful” clauses and replace them with the user empowerment tool option. Let us be clear that that goes against the very essence of the Bill, which was created to address the particular way in which social media allows content to be shared, spread and broadcast around the world at speed.

This approach could very well see a two-tier internet system develop, which leaves those of us who choose to utilise the user empowerment tools ignorant of harmful content perpetuated elsewhere for others to see. The tools proposed in clause 14, however, reflect something that we all know to be true: that there is some very harmful content out there for us all to see online. We can all agree that individuals should therefore have access to the appropriate tools to protect themselves. It is also right that providers will be required to ensure that adults have greater choice and control over the content that they see and engage with, but let us be clear that instead of focusing on defining exactly what content is or is not harmful, the Bill should focus on the processes by which harmful content is amplified on social media.

However, we are where we are, and Labour believes that it is better to have the Bill over the line, with a regulator in place with some powers, than simply to do nothing at all. With that in mind, we have tabled the amendment specifically to force platforms to have safety tools on by default. We believe that the user empowerment tools should be on by default and that they must be appropriately visible and easy to use. We must recognise that for people at a point of crisis—if a person is suffering with depressive or suicidal thoughts, or with significant personal isolation, for example—the tools may not be at the forefront of their minds if their mental state is severely impacted.

On a similar point, we must not patronise the public. Labour sees no rational argument why the Government would not support the amendment. We should all assume that if a rational adult is able to easily find and use these user empowerment tools, then they will be easily able to turn them off if they choose to do so.

The Minister knows that I am not in the habit of guessing but, judging from our private conversations, his rebuttal to my points may be because he believes it is not the Government’s role to impose rules directly on platforms, particularly when they impact their functionality. However, for Labour, the existence of harm and the importance of protecting people online tips the balance in favour of turning these user empowerment tools on by default. We see no negative reason why that should not be the case, and we now have a simple amendment that could have a significantly positive impact.

I hope the Minister and colleagues will reflect strongly on these amendments, as we believe they are a reasonable and simple ask of platforms to do the right thing and have the user empowerment tools on by default.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Once again, this is a very smart amendment that I wish I had thought of myself, and I am happy to support it. The case made by those campaigning for freedom of speech at any cost is about people being able to say what they want to say, no matter how harmful that may be. It is not about requiring me, or anyone else, to read those things—the harmful bile, the holocaust denial or the promotion of suicide that is spouted. It is not freedom of speech to require someone else to see and read such content, so I cannot see any potential argument that the Government could come up with against these amendments.

The amendments have nothing to do with freedom of speech or with limiting people’s ability to say whatever they want to say or to promote whatever untruths they want to promote. However, they are about making sure that people are protected and that they are starting from a position of having to opt in if they want to see harmful content. If I want to see content about holocaust denial—I do not want to see that, but if I did—I should have to clearly tick a button that says, “Yes, I am pretty extreme in my views and I want to see things that are abusing people. I want to see that sort of content.” I should have to opt in to be able to see that.

There are a significant number of newspapers out there. I will not even pick up a lot of them because there is so much stuff in them with which I disagree, but I can choose not to pick them up. I do not have that newspaper served to me against my will because I have the opportunity to choose to opt out from buying it. I do not have to go into the supermarket and say, “No, please do not give me that newspaper!” I just do not pick it up. If we put the Government’s proposal on its head and do what has been suggested in the Opposition amendments, everyone would be in a much better position.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

I note that many providers of 4G internet, including the one I have on my own phone, already block adult content. Essentially, if people want to look at pornography or other forms of content, they have to proactively opt in to be allowed to see it. Would it not make sense to make something as straightforward as that, which already exists, into the model that we want on the internet more widely, as opposed to leaving it to EE and others to do?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I absolutely agree. Another point that has been made is that this is not creating undue burden; the Government are already creating the burden for companies—I am not saying that it is a bad burden, but the Government are already creating it. We just want people to have the opportunity to opt into it, or out of it. That is the position that we are in.

--- Later in debate ---

Division 4

Ayes: 6

Noes: 8

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I beg to move amendment 101, in clause 14, page 14, line 17, at end insert—

“(6A) A duty to ensure features and provisions in subsections (2), (4) and (6) are accessible and understandable to adult users with learning disabilities.”

This amendment creates a duty that user empowerment functions must be accessible and understandable to adult users with learning disabilities.

This issue was originally brought to my attention by Mencap. It is incredibly important, and it has potentially not been covered adequately by either our previous discussions of the Bill or the Bill itself. The amendment is specifically about ensuring that available features are accessible to adult users with learning disabilities. An awful lot of people use the internet, and people should not be excluded from using it and having access to safety features because they have a learning disability. That should not be the case, for example, when someone is trying to find how to report something on a social media platform. I had an absolute nightmare trying to report a racist gif that was offered in the list of gifs that came up. There is no potential way to report that racist gif to Facebook because it does not take responsibility for it, and GIPHY does not take responsibility for it because it might not be a GIPHY gif.

It is difficult to find the ways to report some of this stuff and to find some of the privacy settings. Even when someone does find the privacy settings, on a significant number of these platforms they do not make much sense—they are not understandable. I am able to read fairly well, I would think, and I am able to speak in the House of Commons, but I still do not understand some of the stuff in the privacy features found on some social media sites. I cannot find how to toggle off the things that I want to toggle off, or how to set the level of accessibility or privacy that I want, particularly on social media platforms; I will focus on those for the moment. The Bill will not achieve even its intended purpose if all people using these services cannot access or understand the safety features and user empowerment tools.

I am quite happy to talk about the difference between the real world and the online world. My online friends have no problem with me talking about the real world as if it is something different, because it is. In the real world, we have a situation where things such as cuckooing take place and people take advantage of vulnerable adults. Social services, the police and various organisations are on the lookout for that and try to do what they can to put protections in place. I am asking for more parity with the real world here. Let us ensure that we have the protections in place, and that people who are vulnerable and taken advantage of far too often have access to those tools in order to protect themselves. That is a particularly reasonable ask.

Let us say that somebody with a learning disability particularly likes cats; the Committee may have worked out that I also particularly like cats. Let us say that they want to go on TikTok or YouTube and look at videos of cats. They have to sign up to watch videos of cats. They may not have the capacity or understanding to know that there might be extreme content on those sites. They may not be able to grasp that. It may never cross their minds that there could be extreme content on that site. When they are signing up to TikTok, they should not have to go and find the specific toggle to switch off eating disorder content. All they had thought about was that this is a cool place to look at videos of cats.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

In view of the Minister’s statement, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Amendments made: 13, in clause 14, page 14, line 26, leave out paragraph (a) and insert—

“(a) the likelihood of adult users encountering content to which subsection (2) applies by means of the service, and”

This amendment is about factors relevant to the proportionality of measures to comply with the duty in subsection (2). The new wording replaces a reference to an adults’ risk assessment, as adults’ risk assessments are no longer required (see Amendment 6 which removes clause 12).

Amendment 14, in clause 14, page 14, line 29, leave out “a” and insert “the”.—(Paul Scully.)

This is a technical amendment consequential on Amendment 13.

Amendment (a) proposed to amendment 15: (a), at end insert—

“(8E) Content is within this subsection if it—

(a) incites hateful extremism,

(b) provides false information about climate change, or

(c) is harmful to health.”—(Alex Davies-Jones.)

Question put, That the amendment be made.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Again, I will keep my comments on clause 19 brief, as we broadly support the intentions behind the clause and the associated measures in the grouping. In the previous Bill Committee, my hon. Friend the Member for Worsley and Eccles South (Barbara Keeley) spoke at length about this important clause, which relates to the all-important complaints procedures of social media platforms and companies.

During the previous Committee, Labour tabled amendments that would have empowered more individuals to make a complaint about search content in the event of non-compliance. In addition, we wanted an external complaints option for individuals seeking redress. Sadly, all those amendments were voted down by the last Committee, but I must once again press the Minister on those points, particularly in the context of the new amendments that have been tabled.

Without redress for individual complaints, once internal mechanisms have been exhausted, victims of online abuse could be left with no further options. Consumer protections could be compromised and freedom of expression, with which the Government seem to be borderline obsessed, could be infringed for people who feel that their content has been unfairly removed.

Government new clause 2 deals with the meaning of references to

“restricting users’ access to content”,

in particular by excluding restrictions resulting from the use of user empowerment tools as described in clause 14. We see amendments 22 and 59 as important components of new clause 2, and are therefore more than happy to support them. However, I reiterate to the Minister and place on the record once again the importance of introducing an online safety ombudsman, which we feel is crucial to new clause 2. The Joint Committee recommended the introduction of such an ombudsman, who would consider complaints when internal routes of redress had not resulted in resolution, had failed to address risk and had led to significant and demonstrable harm. As new clause 2 relates to restricting users’ access to content, we must also ensure that there is an appropriate channel for complaints if there is an issue that users wish to take up around restrictions in accessing content.

By now, the Minister will be well versed in my thoughts on the Government’s approach, and on the reliance on the user empowerment tool approach more broadly. It is fundamentally an error to pursue a regime that is so content-focused. Despite those points, we see the merits in Government amendments 22 and 59, and in new clause 2, so have not sought to table any further amendments at this stage.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I am slightly confused, and would appreciate a little clarification from the Minister. I understand what new clause 2 means; if the hon. Member for Pontypridd says that she does not want to see content of a certain nature, and I put something of that nature online, I am not being unfairly discriminated against in any way because she has chosen to opt out of receiving that content. I am slightly confused about the downgrading bit.

I know that an awful lot of platforms use downgrading when there is content that they find problematic, or something that they feel is an issue. Rather than taking that content off the platform completely, they may just no longer put it in users’ feeds, for example; they may move it down the priority list, and that may be part of what they already do to keep people safe. I am not trying to criticise what the Government are doing, but I genuinely do not understand whether that downgrading would still be allowed, whether it would be an issue, and whether people could complain about their content being downgraded, or taken off users’ feeds, because the platform was a bit concerned about it and needed to check it out and work out what was going on.

Some companies, if they think that videos have been uploaded by people who are too young to use the platform, or by a registered child user of the platform, will not serve that content to everybody’s feeds. I will not be able to see something in my TikTok feed that was published by a user who is 13, for example, because there are restrictions on how TikTok deals with and serves that content, in order to provide increased protection and the safety that they want on their services.

Will it still be acceptable for companies to have their own internal downgrading system, in order to keep people safe, when content does not necessarily meet an illegality bar or child safety duty bar? The Minister has not used the phrase “market forces”; I think he said “commercial imperative”, and he has talked a lot about that. Some companies and organisations use downgrading to improve the systems on their site and to improve the user experience on the platform. I would very much appreciate it if the Minister explained whether that will still be the case. If not, will we all have a worse online experience as a result?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I will have a go at that, but I am happy to write to the hon. Lady if I do not respond as fully as she wants. Down-ranking content is a moderation action, as she says, but it is not always done just to restrict access to content; there are many reasons why people might want to do it. Through these changes, we are saying that the content is not actually being restricted; it can still be seen if it is searched for or otherwise encountered. That is consistent with the clarification.

--- Later in debate ---
Labour has not sought to amend the clause, but once again I must reiterate a point that we have pushed on numerous occasions—namely, the importance of requiring in-scope services to publish their risk assessments. The Government have refused on a number of occasions to recognise the significance of that level of transparency, but it could bring great benefits, as it would allow researchers and civil society to track harms and hold services to account. Again, I push the Minister and urge him to ensure that the risk assessments are published.
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Specifically on the issue that was just raised, there were two written ministerial statements on the Online Safety Bill. The first specifically said that an amendment would

“require the largest platforms to publish summaries of their risk assessments for illegal content and material that is harmful to children, to allow users and empower parents to clearly understand the risks presented by these services and the approach platforms are taking to children’s safety”.—[Official Report, 29 November 2022; Vol. 723, c. 31WS.]

Unless I have completely missed an amendment that has been tabled for this Committee, my impression is that that amendment will be tabled in the Lords and that details will be made available about how exactly the publishing will work and which platforms will be required to publish.

I would appreciate it if the Minister could provide more clarity about what that might look like, and about which platforms might have to publish their assessments. I appreciate that that will be scrutinised in the Lords but, to be fair, this is the second time that the Bill has been in Committee in the Commons. It would be helpful if we could be a bit more sighted on what exactly the Government intend to do—meaning more than the handful of lines in a written ministerial statement—because then we would know whether the proposal is adequate, or whether we would have to ask further questions in order to draw it out and ensure that it is published in a certain form. The more information the Minister can provide, the better.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I think we all agree that written records are hugely important. They are important as evidence in cases where Ofcom is considering enforcement action, and a company’s compliance review should be done regularly, especially before it makes changes to its service.

The Bill does not intend to place excessive burdens on small and low-risk businesses. As such, clause 21 provides Ofcom with the power to exempt certain types of service from the record-keeping and review duties. However, the details of any exemptions must be published.

To half-answer the point made by the hon. Member for Aberdeen North, the measures will be brought to the Lords, but I will endeavour to keep her up to date as best we can so that we can continue the conversation. We have served together on several Bill Committees, including on technical Bills that required us to spend several days in Committee—although they did not come back for re-committal—so I will endeavour to keep her and, indeed, the hon. Member for Pontypridd, up to date with developments.

Question put and agreed to. 

Clause 21, as amended, accordingly ordered to stand part of the Bill.

Clause 30

Duties about freedom of expression and privacy

Amendments made: 36, in clause 30, page 31, line 31, after “have” insert “particular”.

This amendment has the result that providers of regulated search services must have particular regard to freedom of expression when deciding on and implementing safety measures and policies.

Amendment 37, in clause 30, page 31, line 34, after “have” insert “particular”.—(Paul Scully.)

This amendment has the result that providers of regulated search services must have particular regard to users’ privacy when deciding on and implementing safety measures and policies.

Clause 30, as amended, ordered to stand part of the Bill.

Clause 46

Relationship between duties and codes of practice

Amendments made: 38, in clause 46, page 44, line 27, after “have” insert “particular”.

This amendment has the result that providers of services who take measures other than those recommended in codes of practice in order to comply with safety duties must have particular regard to freedom of expression and users’ privacy.

Amendment 39, in clause 46, page 45, line 12, leave out paragraph (c).

This amendment is consequential on Amendment 7 (removal of clause 13).

Amendment 40, in clause 46, page 45, line 31, at end insert “, or

(ii) a duty set out in section 14 (user empowerment);”.—(Paul Scully.)

This amendment has the effect that measures recommended in codes of practice to comply with the duty in clause 14 are relevant to the question of whether a provider is complying with the duties in clause 20(2) and (3) (having regard to freedom of expression and users’ privacy).

Question proposed, That the clause, as amended, stand part of the Bill.

--- Later in debate ---
Sarah Owen Portrait Sarah Owen
- Hansard - - - Excerpts

I want to add to the brilliant points made by my hon. Friend the shadow Minister, in particular on the continually changing nature of market forces, which the Minister himself referenced. We want innovation. We want the tech companies to innovate—preferably ones in the UK—but we do not want to be playing catch-up as we are now, making legislation retrospectively to right wrongs that have taken place because our legislative process has been too slow to deal with the technological changes and the changes in social media, in apps, and with how we access data and communicate with one another online. The bare minimum is a biannual report.

Within six months, if a new piece of technology comes up, it does not simply stay with one app or platform; that technology will be leapfrogged by others. Such technological advances can take place at a very rapid pace. The transparency aspect is important, because people should have a right to know what they are using and whether it is safe. We as policy makers should have a right to know clearly whether the legislation that we have introduced, or the legislation that we want to amend or update, is effective.

If we look at any other approach that we take to protect the health and safety of the people in our country—the people we all represent in our constituencies—we always say that prevention is better than cure. At the moment, without transparency, and without researchers being able to update the information we need to see, we will constantly be playing catch-up with digital tech.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

This may be the only place in the Bill where I do not necessarily agree wholeheartedly with the Labour Front Benchers. I agree with the vast majority of what was said, but I have some concerns about making it mandatory for transparency reports to be public in all circumstances, because there are circumstances in which that would simply highlight loopholes and allow people to exploit them in ways that we do not want.

Specifically on the regularity of reporting and the level of transparency: given that the Minister is keen on the commercial imperative and on ensuring that people are safe, we need a higher level of transparency than we currently see from the platforms. There is a very good case for some of the transparency reporting to be made public, particularly for the very largest platforms to be required to make it public, or to make sections of it public.

I want to talk about the speed of change to the terms of service and about proportionality. If Ofcom could request transparency reporting only annually, imagine that it received transparency information three days before Elon Musk took over Twitter. Twitter would be a completely different place three days later, and Ofcom would be unable to ask for more transparency information for a whole year, by which point a significant amount of damage could have been done. We have seen that the terms of service can change quickly. Ofcom would not have the flexibility to ask for an updated transparency report, even if drastic changes were made to the services.

Another thing slightly concerns me about doing this annually and not allowing a bit more flexibility. Let us say that a small platform that none of us has ever heard of, such as Mastodon, shoots to prominence overnight. Let us also say that Ofcom, regulating Mastodon as a small platform, had made a request for transparency information shortly before Elon Musk took over Twitter and people migrated to Mastodon. Mastodon would now be suffering from very different issues from those it had when it had a small number of users, compared with the significant number that it has now. It would have changed dramatically, yet Ofcom would not have the flexibility to seek that information. Platforms in the online world can have sudden, stellar increases in popularity overnight. Some have been bubbling along for ages with hardly anybody using them; not all of them are brand-new platforms that suddenly shoot to prominence. The lack of flexibility is a problem.

Lastly, I agree about researchers being able to access the transparency information provided. It is really important that we recognise that Ofcom is not the only expert. Ofcom has a huge amount of expertise, and it is massively increasing its staff numbers to cope with these issues, but the reality is that those staff are not academic researchers; they cannot look at every issue, and they are not necessarily the most prominent experts in the field of child protection, for example. That is not to take away from the expertise in Ofcom, but we could allow it to ask a regulated group of researchers to look at the information and point out any issues that may not have been spotted, particularly given the volume of transparency reports that there are likely to be.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

The hon. Lady makes an important point. In terms of transparency, the question for me is, what are the Government worried about? Surely part of the Bill is about finding out what is really going on, and the only way that we will do that is by having access to the information. The more transparency, the better. The hon. Lady is right that having experts who can research what is going on is fundamental. If there is a concern around the workload for Ofcom, that is a separate issue that the Minister needs to address, but surely the more work that is done in terms of research and transparency, the better.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

We have seen that just from the people from external organisations who have contacted us about the Bill. The expertise that they have brought to the table, which we ourselves do not have, has significantly improved the debate and, hopefully, the Bill, even before the consultations that encouraged the Minister to make the Bill better. Surely that is why the pre-legislative scrutiny Committee looked at the Bill—in order to improve it and to get expert advice. I still think that specific access to expertise to analyse the transparency reports has not been covered adequately.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

Annual transparency reporting is an important part of how the system will work. Transparency is one of the most important aspects of how the Online Safety Bill works, because without it companies can hide behind the transparency reports they produce at the moment, which give no transparency at all. For example, Facebook and YouTube report annually that their AI finds 95% of the hate speech they remove, but Frances Haugen said that they removed only 5% of the hate speech on their platforms. So what the transparency reports really tell us is that their AI finds 95% of the 5% that they actually remove, and that is one of the fundamental problems. The Bill gives the regulator the power to know, and the regulator then has to make informed decisions based on the information it has access to.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I will keep my comments on this grouping brief, because I have already raised our concerns and our overarching priority on transparency reports in the previous debate, which was a good one, with all Members highlighting the need for transparency and reporting in the Bill. With the Chair’s permission, I will make some brief comments on Government amendment 72 before addressing Government amendments 73 and 75.

It will come as no surprise to the Minister that amendment 72, which defines relevant content for the purposes of schedule 8, has a key omission—specifying priority content harmful to adults. For reasons we have covered at length, we think that it is a gross mistake on the Government’s side to attempt to water down the Bill in this way. If the Minister is serious about keeping adults safe online, he must reconsider this approach. However, we are happy to see amendments 73 and 75, which define consumer content and regulated user-generated content. It is important for all of us—whether we are politicians, researchers, academics, civil society, stakeholders, platforms, users or anyone else—that these definitions are in the Bill so that, when it is passed, it can be applied properly and at pace. That is why we have not sought to amend this grouping.

I must press the Minister to respond on the issues around relevant content as outlined in amendment 72. We feel strongly that more needs to be done to address this type of content and its harm to adults, so I would be grateful to hear the Minister’s assessment of how exactly these transparency reports will report back on this type of harm, given its absence from this group of amendments and the lack of a definition.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I am pleased to see the list included, and the number of things on which Ofcom can ask for more information. I have a specific question about amendment 75, which talks about regulated user-generated content and says that it has the same meaning as in the interpretation of part 3 under clause 50. The Minister may or may not know that there are concerns about clause 50(5), which relates to

“One-to-one live aural communications”.

One-to-one live aural communications are exempted. I understand that that is because the Government do not believe that telephony services, for example, should be part of the Online Safety Bill—that is a pretty reasonable position for them to take. However, allowing one-to-one live aural communications not to be regulated means that if someone is using voice chat in Fortnite, for example, and there are only two people on the team that they are on, or if someone is using voice chat in Discord and there are only two people online on the channel at that time, that is completely unregulated and not taken into account by the Bill.

I know that that is not the intention of the Bill, which is intended to cover user-generated content online. The exemption is purely in place for telephony services, but it is far wider than the Government intend it to be. With the advent of more and more people using virtual reality technology, for example, we will have more and more aural communication between just two people, and that needs to be regulated by the Bill. We cannot just allow a free-for-all.

If we have child protection duties, for example, they need to apply to all user-generated content and not exempt it specifically because it is a live, one-to-one aural communication. Children are still at significant risk from this type of communication. The Government have put this exemption in because they consider such communication to be analogous to telephony services, but it is not. It is analogous to telephony services if we are talking about a voice call on Skype, WhatsApp or Signal—those are voice calls, just like telephone services—but we are talking about a voice chat that people can have with people who they do not know, whose phone number they do not know and who they have no sort of relationship with.

Some of the Discord servers are pretty horrendous, and some of the channels are created by social media influencers or people who have pretty extreme views in some cases. We could end up with a case where the Discord server and its chat functions are regulated, but if aural communication or a voice chat is happening on that server, and there are only two people online because it is 3 o’clock in the morning where most of the people live and lots of them are asleep, that would be exempted. That is not the intention of the Bill, but the Government have not yet fixed this. So I will make one more plea to the Government: will they please fix this unintended loophole, so that it does not exist? It is difficult to do, but it needs to be done, and I would appreciate it if the Minister could take that into consideration.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I do not believe that the provisions in terms of Ofcom’s transparency powers have been watered down. It is really important that the Bill’s protection for adults strikes the right balance with its protections for free speech, which is why we have replaced the “legal but harmful” clause. I know we will not agree on that, but there are more new duties that will make platforms more accountable. Ofcom’s transparency powers will enable it to assess compliance with the new safety duties and hold platforms accountable for enforcing their terms of service to keep users safe. Companies will also have to report on the measures that they have in place to tackle illegal content or activity and content that is harmful for children, which includes proactive steps to address offences such as child sexual exploitation and abuse.

The legislation will set out high-level categories of information that companies may be required to include in their transparency reports, and Ofcom will then specify the information that service providers will need to include in those reports, in the form of a notice. Ofcom will consider companies’ resources and capacity, service type and audience in determining what information they will need to include. It is likely that the information that is most useful to the regulator and to users will vary between different services. To ensure that the transparency framework is proportionate and reflects the diversity of services in scope, the transparency reporting requirements set out in the Ofcom notice are likely to differ between those services, and the Secretary of State will have powers to update the list of information that Ofcom may require to reflect any changes of approach.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

The in-game chat that children use is overwhelmingly voice chat. Children do not type if they can possibly avoid it. I am sure that that is not the case for all children, but it is for most children. Aural communication is used if someone is playing Fortnite duos, for example, with somebody they do not know. That is why that needs to be included.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I very much get that point. It is not something that I do, but I have certainly seen it myself. I am happy to chat to the hon. Lady to ensure that we get it right.

Amendment 72 agreed to.

Amendments made: 73, in schedule 8, page 206, line 6, at end insert—

“‘consumer content’ has the same meaning as in Chapter 2A of Part 4 (see section (Interpretation of this Chapter)(3));”.

This amendment defines “consumer content” for the purposes of Schedule 8.

Amendment 74, in schedule 8, page 206, leave out lines 7 and 8.

This amendment is consequential on Amendment 41 (removal of clause 55).

Amendment 75, in schedule 8, page 206, line 12, at end insert—

“‘regulated user-generated content’ has the same meaning as in Part 3 (see section 50), and references to such content are to content that is regulated user-generated content in relation to the service in question;”.—(Paul Scully.)

This amendment defines “regulated user-generated content” for the purposes of Schedule 8.

Schedule 8, as amended, agreed to.

Ordered, That further consideration be now adjourned. —(Mike Wood.)

ONLINE SAFETY BILL (First sitting)

Kirsty Blackman Excerpts
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022


Public Bill Committees
None Portrait The Chair
- Hansard -

We now begin line-by-line consideration of the Bill. Owing to the unusual nature of today’s proceedings on recommittal, which is exceptional, I need to make a few points.

Only the recommitted clauses and schedules, and amendments and new clauses relating to them, are in scope for consideration. The selection list, which has been circulated to Members and is available in the room, outlines which clauses and schedules those are. Any clause or schedule not on the list is not in scope for discussion. Basically, that means that we cannot have another Second Reading debate. Moreover, we cannot have a further debate on any issues that have been debated already on Report on the Floor of the House. As I say, this is unusual; in fact, I think it is probably a precedent—“Erskine May” will no doubt wish to take note.

The selection list also shows the selected amendments and how they have been grouped. Colleagues will by now be aware that we group amendments by subject for debate. They are not necessarily voted on at the time of the completion of the debate on that group, but as we reach their position in the Bill. Do not panic; we have expert advice to ensure that we do not miss anything—at least, I hope we have.

Finally, only the lead amendment is decided on at the end of the debate. If a Member wishes to move any other amendment in the group, please let the Chair know. Dame Angela or I will not necessarily select it for a Division, but we need to know if Members wish to press it to one. Otherwise, there will be no Division on the non-lead amendments.

Clause 11

Safety duties protecting children

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - -

I beg to move amendment 98, in clause 11, page 10, line 17, at end insert

“, and—

(c) mitigate the harm to children caused by habit-forming features of the service by consideration and analysis of how processes (including algorithmic serving of content, the display of other users’ approval of posts and notifications) contribute to development of habit-forming behaviour.”

This amendment requires services to take or use proportionate measures to mitigate the harm to children caused by habit-forming features of a service.

Thank you, Sir Roger, for chairing this recommitted Bill Committee. I will not say that it is nice to be back discussing the Bill again; we had all hoped to have made more progress by now. If you will indulge me for a second, I would like to thank the Clerks, who have been massively helpful in ensuring that this quick turnaround could happen and that we could table the amendments in a sensible form.

Amendment 98 arose from comments and evidence from the Royal College of Psychiatrists highlighting that a number of platforms, and particularly social media platforms such as TikTok and Facebook, generally encourage habit-forming behaviour or have algorithms that encourage it. Such companies are there to make money—that is what companies do—so they want people to linger on their sites and to spend as much time there as possible.

I do not know how many hon. Members have spent time on TikTok, but if they do, and they enjoy some of the cat videos, for instance, the algorithm will know and will show them more videos of cats. They will sit there and think, “Gosh, where did the last half-hour go? I have been watching any number of 20-second videos about cats, because they constantly come up.” Social media sites work by encouraging people to linger on the site and to spend the time dawdling and looking at the advertisements, which make the company additional revenue.

That is good for capitalism and for the company’s ability to make money, but the issue, particularly in relation to clause 11, is how that affects children. Children may not have the necessary filters; they may not have the ability that we have to put our phones down—not that we always manage to do so. That ability and decision-making process may not be as refined in children as in adults. Children can be sucked into the platforms by watching videos of cats or of something far more harmful.

Sarah Owen Portrait Sarah Owen (Luton North) (Lab)
- Hansard - - - Excerpts

The hon. Member makes an excellent point about TikTok, but it also applies to YouTube. The platforms’ addictive nature has to do with the content. A platform does not just show a person a video of a cat, because that will not keep them hooked for half an hour. It has to show them a cat doing something extraordinary, and then a cat doing something even more extraordinary. That is why vulnerable people, especially children, get sucked into a dark hole. They click to see not just the same video but something more exciting, and then something even more exciting. That is the addictive nature of this.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

That is absolutely the case. We are talking about cats because I chose them to illustrate the situation, but people may look at content about healthy eating, and that moves on to content that encourages them to be sick. The way the algorithms step it up is insidious; they get more and more extreme, so that the linger time is increased and people do not get bored. It is important that platforms look specifically at their habit-forming features.

Charlotte Nichols Portrait Charlotte Nichols (Warrington North) (Lab)
- Hansard - - - Excerpts

A specific case on the platform TikTok relates to a misogynist who goes by the name of Andrew Tate, who has been banned from a number of social media platforms. However, because TikTok works by making clips shorter, which makes it more difficult for the company to identify some of this behaviour among users, young boys looking for videos of things that might interest them were very quickly shown misogynist content from Andrew Tate. Because they watched one video of him, they were then shown more and more. It is easy to see how the habit-forming behaviours built into platforms’ algorithms, which the hon. Lady identifies, can also be a means of quickly radicalising children into extreme ideologies.

None Portrait The Chair
- Hansard -

Order. I think we have the message. I have to say to all hon. Members that interventions are interventions, not speeches. If Members wish to make speeches, there is plenty of time.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Thank you, Sir Roger. I absolutely agree with the hon. Member for Warrington North. The platform works by stitching things together, so a video could have a bit of somebody else’s video in it, and that content ends up being shared and disseminated more widely.

This is not an attack on every algorithm. I am delighted to see lots of videos of cats—it is wonderful, and it suits me down to the ground—but the amendment asks platforms to analyse how those processes contribute to the development of habit-forming behaviour and to mitigate the harm caused to children by habit-forming features in the service. It is not saying, “You can’t use algorithms” or “You can’t use anything that may encourage people to linger on your site.” The specific issue is addiction—the fact that people will get sucked in and stay on platforms for hours longer than is healthy.

There is a demographic divide here. There is a significant difference between the online experiences of children whose parents are engaged in these issues and spend time—and have the time to spend—assisting them to use the internet, and the experiences of children who are generally not nearly as well off, whose parents may be working two or three jobs to try to keep their homes warm and keep food on the table, and whose level of supervision may therefore be far lower. We have a parental education gap, where parents are not able to instruct or teach their children a sensible way to use these things. A lot of parents have not used things such as TikTok and do not know how it works, so they are unable to teach their children.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - - - Excerpts

Does the hon. Lady agree that this feeds into the problem we have with the lack of a digital media literacy strategy in the Bill, which we have, sadly, had to accept? However, that makes it even more important that we protect children wherever we have the opportunity to do so, and this amendment is a good example of where we can do that.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

The hon. Lady makes an excellent point. This is not about mandating that platforms stop doing these things; it is about ensuring that they take this issue into account and that they agree—or that we as legislators agree—with the Royal College of Psychiatrists that we have a responsibility to tackle it. We have a responsibility to ask Ofcom to tackle it with platforms.

This comes back to the fact that we do not have a user advocacy panel, and groups representing children are not able to bring emerging issues forward adequately and effectively. Because of the many other inadequacies in the Bill, that is even more important than it was. I assume the Minister will not accept my amendment—that generally does not happen in Bill Committees—but if he does not, it would be helpful if he could give Ofcom some sort of direction of travel so that it knows it should take this issue into consideration when it deals with platforms. Ofcom should be talking to platforms about habit-forming features and considering the addictive nature of these things; it should be doing what it can to protect children. This threat has emerged only in recent years, and things will not get any better unless we take action.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - - - Excerpts

It is a privilege to see you back in the Chair for round 2 of the Bill Committee, Sir Roger. It feels slightly like déjà vu to return to line-by-line scrutiny of the Bill, which, as you said, Sir Roger, is quite unusual and unprecedented. Seeing this Bill through Committee is the Christmas gift that keeps on giving. Sequels are rarely better than the original, but we will give it a go. I have made no secret of my plans, and my thoughts on the Minister’s plans, to bring forward significant changes to the Bill, which has already been long delayed. I am grateful that, as we progress through Committee, I will have the opportunity to put on record once again some of Labour’s long-held concerns with the direction of the Bill.

I will touch briefly on clause 11 specifically before addressing the amendments to the clause. Clause 11 covers safety duties to protect children, and it is a key part of the Bill—indeed, it is the key reason many of us have taken a keen interest in online safety more widely. Many of us, on both sides of the House, have been united in our frustrations with the business models of platform providers and search engines, which have paid little regard to the safety of children over the years in which the internet has expanded rapidly.

That is why Labour has worked with the Government. We want to see the legislation get over the line, and we recognise—as I have said in Committee previously—that the world is watching, so we need to get this right. The previous Minister characterised the social media platforms and providers as entirely driven by finance, but safety must be the No. 1 priority. Labour believes that that must apply to both adults and children, but that is an issue for debate on a subsequent clause, so I will keep my comments on this clause brief.

The clause and Government amendments 1, 2 and 3 address the thorny issue of age assurance measures. Labour has been clear that we have concerns that the Government are relying heavily on the ability of social media companies to distinguish between adults and children, but age verification processes remain fairly complex, and that clearly needs addressing. Indeed, Ofcom’s own research found that a third of children have false social media accounts claiming that they are aged over 18. This is an area we certainly need to get right.

I am grateful to the many stakeholders, charities and groups working in this area. There are far too many to mention, but a special shout-out should go to Iain Corby from the Age Verification Providers Association, along with colleagues at the Centre to End All Sexual Exploitation and Barnardo’s, and the esteemed John Carr. They have all provided extremely useful briefings for my team and me as we have attempted to unpick this extremely complicated part of the Bill.

We accept that there are effective age checks out there, and many have substantial anti-evasion mechanisms, but it is the frustrating reality that this is the road the Government have decided to go down. As we have repeatedly placed on the record, the Government should have retained the “legal but harmful” provisions that were promised in the earlier iteration of the Bill. Despite that, we are where we are.

I will therefore put on the record some brief comments on the range of amendments on this clause. First, with your permission, Sir Roger, I will speak to amendments 98, 99—

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I do not think that a single number can be put on that, because it depends on the platform and the type of viewing; it is not easy to put a single number on it. What counts as an “appreciable number” will basically be for Ofcom, which will be the arbiter of all this, to identify. It comes back to what the hon. Member for Aberdeen North said about the direction that we, as she rightly said, want to give Ofcom. Ofcom has a range of powers already to help it assess whether companies are fulfilling their duties, including the power to require information about the operation of their algorithms. I would set the direction that the hon. Lady is looking for, to ensure that Ofcom uses those powers to the fullest and can look at the algorithms. We should bear in mind that social media platforms face criminal liability if they do not supply the information required by Ofcom to look under the bonnet.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

If platforms do not recognise that they have an issue with habit-forming features, even though we know they have, will Ofcom say to them, “Your risk assessment is insufficient. We know that the habit-forming features are really causing a problem for children”?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

We do not want to wait for the Bill’s implementation to start those conversations with the platforms. We expect companies to be transparent about their design practices that encourage extended engagement and to engage with researchers to understand the impact of those practices on their users.

The child safety duties in clause 11 apply across all areas of a service, including the way it is operated and used by children and the content present on the service. Subsection (4)(b) specifically requires services to consider the

“design of functionalities, algorithms and other features”

when complying with the child safety duties. Given the direction I have suggested that Ofcom has, and the range of powers that it will already have under the Bill, I am unable to accept the hon. Member’s amendment, and I hope she will therefore withdraw it.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I would have preferred it had the Minister been slightly more explicit that habit-forming features are harmful. That would have been more helpful.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I will say that habit-forming features can be harmful.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I thank the Minister. Absolutely—they are not always harmful. With that clarification, I am happy to beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I beg to move amendment 1, in clause 11, page 10, line 22, leave out

“, or another means of age assurance”.

This amendment omits words which are no longer necessary in subsection (3)(a) of clause 11 because they are dealt with by the new subsection inserted by Amendment 3.

--- Later in debate ---
None Portrait The Chair
- Hansard -

The Minister leapt to his feet before I had the opportunity to call any other Members. I call Kirsty Blackman.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Thank you, Sir Roger. It was helpful to hear the Minister’s clarification of age assurance and age verification, and it was useful for him to put on the record the difference between the two.

I have a couple of points. On Ofcom keeping up to date with the types of age verification and the processes involved: new ones will come through, and excellent new methods will appear in coming years. I welcome the Minister’s suggestion that Ofcom will keep up to date with that, because it is incredibly important that we do not rely on, say, the one provider that there is currently, when really good methods could come out. We need the legislation to ensure that we get the best possible service and the best possible verification to keep children away from content that is inappropriate for them.

This is one of the most important parts of the Bill for ensuring that we can continue to have adult sections of the internet—places where there is content that would be disturbing for children, as well as for some adults—and that an age-verification system is in place to ensure that that content can continue to be there. Websites that require a subscription, such as OnlyFans, need to continue to have in place the age-verification systems that they currently have. By writing into legislation the requirement for them to continue to have such systems in place, we can ensure that children cannot access such services but adults can continue to do so. This is not about what is banned online or about trying to make sure that this content does not exist anywhere; it is specifically about gatekeeping to ensure that no child, as far as we can possibly manage, can access content that is inappropriate for kids.

There was a briefing recently on children’s access to pornography, and we heard horrendous stories. It is horrendous that a significant number of children have seen inappropriate content online, and the damage that that has caused to so many young people cannot be overstated. Blocking access to adult parts of the internet is so important for the next generation, not just so that children are not disturbed by the content they see, but so that they learn that it is not okay and normal and understand that the depictions of relationships in pornography are not the way reality works, not the way reality should work and not how women should be treated. Having a situation in which Ofcom or anybody else is better able to take action to ensure that adult content is specifically accessed only by adults is really important for the protection of children and for protecting the next generation and their attitudes, particularly towards sex and relationships.

Rachel Maclean Portrait Rachel Maclean (Redditch) (Con)
- Hansard - - - Excerpts

I wish to add some brief words in support of the Government’s proposals and to build on the comments from Members of all parties.

We know that access to extreme and abusive pornography is a direct factor in violence against women and girls. We see that play out in the court system every day: people on trial claim to have watched and become addicted to this type of pornography and to have sought to play it out in their relationships, which has resulted in the deaths of women. The platforms already have technology that allows them to figure out the age of people on their platforms. The Bill seeks to ensure that they use that for a good end, so I thoroughly support it. I thank the Minister.

Damian Collins Portrait Damian Collins (Folkestone and Hythe) (Con)
- Hansard - - - Excerpts

There are two very important and distinct issues here. One is age verification. The platforms ask adults who have identification to verify their age; if they cannot verify their age, they cannot access the service. Platforms have a choice within that. They can design their service so that it does not have adult content, in which case they may not need to build in verification systems—the platform polices itself. However, a platform such as Twitter, which allows adult content on an app that is open to children, has to build in those systems. As the hon. Member for Aberdeen North mentioned, people will also have to verify their identity to access a service such as OnlyFans, which is an adult-only service.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

On that specific point, I searched on Twitter for the name—first name and surname—of a politician to see what people had been saying, because I knew that he was in the news. The pictures that I saw! That was purely by searching for the name of the politician; it is not as though people are necessarily seeking such stuff out.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

On these platforms, the age verification requirements are clear: they must age-gate the adult content or get rid of it. They must do one or the other. Rightly, the Bill does not specify technologies. Technologies are available. The point is that a company must demonstrate that it is using an existing and available technology or that it has some other policy in place to remedy the issue. It has a choice, but it cannot do nothing. It cannot say that it does not have a policy on it.

Age assurance is always more difficult for children, because they do not have the same sort of ID that adults have. However, technologies exist: for instance, Yoti uses facial scanning. Companies do not have to do that either; they have to demonstrate that they do something beyond self-certification at the point of signing up. That is right. Companies may also demonstrate what they do to take robust action to close the accounts of children they have identified on their platforms.

If a company’s terms of service state that people must be 13 or over to use the platform, the company is inherently stating that the platform is not safe for someone under 13. What does it do to identify people who sign up? What does it do to identify people once they are on the platform, and what action does it then take? The Bill gives Ofcom the powers to understand those things and to force a change of behaviour and action. That is why—to the point made by the hon. Member for Pontypridd—age assurance is a slightly broader term, but companies can still extract a lot of information to determine the likely age of a child and take the appropriate action.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I think we are all in agreement, and I hope that the Committee will accept the amendments.

Amendment 1 agreed to.

Amendments made: 2, in clause 11, page 10, line 25, leave out

“(for example, by using age assurance)”.

This amendment omits words which are no longer necessary in subsection (3)(b) of clause 11 because they are dealt with by the new subsection inserted by Amendment 3.

Amendment 3, in clause 11, page 10, line 26, at end insert—

“(3A) Age assurance to identify who is a child user or which age group a child user is in is an example of a measure which may be taken or used (among others) for the purpose of compliance with a duty set out in subsection (2) or (3).”—(Paul Scully.)

This amendment makes it clear that age assurance measures may be used to comply with duties in clause 11(2) as well as (3) (safety duties protecting children).

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I beg to move amendment 99, in clause 11, page 10, line 34, leave out paragraph (d) and insert—

“(d) policies on user access to the service, parts of the service, or to particular content present on the service, including blocking users from accessing the service, parts of the service, or particular content,”.

This amendment is intended to make clear that if it is proportionate to do so, services should have policies that include blocking access to parts of a service, rather than just the entire service or particular content on the service.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 96, in clause 11, page 10, line 41, at end insert—

“(i) reducing or removing a user’s access to private messaging features”.

This amendment is intended to explicitly include removing or reducing access to private messaging features in the list of areas where proportionate measures can be taken to protect children.

Amendment 97, in clause 11, page 10, line 41, at end insert—

“(i) reducing or removing a user’s access to livestreaming features”.

This amendment is intended to explicitly include removing or reducing access to livestreaming features in the list of areas where proportionate measures can be taken to protect children.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I am glad that the three amendments are grouped, because they link together nicely. I am concerned that clause 11(4)(d) does not do exactly what the Government intend it to. It refers to

“policies on user access to the service or to particular content present on the service, including blocking users from accessing the service or particular content”.

There is a difference between content and parts of the service. It would be possible to block users from accessing some of the things that we have been talking about—for example, eating disorder content—on the basis of clause 11(4)(d). A platform would be able to take that action, provided that it had the architecture in place. However, on my reading, I do not think it would be possible to block a user from accessing, for example, private messaging or livestreaming features. Clause 11(4)(d) would allow a platform to block certain content, or access to the service, but it would not explicitly allow it to block users from using part of the service.

Let us think about platforms such as Discord and Roblox. I have an awful lot of issues with Roblox, but it can be a pretty fun place for people to spend time. However, if a child, or an adult, is inappropriately using its private messaging features, or somebody on Instagram is using the livestreaming features, there are massive potential risks of harm. Massive harm is happening on such platforms. That is not to say that Instagram is necessarily inherently harmful, but if it could block a child’s access to livestreaming features, that could have a massive impact in protecting them.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

Does the hon. Lady accept that the amendments would give people control over the bit of the service that they do not currently have control of? A user can choose what to search for and which members to engage with, and can block people. What they cannot do is stop the recommendation feeds recommending things to them. The shields intervene there, which gives user protection, enabling them to say, “I don’t want this sort of content recommended to me. On other things, I can either not search for them, or I can block and report offensive users.” Does she accept that that is what the amendment achieves?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I think that that is what the clause achieves, rather than the amendments that I have tabled. I recognise that the clause achieves that, and I have no concerns about it. It is good that the clause does that; my concern is that it does not take the second step of blocking access to certain features on the platform. For example, somebody could be having a great time on Instagram looking at various people’s pictures or whatever, but they may not want to be bombarded with private messages. They have no ability to turn off the private messaging section.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

They can disengage from the user who is sending the messages. On a Meta platform, often those messages will be from someone they are following or engaging with. They can block them, and the platforms have the ability, in most in-app messaging services, to see whether somebody is sending material containing priority illegal content to other users. They can scan for that and mitigate that as well.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

That is exactly why users should be able to block private messaging in general. Someone on Twitter can say, “I’m not going to receive a direct message from anybody I don’t follow.” Twitter users have the opportunity to do that, but there is not necessarily that opportunity on all platforms. We are asking for those things to be included, so that the provider can say, “You’re using private messaging inappropriately. Therefore, we are blocking all your access to private messaging,” or, “You are being harmed as a result of accessing private messaging. Therefore, we are blocking your access to any private messaging. You can still see pictures on Instagram, but you can no longer receive any private messages, because we are blocking your access to that part of the site.” That is very different from blocking a user’s access to certain kinds of content, for example. I agree that that should happen, but it is about the functionalities and stopping access to some of them.

We are not asking Ofcom to mandate that platforms take this measure; they could still take the slightly more nuclear option of banning somebody entirely from their service. However, if this option is included, we could say, “Your service is doing pretty well, but we know there is an issue with private messaging. Could you please take action to ensure that those people who are using private messaging to harm children no longer have access to private messaging and are no longer able to use the part of the service that enables them to do these things?” Somebody might be doing a great job of making games in Roblox, but they may be saying inappropriate things. It may be proportionate to block that person entirely, but it may be more proportionate to block their access to voice chat, so that they can no longer say those things, or direct message or contact anybody. It is about proportionality and recognising that the service is not necessarily inherently harmful but that specific parts of it could be.

Sarah Owen Portrait Sarah Owen
- Hansard - - - Excerpts

The hon. Member is making fantastic, salient points. The damage with private messaging is around phishing, as well as seeing a really harmful message and not being able to unsee it. Would she agree that it is about protecting the victim, not putting the onus on the victim to disengage from such conversations?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I completely agree. The hon. Member put that much better than I could. I was trying to formulate that point in my head, but had not quite got there, so I appreciate her intervention. She is right: we should not put the onus on a victim to deal with a situation. Once they have seen a message from someone, they can absolutely block that person, but that person could create another account and send them messages again. People should be able to choose, and to say, “No, I don’t want anyone to be able to send me private messages,” or “I don’t want any private messages from anyone I don’t know.” We could put in those safeguards.

I am talking about adding another layer to the clause, so that companies would not necessarily have to demonstrate that it was proportionate to ban a person from using their service, as that may be too high a bar—a concern I will come to later. They could, however, demonstrate that it was proportionate to ban a person from using private messaging services, or from accessing livestreaming features. There has been a massive increase in self-generated child sexual abuse images, and a huge amount has come from livestreaming. There are massive risks with livestreaming features on services.

Livestreaming is not always bad. Someone could livestream themselves showing how to make pancakes. There is no issue with that—that is grand—but livestreaming is being used by bad actors to manipulate children into sharing videos of themselves, and once they are on the internet, they are there forever. It cannot be undone. If we were able to ban vulnerable users—my preferred option would be all children—from accessing livestreaming services, they would be much safer.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

The hon. Lady is talking about extremely serious matters. My expectation is that Ofcom would look at all of a platform’s features when risk-assessing the platform and enforcing safety, and in-app messaging services would not be exempt. Platforms have to demonstrate what they would do to mitigate harmful and abusive behaviour, and that they would take action against the accounts responsible.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Absolutely, I agree, but the problem is with the way the Bill is written. It does not suggest that a platform could stop somebody accessing a certain part of a service. The Bill refers to content, and to the service as a whole, but it does not have that middle point that I am talking about.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

A platform is required to demonstrate to Ofcom what it would do to mitigate activity that would breach the safety duties. It could do that through a feature that it builds in, or it may take a more draconian stance and say, “Rather than turning off certain features, we will just suspend the account altogether.” That could be discussed in the risk assessments, and agreed in the codes of practice.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

What I am saying is that the clause does not actually allow that middle step. It does not explicitly say that somebody could be stopped from accessing private messaging. The only options are being banned from certain content, or being banned from the entire platform.

I absolutely recognise the hard work that Ofcom has done, and I recognise that it will work very hard to ensure that risks are mitigated, but the amendment ensures what the Minister intended with this legislation. I am not convinced that he intended there to be just the two options that I outlined. I think he intended something more in line with what I am suggesting in the amendment. It would be very helpful if the Minister explicitly said something in this Committee that makes it clear that Ofcom has the power to say to platforms, “Your risk assessment says that there is a real risk from private messaging”—or from livestreaming—“so why don’t you turn that off for all users under 18?” Ofcom should be able to do that.

Could the Minister be clear that that is the direction of travel he is hoping and intending that Ofcom will take? If he could be clear on that, and will recognise that the clause could have been slightly better written to ensure Ofcom had that power, I would be quite happy to not push the amendment to a vote. Will the Minister be clear about the direction he hopes will be taken?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I rise to support my SNP colleagues’ amendments 99, 96 and 97, just as I supported amendment 98. The amendments are sensible and will ensure that service providers are empowered to take action to mitigate harms done through their services. In particular, we support amendment 99, which makes it clear that a service should be required to have the tools available to allow it to block access to parts of its service, if that is proportionate.

Amendments 96 and 97 would ensure that private messaging and livestreaming features were brought into scope, and that platforms and services could block access to them when that was proportionate, with the aim of protecting children, which is the ultimate aim of the Bill. Those are incredibly important points to raise.

In previous iterations of the Bill Committee, Labour too tabled a number of amendments to do with platforms’ responsibilities for livestreaming. I expressed concerns about how easy it is for platforms to host live content, and about how ready they were to screen that content for harm, illegal or not. I am therefore pleased to support our SNP colleagues. The amendments are sensible, will empower platforms and will keep children safe.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

If someone on a PlayStation wants to play online games, they must sign up to PlayStation Plus—that is how the model works. Once they pay that subscription, they can access online games and play Fortnite or Rocket League or whatever they want online. They then also have access to a suite of communication features; they can private message people. It would be disproportionate to ban somebody from playing any PlayStation game online in order to stop them from sending inappropriate private messages. I do not want PlayStation to be in a position where it cannot act against somebody because banning them would be disproportionate, yet it also cannot switch off their direct messaging features because the clause does not allow it that flexibility. A person could continue to be in danger on the PlayStation platform as a result of private communications that they could receive. That is one example of how the provision would be key and important.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Again, the Government recognise the intent behind amendment 99, which, as the hon. Member for Aberdeen North said, would require providers to be able to block children’s access to parts of a service, rather than the entire service. I very much get that. We recognise the nature and scale of the harm that can be caused to children through livestreaming and private messaging, as has been outlined, but the Bill already delivers what is intended by these amendments. Clause 11(4) sets out examples of areas in which providers will need to take measures, if proportionate, to meet the child safety duties. It is not an exhaustive list of every measure that a provider might be required to take. It would not be feasible or practical to list every type of measure that a provider could take to protect children from harm, because such a list could become out of date quickly as new technologies emerge, as the hon. Lady outlined with her PlayStation example.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I have a concern. The Minister’s phrasing was “to block children’s access”. Surely some of the issues would be around blocking adults’ access, because they are the ones causing risk to the children. From my reading of the clause, it does not suggest that the action could be taken only against child users; it could be taken against any user in order to protect children.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I will come to that in a second. The hon. Member for Luton North talked about putting the onus on the victim. Any element of choice is there for adults; the children will be protected anyway, as I will outline in a second. We all agree that the primary purpose of the Bill is to be a children’s protection measure.

Ofcom will set out in codes of practice the specific steps that providers can take to protect children who are using their service, and the Government expect those to include steps relating to children’s access to high-risk features, such as livestreaming or private messaging. Clause 11(4)(d) sets out that providers may be required to take measures in the following areas:

“policies on user access to the service or to particular content present on the service, including blocking users from accessing the service or particular content”.

The other areas listed are intentionally broad categories that allow for providers to take specific measures. For example, a measure in the area of blocking user access to particular content could include specific measures that restrict children’s access to parts of a service, if that is a proportionate way to stop users accessing that type of content. It can also apply to any of the features of a service that enable children to access particular content, and could therefore include children’s access to livestreaming and private messaging features. In addition, the child safety duties make it clear that providers need to use proportionate systems and processes that prevent children from encountering primary priority content that is harmful to them, and protect children and age groups at risk of harm from other content that is harmful to them.

While Ofcom will set out in codes of practice the steps that providers can take to meet these duties, we expect those steps, as we have heard, to include the use of age verification to prevent children accessing content that poses the greatest risk of harm to them. To meet that duty, providers may use measures that restrict children from accessing parts of the service. The Bill therefore allows Ofcom to require providers to take that step where it is proportionate. I hope that that satisfies the hon. Member for Aberdeen North, and gives her the direction that she asked for—that is, a direction to be more specific that Ofcom does indeed have the powers that she seeks.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The ministerial direction that the various platforms are receiving from the Dispatch Box, from our conversations with them and from the Bill’s progress as it goes through the House of Lords will be helpful to them. We do not expect providers to wait until the very last minute to implement the measures. They are starting to do so now, but we want them to go further, quicker.

Government amendment 4 will require providers who already have a minimum age requirement for access to their service, or parts of it, to give details of the measures that they use to restrict access in their terms of service and apply them consistently. Providers will also need to provide age-appropriate protections for children using their service. That includes protecting children from harmful content and activity on their service, as well as reviewing children’s use of higher-risk features, as I have said.

To meet the child safety risk assessment duties in clause 10, providers must assess: the risk of harm to children from functionalities that facilitate the presence or dissemination of harmful content; the level of risk from different kinds of harmful content, giving separate consideration to children in different age groups; the different ways in which the service is used, and the impact of such use on the level of risk of harm; and how the design and operation of the service may increase the risks identified.

The child safety duties in clause 11 apply across all areas of the service, including the way it is operated and used by children, as well as the content present on the service. For the reasons I have set out, I am not able to accept the amendments, but I hope that the hon. Member for Aberdeen North will take on board my assurances.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

That was quite helpful. I am slightly concerned about the Minister’s focus on reducing children’s access to the service or to parts of it. I appreciate that is part of what the clause is intended to do, but I would also expect platforms to be able to reduce the ability of adults to access parts of the service or content in order to protect children. Rather than just blocking children, blocking adults from accessing some features—whether that is certain adults or adults as a group—would indeed protect children. My reading of clause 11(4) was that users could be prevented from accessing some of this stuff, rather than just child users. Although the Minister has given me more questions, I do not intend to push the amendment to a vote.

May I ask a question of you, Sir Roger? I have not spoken about clause stand part. Are we still planning to have a clause stand part debate?

None Portrait The Chair
- Hansard -

No.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Thank you, Sir Roger; I appreciate the clarification. When I talk about Government amendment 4, I will also talk about clause stand part. I withdraw the amendment.

None Portrait The Chair
- Hansard -

That is up to the Committee.

Amendment, by leave, withdrawn.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Although the previous version of the Bill already focused on protecting children, as I have said, the Government are clear that it must do more to achieve that and to ensure that requirements for providers are as clear as possible. That is why we are making changes to strengthen the Bill. Amendments 4 and 5 will require providers who already have a minimum age requirement for access to their service, or parts of it, to give details in their terms of service of the measures that they use to ensure that children below the minimum age are prevented from gaining access. Those terms must be applied consistently and be clear and accessible to users. The change will mean that providers can be held to account for what they say in their terms of service, and can no longer simply do nothing to prevent underage access.

The Government recognise the intent behind amendment 100, which is to ensure that terms of service are clear and accessible for child users, but the Bill as drafted sets an appropriate standard for terms of service. The duty in clause 11(8) sets an objective standard for terms of service to be clear and accessible, rather than requiring them to be clear for particular users. Ofcom will produce codes of practice setting out how providers can meet that duty, which may include provisions about how to tailor the terms of service to the user base where appropriate.

The amendment would have the unintended consequence of limiting to children the current accessibility requirement for terms of service. As a result, any complicated and detailed information that would not be accessible for children—for example, how the provider uses proactive technology—would probably need to be left out of the terms of service, which would clearly conflict with the duty in clause 11(7) and other duties relating to the terms of service. It is more appropriate to have an objective standard of “clear and accessible” so that the terms of service can be tailored to provide the necessary level of information and be useful to other users such as parents and guardians, who are most likely to be able to engage with the more detailed information included in the terms of service and are involved in monitoring children’s online activities.

Ofcom will set out steps that providers can take to meet the duty and will have a tough suite of enforcement powers to take action against companies that do not meet their child safety duties, including if their terms of service are not clear and accessible. For the reasons I have set out, I am not able to accept the amendment tabled by the hon. Member for Aberdeen North and I hope she will withdraw it.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

As I said, I will also talk about clause 11. I can understand why the Government are moving their amendments. It makes sense, particularly with things like complying with the provisions. I have had concerns all the way along—particularly acute now as we are back in Committee with a slightly different Bill from the one that we were first presented with—about the reliance on terms of service. There is a major issue with choosing to go down that route, given that providers of services can choose what to put in their terms of service. They can choose to have very good terms of service that commit them to taking action on anything that is potentially an issue, and that are strong enough to allow them to apply proportionate measures and ban users who break those terms. Providers will have the ability to write terms of service like that, but not all providers will choose to do so. Not all providers will choose to write the gold-standard terms of service that the Minister expects everybody will write.

We have to remember that these companies’ and organisations’ No. 1 aim is not to protect children. If their No. 1 aim was to protect children, we would not be here. We would not need an Online Safety Bill because they would be putting protection front and centre of every decision they make. Their No. 1 aim is to increase the number of users so that they can get more money. That is the aim. They are companies that have a duty to their shareholders. They are trying to make money. That is the intention. They will not therefore necessarily draw up the best possible terms of service.

I heard an argument on Report that market forces will mean that companies that do not have strong enough terms of service, companies that have inherent risks in their platforms, will just not be used by people. If that were true, we would not be in the current situation. Instead, the platforms that are damaging people and causing harm—4chan, KiwiFarms or any of those places that cause horrendous difficulties—would not be used by people because market forces would have intervened. That approach does not work; it does not happen that the market will regulate itself and people will stay away from places that cause them or others harm. That is not how it works. I am concerned about the reliance on terms of service and requiring companies to stick to their own terms of service. They might stick to their own terms of service, but those terms of service might be utterly rubbish and might not protect people. Companies might not have in place what we need to ensure that children and adults are protected online.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Does the hon. Lady agree that people out there in the real world have absolutely no idea what a platform’s terms of service are, so we are being expected to make a judgment on something about which we have absolutely no knowledge?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Absolutely. The amendment I tabled regarding the accessibility of terms of service was designed to ensure that if the Government rely on terms of service, children can access those terms of service and are able to see what risks they are putting themselves at. We know that in reality children will not read these things. Adults do not read these things. I do not know what Twitter’s terms of service say, but I do know that Twitter managed to change its terms of service overnight, very easily and quickly. Companies could just say, “I’m a bit fed up with Ofcom breathing down my neck on this. I’m just going to change my terms of service, so that Ofcom will not take action on some of the egregious harm that has been done. If we just change our terms of service, we don’t need to bother. If we say that we are not going to ban transphobia on our platform—if we take that out of the terms of service—we do not need to worry about transphobia on our platform. We can just let it happen, because it is not in our terms of service.”

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

Does the hon. Lady agree that the Government are not relying solely on terms of service, but are rightly saying, “If you say in your terms of service that this is what you will do, Ofcom will make sure that you do it”? Ofcom will take on that responsibility for people, making sure that these complex terms of service are understood and enforced, but the companies still have to meet all the priority illegal harms objectives that are set out in the legislation. Offences that exist in law are still enforced on platforms, and risk-assessed by Ofcom as well, so if a company does not have a policy on race hate, we have a law on race hate, and that will apply.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

It is absolutely the case that those companies still have to do a risk assessment, and a child risk assessment if they meet the relevant criteria. The largest platforms, for example, will still have to do a significant amount of work on risk assessments. However, every time a Minister stands up and talks about what they are requiring platforms and companies to do, they say, “Companies must stick to their terms of service. They must ensure that they enforce things in line with their terms of service.” If a company is finding it too difficult, it will just take the tough things out of their terms of service. It will take out transphobia, it will take out abuse. Twitter does not ban anyone for abuse anyway, it seems, but it will be easier for Twitter to say, “Ofcom is going to try to hold us for account for the fact that we are not getting rid of people for abusive but not illegal messages, even though we say in our terms of service, ‘You must act with respect’, or ‘You must not abuse other users’. We will just take that out of our terms of service so that we are not held to account for the fact that we are not following our terms of service.” Then, because the abuse is not illegal—because it does not meet that bar—those places will end up being even less safe than they are right now.

For example, occasionally Twitter does act in line with its terms of service, which is quite nice: it does ban people who are behaving inappropriately, but not necessarily illegally, on its platform. However, if it is required to implement that across the board for everybody, it will be far easier for Twitter to say, “We’ve sacked all our moderators—we do not have enough people to be able to do this job—so we will just take it all out of the terms of service. The terms of service will say, ‘We will ban people for sharing illegal content, full stop.’” We will end up in a worse situation than we are currently in, so the reliance on terms of service causes me a big, big problem.

Turning to amendment 100, dealing specifically with the accessibility of this feature for child users, I appreciate the ministerial clarification, and agree that my amendment could have been better worded and potentially causes some problems. However, can the Minister talk more about the level of accessibility? I would like children to be able to see a version of the terms of service that is age-appropriate, so that they understand what is expected of them and others on the platform, and understand when and how they can make a report and how that report will be acted on. The kids who are using Discord, TikTok or YouTube are over 13—well, some of them are—so they are able to read and understand, and they want to know how to make reports and for the reporting functions to be there. One of the biggest complaints we hear from kids is that they do not know how to report things they see that are disturbing.

A requirement for children to have an understanding of how reporting functions work, particularly on social media platforms where people are interacting with each other, and of the behaviour that is expected of them, does not mean that there cannot be a more in-depth and detailed version of the terms of service, laying out potential punishments using language that children may not be able to understand. The amendment would specifically ensure that children have an understanding of that.

We want children to have a great time on the internet. There are so many ace things out there and wonderful places they can access. Lego has been in touch, for example; its website is really pretty cool. We want kids to be able to access that stuff and communicate with their friends, but we also want them to have access to features that allow them to make reports that will keep them safe. If children are making reports, then platforms will say, “Actually, there is real problem with this because we are getting loads of reports about it.” They will then be able to take action. They will be able to have proper risk assessments in place because they will be able to understand what is disturbing people and what is causing the problems.

I am glad to hear the Minister’s words. If he were even more clear about the fact that he would expect children to be able to understand and access information about keeping themselves safe on the platforms, then that would be even more helpful.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

On terms and conditions, it is clearly best practice to have a different level of explanation that ensures children can fully understand what they are getting into. The hon. Lady talked about the fact that children do not know how to report harm. Frankly, judging by a lot of conversations we have had in our debates, we do not know how to report harm because it is not transparent. On a number of platforms, how to do that is very opaque.

A wider aim of the Bill is to make sure that platforms have better reporting patterns. I encourage platforms to do exactly what the hon. Member for Aberdeen North says to engage children, and to engage parents. Parents are well placed to engage with reporting and it is important that we do not forget parenting in the equation of how Government and platforms are acting. I hope that is clear to the hon. Lady. We are mainly relying on terms and conditions for adults, but the Bill imposes a wider set of protections for children on the platforms.

Amendment 4 agreed to.

Amendment made: 5, in clause 11, page 11, line 15, after “(5)” insert “, (6A)”.—(Paul Scully.)

This amendment ensures that the duty in clause 11(8) to have clear and accessible terms of service applies to the terms of service mentioned in the new subsection inserted by Amendment 4.

Clause 11, as amended, ordered to stand part of the Bill.

Clause 12

Adults’ risk assessment duties

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

My hon. Friend makes a valid point. This is not just about misinformation and disinformation; it is about leading people to really extreme, vile content on the internet. As we all know, that is a rabbit warren. A 17-year-old does not become exempt from seeing this horrendous content the moment they turn 18 on their 18th birthday. The rules need to be there to protect all of us.

As we have heard, terms and conditions can change overnight. Stakeholders have raised the concern that, if faced with a clearer focus on their terms of service, platforms and providers may choose to make their terms of service shorter, cutting out mention of harmful material for which, if it were left undealt with, they might be held liable.

In addition, the fact that there is no minimum requirement in the regime means that companies have complete freedom to set terms of service for adults, which may not reflect the risks to adults on that service. At present, service providers do not even have to include terms of service in relation to the list of harmful content proposed by the Government for the user empowerment duties—an area we will come on to in more detail shortly as we address clause 14. The Government’s overreliance on terms of service, which as we know can be so susceptible to rapid change, is the wrong approach. For that reason, we cannot support these amendments.

I would just say, finally, that none of us was happy with the term “legal but harmful”. It was a phrase we all disliked, and it did not encapsulate exactly what the content is or includes. Throwing the baby out with the bathwater is not the way to tackle that situation. My hon. Friend the Member for Batley and Spen is right that this is a tricky area, and it is difficult to get it right. We need to protect free speech, which is sacrosanct, but we also need to recognise that there are so many users on the internet who do not have access to free speech as a result of being piled on or shouted down. Their free speech needs to be protected too. We believe that the clauses as they stand in the Bill go some way to making the Bill a meaningful piece of legislation. I urge the Minister not to strip them out, to do the right thing and to keep them in the Bill.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Throughout the consideration of the Bill, I have been clear that I do not want it to end up simply being the “keep MPs safe on Twitter” Bill. That is not what it should be about. I did not mean that we should therefore take out everything that protects adults; what I meant was that we need to have a big focus on protecting children in the Bill, which thankfully we still do. For all our concerns about the issues and inadequacies of the Bill, it will go some way to providing better protections for children online. But saying that it should not be the “keep MPs safe on Twitter” Bill does not mean that it should not keep MPs safe on Twitter.

I understand how we have got to this situation. What I cannot understand is the Minister’s being willing to stand up there and say, “We can’t have these clauses because they are a risk to freedom of speech.” Why are they in the Bill in the first place if they are such a big risk to freedom of speech? If the Government’s No. 1 priority is making sure that we do not have these clauses, why did they put them in it? Why did it go through pre-legislative scrutiny? Why were they in the draft Bill? Why were they in the Bill? Why did they agree with them in Committee? Why did they agree with them on Report? Why have we ended up in a situation where, suddenly, there is a massive epiphany that they are a threat to freedom of speech and therefore we cannot possibly have them?

What is it that people want to say that they will be banned from saying as a result of this Bill? What is it that freedom of speech campaigners are so desperate to say online? Do they want to promote self-harm on platforms? Is that what people want to do? Is that what freedom of speech campaigners are out for? They are now allowed to do that as a result of the Bill.

Nick Fletcher Portrait Nick Fletcher (Don Valley) (Con)
- Hansard - - - Excerpts

I believe that the triple shield being put in is in place of “legal but harmful”. That will enable users to put a layer of protection in so they can actually take control. But the illegal content still has to be taken down: anything that promotes self-harm is illegal content and would still have to be removed. The problem with the way it was before is that we had a Secretary of State telling us what could be said out there and what could not. What may offend the hon. Lady may not offend me, and vice versa. We have to be very careful of that. It is so important that we protect free speech. We are now giving control to each individual who uses the internet.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

The promotion of self-harm is not illegal content; people are now able to do that online—congratulations, great! The promotion of incel culture is not illegal content, so this Bill will now allow people to do that online. It will allow terms of service that do not require people to be banned for promoting incel culture, self-harm, not wearing masks and not getting a covid vaccine. It will allow the platforms to allow people to say these things. That is what has been achieved by campaigners.

The Bill is making people less safe online. We will continue to have the same problems that we have with people being driven to suicide and radicalised online as a result of the changes being made in this Bill. I know the Government have been leaned on heavily by the free speech lobby. I still do not know what people want to say that they cannot say as a result of the Bill as it stands. I do not know. I cannot imagine that anybody is not offended by content online that drives people to hurt themselves. I cannot imagine anybody being okay and happy with that. Certainly, I imagine that nobody in this room is okay and happy with that.

These people have won this war over the supposed attack on free speech. They have won a situation where they are able to promote misogynistic, incel culture and health disinformation, where they are able to say that the covid vaccine is entirely about putting microchips in people. People are allowed to say that now—great! That is what has been achieved, and it is a societal issue. We have a generational issue where people online are being exposed to harmful content. That will now continue.

It is not just a generational societal thing—it is not just an issue for society as a whole that these conspiracy theories are pervading. Some of the conspiracy theories around antisemitism are unbelievably horrific, but they do not step over into illegality. Under the previous version of the Bill, David Icke would not have been able to stand up and suggest that the world is run by lizard people—who happen to be Jewish—because that would have been considered harmful content. Now he is allowed to say it, because this Bill refuses to take action on it.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

Can the hon. Lady tell me where in the Bill, as it is currently drafted—so, unamended—it requires platforms to remove legal speech?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

It allows the platforms to do that, and it requires legal but harmful content to be taken into account. It requires the platforms to act—to consider, through risk assessments, the harm done to adults by content that is legal but massively harmful.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

The hon. Lady is right: the Bill does not require the removal of legal speech. Platforms must take the issue into account—it can be risk assessed—but it is ultimately their decision. I think the point has been massively overstated that, somehow, previously, Ofcom had the power to strike down legal but harmful speech that was not a breach of either terms of service or the law. It never had that power.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Why do the Government now think that there is a risk to free speech? If Ofcom never had that power, if it was never an issue, why are the Government bothered about that risk—it clearly was not a risk—to free speech? If that was never a consideration, it obviously was not a risk to free speech, so I am now even more confused as to why the Government have decided that they will have to strip this measure out of the Bill because of the risk to free speech, because clearly it was not a risk in this situation. This is some of the most important stuff in the Bill for the protection of adults, and the Government are keen to remove it.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

The hon. Member is making an excellent and very passionate speech, and I commend her for that. Would she agree with one of my concerns, which is about the message that this sends to the public? It is almost that the Government were acknowledging that there was a problem with legal but harmful content—we can all, hopefully, acknowledge that that is a problem, even though we know it is a tricky one to tackle—but, by removing these clauses from the Bill, are now sending the message that, “We were trying to clean up the wild west of the internet, but, actually, we are not that bothered anymore.”

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

The hon. Lady is absolutely right. We have all heard from organisations and individuals who have had their lives destroyed as a result of “legal but harmful”—I don’t have a better phrase for it—content online and of being radicalised by being driven deeper and deeper into blacker and blacker Discord servers, for example, that are getting further and further right wing.

A number of the people who are radicalised—who are committing terror attacks, or who are being referred to the Prevent programme because they are at risk of committing them—are no longer at the far-right end of extremism, or driven by extreme religious beliefs, but have mixed-up or unclear ideological drivers. It is not the same situation as it was before, because people are being radicalised by the stuff that they find online. They are being radicalised into situations where they “must do something”—they “must take some action”—because of the culture change in society.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The hon. Member is making a powerful point. Just a few weeks ago, I asked the Secretary of State for Digital, Culture, Media and Sport, at the Dispatch Box, whether the horrendous and horrific content that led a man to shoot and kill five people in Keyham—in the constituency of my hon. Friend the Member for Plymouth, Sutton and Devonport (Luke Pollard)—would be allowed to remain and perpetuate online as a result of the removal of these clauses from the Bill. I did not get a substantial answer then, but we all know that the answer is yes.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

That is the thing: this Bill is supposed to be the Online Safety Bill. It is supposed to be about protecting people from the harm that can be done to them by others. It is also supposed to be about protecting people from that radicalisation and the harm that they can end up in. It is supposed to make a difference. It is supposed to be a game changer and a world leader.

Although, absolutely, I recognise the importance of the child-safety duties in the clauses and the change that that will have, when people turn 18 they do not suddenly become different humans. They do not wake up on their 18th birthday as a different person from the one that they were before. They should not have to go from that level of protection, prior to 18, to being immediately exposed to comments and content encouraging them to self-harm, and to all of the negative things that we know are present online.

Nick Fletcher Portrait Nick Fletcher
- Hansard - - - Excerpts

I understand some of the arguments the hon. Lady is making, but that is a poor argument given that the day people turn 17 they can learn to drive or the day they turn 16 they can do something else. There are lots of these things, but we have to draw a line in the sand somewhere. Eighteen is when people become adults. If we do not like that, we can change the age, but there has to be a line in the sand. I agree with much of what the hon. Lady is saying, but that is a poor argument. I am sorry, but it is.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I do not disagree that overnight changes are involved, but the problem is that we are going from a certain level of protection to nothing; there will be a drastic, dramatic shift. We will end up with any vulnerable person who is over 18 being potentially subject to all this content online.

I still do not understand what people think they will have won as a result of having the provisions removed from the Bill. I do not understand how people can say, “This is now a substantially better Bill, and we are much freer and better off as a result of the changes.” That is not the case; removing the provisions will mean the internet continuing to be unsafe—much more unsafe than it would have been under the previous iteration of the Bill. It will ensure that more people are harmed as a result of online content. It will absolutely—

Nick Fletcher Portrait Nick Fletcher
- Hansard - - - Excerpts

Will the hon. Lady give way?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

No, I will not give way again. The change will ensure that people can absolutely say what they like online, but the damage and harm that it will cause are not balanced by the freedoms that have been won.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

As a Back-Bench Member of Parliament, I recommended that the “legal but harmful” provisions be removed from the Bill. When I chaired the Joint Committee of both Houses of Parliament that scrutinised the draft Bill, it was the unanimous recommendation of the Committee that the “legal but harmful” provisions be removed. As a Minister at the Dispatch Box, I said that I thought “legal but harmful” was a problematic term and we should not use it. The term “legal but harmful” does not exist in the Bill, and has never existed in the Bill, but it has provoked a debate that has caused huge confusion. There is a belief, which we have heard expressed in debate today, that somehow there are categories of content that Ofcom can deem fit for removal, whether they are unlawful or not.

During the Bill’s journey from publication in draft to where we are today, it has become more specific. Rather than our relying on general duties of care, written into the Bill are areas of priority illegal activity that the companies must proactively look for, monitor and mitigate. In the original version of the Bill, that included only terrorist content and child sexual exploitation material, but on the recommendation of the Joint Committee, the Government moved in the direction of writing into the Bill at schedule 7 offences in law that will be the priority illegal offences.

The list of offences is quite wide, and it is more comprehensive than any other such list in the world in specifying exactly what offences are in scope. There is no ambiguity for the platforms as to what offences are in scope. Stalking, harassment and inciting violence, which are all serious offences, as well as the horrible abuse a person might receive as a consequence of their race or religious beliefs, are written into the Bill as priority illegal offences.

There has to be a risk assessment of whether such content exists on platforms and what action platforms should take. They are required to carry out such a risk assessment, although that was never part of the Bill before. The “legal but harmful” provisions in some ways predate that. Changes were made; the offences were written into the Bill, risk assessments were provided for, and Parliament was invited to create new offences and write them into the Bill, if there were categories of content that had not been captured. In some ways, that creates a democratic lock that says, “If we are going to start to regulate areas of speech, what is the legal reason for doing that? Where is the legal threshold? What are the grounds for us taking that decision, if it is something that is not already covered in platforms’ terms of service?”

We are moving in that direction. We have a schedule of offences that we are writing into the Bill, and those priority illegal offences cover most of the most serious behaviour and most of the concerns raised in today’s debate. On top of that, there is a risk assessment of platforms’ terms of service. When we look at the terms of service of the companies—the major platforms we have been discussing—we see that they set a higher bar again than the priority illegal harms. On the whole, platforms do not have policies that say, “We won’t do anything about this illegal activity, race hate, incitement to violence, or promotion or glorification of terrorism.” The problem is that although they have terms of service, they do not enforce them. Therefore, we are not relying on terms of service alone. What we are saying, and what the Bill says, is that the minimum safety standards are based on the offences written into the Bill. In addition, we have risk assessment, and we have enforcement based on the terms of service.

There may be a situation in which there is a category of content that is not in breach of a platform’s terms of service and not included in the priority areas of illegal harm. It is very difficult to think of what that could be—something that is not already covered, and over which Ofcom would not have power. There is the inclusion of the new offences of promoting self-harm and suicide. That captures not just an individual piece of content, but the systematic effect of a teenager like Molly Russell—or an adult of any age—being targeted with such content. There are also new offences for cyber-flashing, and there is Zach’s law, which was discussed in the Chamber on Report. We are creating and writing into the Bill these new priority areas of illegal harm.

Freedom of speech groups’ concern was that the Government could have a secret list of extra things that they also wanted risk-assessed, rather than enforcement being clearly based either on the law or on clear terms of service. It is difficult to think of categories of harm that are not already captured in terms of service or priority areas of illegal harm, and that would be on such a list. I think that is why the change was made. For freedom of speech campaigners, there was a concern about exactly what enforcement was based on: “Is it based on the law? Is it based on terms of service? Or is it based on something else?”

I personally believed that the “legal but harmful” provisions in the Bill, as far as they existed, were not an infringement on free speech, because there was never a requirement to remove legal speech. I do not think the removal of those clauses from the Bill suddenly creates a wild west in which no enforcement will take place at all. There will be very effective enforcement based on the terms of service, and on the schedule 7 offences, which deal with the worst kinds of illegal activity; there is a broad list. The changes make it much clearer to everybody—platforms and users alike, and Ofcom—exactly what the duties are, how they are enforced and what they are based on.

For future regulation, we have to use this framework, so that we can say that when we add new offences to the scope of the legislation, they are offences that have been approved by Parliament and have gone through a proper process, and are a necessary addition because terms of service do not cover them. That is a much clearer and better structure to follow, which is why I support the Government amendments.

--- Later in debate ---
Organisations such as the Community Security Trust and the Antisemitism Policy Trust, which do excellent work in this area, have been very clear that someone’s right to be protected from that sort of content should not end the day they turn 18. Duties should remain on platforms to do risk assessments to protect certain groups of adults who may be at increased risk from such content, in order to protect their freedom of speech and expression.
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

The hon. Member makes a powerful point about the different ways in which people experience things. That tips over into real-life abusive interactions, and goes as far as terrorist incidents in some cases. Does she agree that protecting people’s freedom of expression and safety online also protects people in their real, day-to-day life?

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

I could not agree more. I suppose that is why this aspect of the Bill is so important, not just to me but to all those categories of user. I mentioned paragraphs (d) to (f), which would require platforms to assess exactly that risk. This is not about being offended. Personally, I have the skin of a rhino. People can say most things to me and I am not particularly bothered by it. My concern is where things that are said online are transposed into real-life harms. I will use myself as an example. Online, we can see antisemitic and conspiratorial content, covid misinformation, and covid misinformation that meets with antisemitism and conspiracies. When people decide that I, as a Jewish Member of Parliament, am personally responsible for George Soros putting a 5G chip in their arm, or whatever other nonsense they have become persuaded by on the internet, that is exactly the kind of thing that has meant people coming to my office armed with a knife. The kind of content that they were radicalised by on the internet led to their perpetrating a real-life, in-person harm. Thank God—Baruch Hashem—neither I nor my staff were in the office that day, but that could have ended very differently, because of the sorts of content that the Bill is meant to protect online users from.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

I accept the points that the hon. Member raised, but he is fundamentally missing the point. The categories of information and content that these people had seen and been radicalised by would not fall under the scope of public order offences or harassment. The person was not sending me harassing messages before they turned up at my office. Essentially, social media companies and other online platforms have to take measures to mitigate the risk of categories of offences that are illegal, whether or not they are in the Bill. I am talking about what clauses 12 and 13 covered, whether we call it the “legal but harmful” category or “lawful but awful”. Whatever we name those provisions, by taking out of the Bill clauses relating to the “legal but harmful” category, we are opening up an area of harm that already exists, that has a real-world impact, and that the Bill was meant to go some way towards addressing.

The Government’s changes have taken out the risk assessments that need to be done. The Bill says,

“(e) the level of risk of functionalities of the service facilitating the presence or dissemination of priority content that is harmful to adults, identifying and assessing those functionalities that present higher levels of risk;

(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by adults;

(g) the nature, and severity, of the harm that might be suffered by adults”.

Again, the idea that we are talking about offence, and that the clauses need to be taken out to protect free speech, is fundamentally nonsense.

I have already mentioned holocaust denial, but it is also worth mentioning health-related disinformation. We have already seen real-world harms from some of the covid misinformation online. It led to people including Piers Corbyn turning up outside Parliament with a gallows, threatening to hang hon. Members for treason. Obviously, that was rightly dealt with by the police, but the kind of information and misinformation that he had been getting online and that led him to do that, which is legal but harmful, will now not be covered by the Bill.

I will also raise an issue I have heard about from a number of people dealing with cancer and conditions such as multiple sclerosis. People online try to discourage them from accessing the proper medical interventions for their illnesses, and instead encourage them to take more vitamin B or adopt a vegan diet. There are people who have died because they had cancer but were encouraged online to not access cancer treatment because they were subject to lawful but awful categories of harm.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I wonder if the hon. Member saw the story online about the couple in New Zealand who refused to let their child have a life-saving operation because they could not guarantee that the blood used would not be from vaccinated people? Is the hon. Member similarly concerned that this has caused real-life harm?

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

I am aware of the case that the hon. Member mentioned. I appreciate that I am probably testing the patience of everybody in the Committee Room, but I want to be clear just how abhorrent I find it that these provisions are coming out of the Bill. I am trying to be restrained, measured and reasonably concise, but that is difficult when there are so many parts of the change that I find egregious.

My final point is on self-harm and suicide content. For men under the age of 45, suicide is the biggest killer. In the Bill, we are doing as much as we can to protect young people from that sort of content. My real concern is this: many young people are being protected by the Bill’s provisions relating to children. They are perhaps waiting for support from child and adolescent mental health services, which are massively oversubscribed. The minute they tick over into 18, fall off the CAMHS waiting list and go to the bottom of the adult mental health waiting list—they may have to wait years for treatment of various conditions—there is no requirement or duty on the social media companies and platforms to do risk assessments.

Online Harms

Kirsty Blackman Excerpts
Wednesday 26th October 2022

Westminster Hall

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - -

I thank the right hon. Member for East Hampshire (Damian Hinds) for securing the debate. As he said, it is the right time to have this discussion, as one of the last opportunities to do so before the legislation leaves the House of Commons. He mentioned a number of organisations that have been in touch and have assisted with information. I do not think he mentioned—I apologise if he did—Refuge and Girlguiding, which both do excellent work and have provided an awful lot of useful information, particularly on how women and girls experience the online world. I accept that he could not possibly have covered every organisation in the time that he had to speak.

I apologise to hon. Members for the lack of Scottish National party colleagues here; it is not intentional. Three others were supposed to attend, but for genuinely good reasons that I cannot pass on, they did not, which is why I am the only SNP representative here today.

I want to pass on a comment from my hon. Friend the Member for Glasgow Central (Alison Thewliss), who highlighted to me what happened to St Albert’s Primary School at the beginning of this month or the tail end of last month. The First Minister of Scotland went to visit the school on 30 September to celebrate the work that it was doing on tackling climate change. As a result, the school was subject to horrific racist abuse. Thousands of racist messages were sent to St Albert’s Primary. I want to highlight that, because it is one of the reasons that we need this legislation. That abuse was aimed specifically at children and was genuinely horrific. I urge the Minister to look at that case so that he is aware.

The Bill has been needed for 30 years. It is not just something that we need now; we have needed it for a long time. I am very pleased that the Commons stages are nearly completed. Along with all other voices here, I urge the Government to please let the Bill come back to us so that we can finish our debate on it and it can complete its Commons stages. I feel as though I have spent quite a significant portion of my life dealing with the Bill, but I recognise that that is nothing compared with the hours that many hon. Members, organisations and staff have put in. It has been uppermost in my mind since the commencement of the Bill Committee earlier this year.

The internet is wonderful and brilliant. There are so many cool and exciting things to do on it. There are so many ways in which it makes our lives easier and enables people to communicate with each other. I can be down here and FaceTime my children, which would not have been possible had I been an MP 20 or 30 years ago. Those things are great. It is brilliant for children to be able to access the internet, to access games and to play. It is amazing that there is a new playground for people—one that we did not have 30 years ago—and these are really good things. We need to make sure that the legislation that comes in is permissive and allows those things to continue to happen, but in a way that is safe and that protects children.

Child sexual abuse has been mentioned. I do not want to go into it too much, but for me that is the key thing about the Bill. The Bill largely covers what I would hope it would cover in terms of child sexual abuse. I strenuously resist any suggestion that we need to have total end-to-end encryption that cannot be looked at even if there is suspicion of child sexual abuse, because it is paramount that we protect children and that we are able to catch the perpetrators sharing images.

We have talked about the metaverse and things in the future, but I am still concerned that some of the things that happen today are not adequately covered by the scope of the Bill. I appreciate what the hon. Member for Leeds East (Richard Burgon) said about amendment 159, which is incredibly important. It would allow Ofcom, which is the expert, to classify additional sites that are incredibly harmful as category 1. It would not be down to the Government to say, “We’re adding this one site.” It would be down to Ofcom, the expert, to make those decisions.

Social media is not just Facebook or Twitter. It is not just the way that older adults interact with each other on the internet. It is Fortnite, Discord, Twitch, Snapchat and Roblox. I do not know whether Members heard “File on 4” last night, but it was scathing in its criticism of Roblox and the number of horrific experiences that children are subjected to, on a platform that is supposed to be safe. It is promoted as a safe space for children, and it is absolutely not.

I am still massively concerned about clause 49, which talks about exempting

“one-to-one live aural communications”.

If one-to-one live aural communications are exempted, a one-to-one communication on Discord will be exempt from the legislation and will not count as user-generated content, even though it is user-generated content. I understand why the Government have put that in the Bill—it is about exempting telecoms, and I get that—but they have accidentally exempted a platform that groomers use in order to get children off Roblox, Fortnite or whatever they are playing and on to Discord, where they can have a conversation with those children. I am absolutely clear that clause 49 needs to be sorted so that the things the Government want to be exempted are still exempted, but the things that need to be in scope are in scope.

A point was made about the level of addiction, and the level of harm, that can be caused by algorithms. The idea of having a named person is very smart, and it is something that I would wholeheartedly support. It makes a huge amount of sense to include that in the Bill.

We have had an awful lot of chaos in the past wee while. Things have not looked as we expected them to look on any given day—things are changing in a matter of hours—but whatever chaos there is, the Government need to be clear that this issue is really important. It transcends party lines, arguments within the Conservative party and all of that. This is about protecting children and vulnerable people, and ensuring that we have protections in place. We need to make sure that the “legal but harmful” provisions are included in the Bill.

The hon. Member for Leeds East talked about ensuring that vulnerable adults are included in the Bill. We cannot just have provisions in place for children when we are aware that a huge number of adults are vulnerable for various reasons—whether that is because of mental health conditions, learning difficulties or age—and are potentially not protected if “legal but harmful” does not make it over the final hurdle. I urge the Minister to ensure that it does. The key thing is to please bring the Bill back so that we can pass it into law.

--- Later in debate ---
Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

As the hon. Lady knows, I can speak to the Bill; I cannot speak to the business of the House—that is a matter for the business managers in the usual way. Department officials—some here and some back at the Department—have been working tirelessly on the Bill to ensure we can get it through in a timely fashion. I want to see it complete its Commons stages and go to the House of Lords as quickly as possible. Our target is to ensure that it receives safe passage in this Session of Parliament. Obviously, I cannot speak to the business of the House, which may alter as a consequence of the changes to Government.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

On that point, will the Minister assure us that he will push for the Bill to come back? Will he make the case to the business managers that the Bill should come back as soon as possible, in order to fulfil his aim of having it pass in this Session of Parliament?

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

As the hon. Lady knows, I cannot speak to the business of the House. What I would say is that the Department has worked tirelessly to ensure the safe passage of the Bill. We want to see it on the Floor of the House as quickly as possible—our only objective is to ensure that that happens. I hope that the business managers will be able to confirm shortly when that will be. Obviously, the hon. Lady can raise the issue herself with the Leader of the House at the business statement tomorrow.

Online Safety Bill

Kirsty Blackman Excerpts
Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

My hon. Friend raises an important point that deserves further consideration as the Bill progresses through its parliamentary stages. There is, of course, still a general presumption that any activity that is illegal offline and could also constitute illegal activity online—for example, promoting or sharing content that could incite people to commit violent acts—is within scope of the legislation. There are some priority illegal offences, which are set out in schedule 7, but the non-priority offences also apply if a company is made aware of content that is likely to be in breach of the law. I certainly think this is worth considering in that context.

In addition, the Bill makes it clear that platforms have duties to mitigate the risk of their service facilitating an offence, including where that offence may occur on another site, such as can occur in cross-platform child sexual exploitation and abuse—CSEA—offending, or even offline. This addresses concerns raised by a wide coalition of children’s charities that the Bill did not adequately tackle activities such as breadcrumbing—an issue my hon. Friend the Member for Solihull (Julian Knight), the Chair of the Select Committee, has raised in the House before—where CSEA offenders post content on one platform that leads to offences taking place on a different platform.

We have also tabled new clause 14 and a related series of amendments in order to provide greater clarity about how in-scope services should determine whether they have duties with regard to content on their services. The new regulatory framework requires service providers to put in place effective and proportionate systems and processes to improve user safety while upholding free expression and privacy online. The systems and processes that companies implement will be tailored to the specific risk profile of the service. However, in many cases the effectiveness of companies’ safety measures will depend on them making reasonable judgments about types of content. Therefore, it is essential to the effective functioning of the framework that there is clarity about how providers should approach these judgments. In particular, such clarity will safeguard against companies over-removing innocuous content if they wrongly assume mental elements are present, or under-removing content if they act only where all elements of an offence are established beyond reasonable doubt. The amendments make clear that companies must consider all reasonably available contextual information when determining whether content is illegal content, a fraudulent advert, content that is harmful to children, or content that is harmful to adults.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - -

I was on the Bill Committee and we discussed lots of things, but new clause 14 was not discussed: we did not have conversations about it, and external organisations have not been consulted on it. Is the Minister not concerned that this is a major change to the Bill and it has not been adequately consulted on?

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

As I said earlier, in establishing the threshold for priority illegal offences, the current threshold of laws that exist offline should provide good guidance. I would expect that as the codes of practice are developed, we will be able to make clear what those offences are. On the racial hatred that the England footballers received after the European championship football final, people have been prosecuted for what they posted on Twitter and other social media platforms. We know what race hate looks like in that context, we know what the regulatory threshold should look like and we know the sort of content we are trying to regulate. I expect that, in the codes of practice, Ofcom can be very clear with companies about what we expect, where the thresholds are and where we expect them to take enforcement action.

--- Later in debate ---
Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- View Speech - Hansard - - - Excerpts

I rise to speak in favour of amendments 15 to 19 in the names of my hon. Friends and, later, amendments 11 and 12 in the name of the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright).

As we discussed at great length in Committee—my first Bill Committee; a nice simple one to get me started—the Bill has a number of critical clauses to address the atrocious incidence of child sexual exploitation online. Amendments 15 to 19 are aimed at strengthening those protections and helping to ensure that the internet is a safer place for every young person. Amendments 15 and 16 will bring into scope tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material. Amendments 17 to 19 will tackle the issue of cross-platform abuse, where abuse starts on one platform and continues on another. These are urgent measures that children’s charities and advocacy groups have long called for, and I seriously hope this House will support them.

Last week, along with the shadow Minister and the then Minister, I attended an extremely moving reception hosted by one of those organisations, the NSPCC. It included a speech by Rachel, a mother of a victim of online grooming and child sexual exploitation. She outlined in a very powerful way how her son Ben was forced from the age of 13 to take and share photos of himself that he did not want to, and to enter Skype chats with multiple men. He was then blackmailed with those images and subjected to threats of violence to his family. Rachel said to us:

“We blamed ourselves and I thought we had failed…I felt like I hadn’t done enough to protect our children”.

I want to say to you, Rachel, that you did not fail Ben. Responsibility for what happened to Ben lies firmly with the perpetrators of these heinous crimes, but what did fail Ben and has failed our young people for far too long is the lack of urgency and political will to regulate the wild west of the internet. No one is pretending that this is an easy task, and we are dealing with a highly complex piece of legislation, but if we are to protect future Bens we have to strengthen this Bill as much as possible.

Another young woman, Danielle, spoke during the NSPCC event. She had been a victim of online CSE that had escalated into horrific real-world physical and sexual abuse. She told us how she has to live with the fear that her photos may appear online and be shared without her knowledge or control. She is a strong young woman who is moving on with her life with huge resilience, but her trauma is very real. Amendment 19 would ensure that proportionate measures are in place to prevent the encountering or dissemination of child abuse content—for example, through intelligence sharing of new and emerging threats. This will protect Danielle and people like her, giving them some comfort that measures are in place to stop the spread of these images and to place far more onus on the platforms to get on top of this horrific practice.

Amendments 11 and 12, in the name of the right hon. and learned Member for Kenilworth and Southam, will raise the threshold for non-broadcast media outlets to benefit from the recognised news publisher exemption by requiring that such publishers are subject to complaints procedures that are both suitable and sufficient. I support those amendments, which, while not perfect, are a step forward in ensuring that this exemption is protected from abuse.

I am also pleased that the Government have listened to some of my and other Members’ concerns and have now agreed to bring forward amendments at a later stage to exclude sanctioned publishers such as Russia Today from accessing this exemption. However, there are hundreds if not thousands of so-called news publishers across the internet that pose a serious threat, from the far right and also from Islamist, antisemitic and dangerous conspiratorial extremism. We must act to ensure that journalistic protections are not abused by those wishing to spread harm. Let us be clear that this is as much about protecting journalism as it is about protecting users from harm.

We cannot overstate the seriousness of getting this right. Carving out protections within the Bill creates a risk that if we do not get the criteria for this exemption right, harmful and extremist websites based internationally will simply establish offices in the UK, just so that they too can access this powerful new protection. Amendments 11 and 12 will go some way towards ensuring that news publishers are genuine, but I recognise that the amendments are not the perfect solution and that more work is needed as the Bill progresses in the other place.

In closing, I hope that we can find consensus today around the importance of protecting children online and restricting harmful content. It is not always easy, but I know we can find common ground in this place, as we saw during the Committee stage of the Bill when I was delighted to gain cross-party support to secure the introduction of Zach’s law, inspired by my young constituent Zach Eagling, which will outlaw the dreadful practice of epilepsy trolling online.

Nigel Evans Portrait Mr Deputy Speaker (Mr Nigel Evans)
- Hansard - - - Excerpts

You will resume your seat no later than 4.20 pm. We will therefore not put the clock on you.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I will try to avoid too much preamble, but I thank the former Minister, the hon. Member for Croydon South (Chris Philp), for all his work in Committee and for listening to my nearly 200 contributions, for which I apologise. I welcome the new Minister to his place.

As time has been short today, I am keen to meet the Minister to discuss my new clauses and amendments. If he cannot meet me, I would be keen for him to meet the NSPCC, in particular, on some of my concerns.

Amendment 196 is about using proactive technology to identify CSEA content, which we discussed at some length in Committee. The hon. Member for Croydon South made it very clear that we should use scanning to check for child sexual abuse images. My concern is that new clause 38, tabled by the Lib Dems, might exclude proactive scanning to look for child sexual abuse images. I hope that the Government do not lurch in that direction, because we need proactive scanning to keep children protected.

New clause 18 specifically addresses child user empowerment duties. The Bill currently requires that internet service providers have user empowerment duties for adults but not for children, which seems bizarre. Children need to be able to say yes or no. They should be able to make their own choices about excluding content and not receiving unsolicited comments or approaches from anybody not on their friend list, for example. Children should be allowed to do that, but the Bill explicitly says that user empowerment duties apply only to adults. New clause 18 is almost a direct copy of the adult user empowerment duties, with a few extra bits added. It is important that children have access to user empowerment.

Amendment 190 addresses habit-forming features. I have had conversations about this with a number of organisations, including The Mix; I regularly accessed its predecessor, The Site, more than 20 years ago. The Mix is concerned that 42% of young people surveyed by YoungMinds show addiction-like behaviour in what they are accessing on social media. There is nothing on that in this Bill. The Mix, the Mental Health Foundation, the British Psychological Society, YoungMinds and the Royal College of Psychiatrists are all unhappy about the Bill’s failure to regulate habit-forming features. It is right that we provide support for our children, and it is right that our children are able to access the internet safely, so it is important to address habit-forming behaviour.

Amendment 162 addresses child access assessments. The Bill currently says that providers need to do a child access assessment only if there is a “significant” number of child users. I do not think that is enough and I do not think it is appropriate, and the NSPCC agrees. The amendment would remove the word “significant.” OnlyFans, for example, should not be able to dodge the requirement to child risk assess its services because it does not have a “significant” number of child users. These sites are massively harmful, and we need to ensure changes are made so they cannot wriggle out of their responsibilities.

Finally, amendment 161 is about live, one-to-one oral communications. I understand why the Government want to exempt live, one-to-one oral communications, as they want to ensure that phone calls continue to be phone calls, which is totally fine, but they misunderstand the nature of things like Discord and how people communicate on Fortnite, for example. People are having live, one-to-one oral communications, some of which are used to groom children. We cannot explicitly exempt them and allow a loophole for perpetrators of abuse in this Bill. I understand what the Government are trying to do, but they need to do it in a different way so that children can be protected from the grooming behaviour we see on some online platforms.

Once again, if the Minister cannot accept these amendments, I would be keen to meet him. If he cannot meet me, I ask that he meet the NSPCC.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

We have had a wide-ranging debate of passion and expert opinion from Members in all parts of the House, which shows the depth of interest in this subject and the depth of concern to see the Bill delivered and to make sure we get it right. I speak as someone who only a couple of days ago became the Minister for online safety, although I was previously involved in engaging with the Government on this subject. As I said in my opening remarks, this has been an iterative process, where Members from across the House have worked successfully with the Government to improve the Bill. That is the spirit in which we should complete its stages, both in the Commons and in the Lords, and look at how we operate this regime when it has been created.

I wish to start by addressing remarks made by the hon. Member for Pontypridd (Alex Davies-Jones), the shadow Minister, and by the hon. Member for Cardiff North (Anna McMorrin) about violence against women and girls. There is a slight assumption that if the Government do not accept an amendment that writes “violence against women and girls” into the priority harms in the Bill, somehow the Bill does not address that issue. I think we would all agree that that is not the case. The provisions on harmful content that is directed at any individual, particularly the new harms offences approved by the Law Commission, do create offences in respect of harm that is likely to lead to actual physical harm or severe psychological harm. As the father of a teenage girl, who was watching earlier but has now gone to do better things, I say that the targeting of young girls, particularly vulnerable ones, with content that is likely to make them more vulnerable is one of the most egregious aspects of the way social media works. It is right that we are looking to address serious levels of self-harm and suicide in the Bill and in the transparency requirements. We are addressing the self-harm and suicide content that falls below the illegal threshold, where a vulnerable young girl is sent and prompted with content that can make her more vulnerable and could lead her to harm herself, or worse. It is absolutely right that that is within the scope of the Bill.

New clause 3, perfectly properly, cites international conventions on violence against women and girls, and how that is defined. At the moment, with the way the Bill is structured, the schedule 7 offences are all based on existing areas of UK law, where there is an existing, clear criminal threshold. Those offences, which are listed extensively, will all apply as priority areas of harm. If there is, through the work of the Law Commission or elsewhere, a clear legal definition of misogyny and violence against women and girls that is not included, I think it should be included within scope. However, if new clause 3 was approved, as tabled, it would be a very different sort of offence, where it would not be as clear where the criminal threshold applied, because it is not cited against existing legislation. My view, and that of the Government, is that existing legislation covers the sorts of offences and breadth of offences that the shadow Minister rightly mentioned, as did other Members. We should continue to look at this—

--- Later in debate ---
John Nicolson Portrait John Nicolson
- View Speech - Hansard - - - Excerpts

I wish to speak to new clause 33, my proposed new schedule 1 and amendments 201 to 203. I notice that the Secretary of State is off again. I place on record my thanks to Naomi Miles of CEASE—the Centre to End All Sexual Exploitation—and Ceri Finnegan of Barnardo’s for their support.

The UK Government have taken some steps to strengthen protections on pornography and I welcome the fact that young teenagers will no longer be able to access pornography online. However, huge quantities of extreme and harmful pornography remain online, and we need to address the damage that it does. New clause 33 would seek to create parity between online and offline content—consistent legal standards for pornography. It includes a comprehensive definition of pornography and puts a duty on websites not to host content that would fail to attain the British Board of Film Classification standard for R18 classification.

The point of the Bill, as the Minister has repeatedly said, is to make the online world a safer place, by doing what we all agree must be done—making what is illegal offline, illegal online. That is why so many Members think that the lack of regulation around pornography is a major omission in the Bill.

The new clause stipulates age and consent checks for anyone featured in pornographic content. It addresses the proliferation of pornographic content that is both illegal and harmful, protecting women, children and minorities on both sides of the camera.

The Bill presents an opportunity to end the proliferation of illegal and harmful content on the internet. Representations of sexual violence, animal abuse, incest, rape, coercion, abuse and exploitation—particularly directed towards women and children—are rife. Such content can normalise dangerous and abusive acts and attitudes, leading to real-world harm. As my hon. Friend the Member for Pontypridd (Alex Davies-Jones) said in her eloquent speech earlier, we are seeing an epidemic of violence against women and girls online. When bile and hatred are so prolific online, they bleed into the offline space. There are real-world harms that flow from that.

The Minister has said how much of a priority tackling violence against women and girls is for him. Knowing that, and knowing him, he will understand that pornography is always harmful to children, and certain kinds of pornographic content are also potentially harmful to adults. Under the Video Recordings Act 1984, the BBFC has responsibility for classifying pornographic content to ensure that it is not illegal, and that it does not promote an interest in abusive relationships, such as incest. Nor can it promote acts likely to cause serious physical harm, such as breath restriction or strangulation. In the United Kingdom, it is against the law to supply pornographic material that does not meet this established BBFC classification standard, but there is no equivalent standard in the online world because the internet evolved without equivalent regulatory oversight.

I know too that the Minister is determined to tackle some of the abusive and dangerous pornographic content online. The Bill does include a definition of pornography, in clause 66(2), but that definition is inadequate; it is too brief and narrow in scope. In my amendment, I propose a tighter and more comprehensive definition, based on that in part 3 of the Digital Economy Act 2017, which was debated in this place and passed into law. The amendment will remove ambiguity and prevent confusion, ensuring that all websites know where they stand with regard to the law.

The new duty on pornographic websites aligns with the UK Government’s 2020 legislation regulating UK-established video-sharing platforms and video-on-demand services, both of which appeal to the BBFC’s R18 classification standards. The same “high standard of rules in place to protect audiences”, as the 2020 legislation put it, and “certain content standards” should apply equally to online pornography and offline pornography, UK-established video-sharing platforms and video-on-demand services.

Let me give some examples sent to me by Barnardo’s, the children’s charity, which, with CEASE, has done incredibly important work in this area. The names have been changed in these examples, for obvious reasons.

“There are also children who view pornography to try to understand their own sexual abuse. Unfortunately, what these children find is content that normalises the most abhorrent and illegal behaviours, such as 15-year-old Elizabeth, who has been sexually abused by a much older relative for a number of years. The content she found on pornography sites depicted older relatives having sex with young girls and the girls enjoying it. It wasn’t until she disclosed her abuse that she realised that it was not normal.

Carrie is a 16-year-old who was being sexually abused by her stepfather. She thought this was not unusual due to the significant amount of content she had seen on pornography sites showing sexual relationships within stepfamilies.”

That is deeply disturbing evidence from Barnardo’s.

Although in theory the Bill will prevent under-18s from accessing such content, the Minister knows that under-18s will be able to bypass regulation through technology like VPNs, as the DCMS Committee and the Bill Committee—I served on both—were told by experts in various evidence sessions. The amendment does not create a new law; it merely moves existing laws into the online space. There is good cause to regulate and sometimes prohibit certain damaging offline content; I believe it is now our duty to provide consistency with legislation in the online world.

Kirsty Blackman Portrait Kirsty Blackman
- View Speech - Hansard - -

I want to talk about several things, but particularly new clause 7. I am really pleased that the new clause has come back on Report, as we discussed it in the Bill Committee but unfortunately did not get enough support for it there—as was the case with everything we proposed—so I thank the right hon. Member for Kingston upon Hull North (Dame Diana Johnson) for tabling it. I also thank my hon. Friend the Member for Inverclyde (Ronnie Cowan) for his lobbying and for providing us with lots of background information. I agree that it is incredibly important that new clause 7 is agreed, particularly the provisions on consent and making sure that participants are of an appropriate age to be taking part. We have heard so many stories of so many people whose videos are online—whose bodies are online—and there is nothing they can do about it because of the lack of regulation. My hon. Friend the Member for Ochil and South Perthshire (John Nicolson) has covered new clause 33 in an awful lot of detail—very good detail—so I will not comment on that.

The right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) mentioned how we need to get the balance right, and specifically talked about the role of the regulator. In many ways, this Bill has failed to get the balance right in its attempts to protect children online. Many people who have been involved in writing this Bill, talking about this Bill, scrutinising this Bill and taking part in every piece of work that we have done around it do not understand how children use the internet. Some people do, absolutely, but far too many of the people who have had any involvement in this Bill do not. They do not understand the massive benefits to children of using the internet, the immense amount of fun they can have playing Fortnite, Fall Guys, Minecraft, or whatever it is they happen to be playing online, and how important that is to them in today’s crazy world with all of the social media pressures. Children need to decompress. This is a great place for children to have fun—to have a wonderful time—but they need to be protected, just as we would protect them going out to play in the park and in all other areas of life. We have a legal age for smoking, for example. We need to make sure that protections are in place, and they need to be stronger than the ones currently in the Bill.

I did not have a chance earlier—or I do not think I did—to support the clause about violence against women and girls. As I said in Committee, I absolutely support that being in the Bill. The Government may say, “Oh, we don’t need to have this in the Bill because it runs through everything,” but having that written in the Bill would make it clear to internet service providers—to all those people providing services online and having user-generated content on their sites—how important this is and how much of a scourge it is. Young women who spend their time on social media are more likely to have lower outcomes in life as a result of problematic social media use and the pain and suffering that it causes. We should be putting such a measure in the Bill, and I will continue to argue for that.

We have talked a lot about pornographic content in this section. There is not enough future-proofing in the Bill. My hon. Friend the Member for Ochil and South Perthshire and I tabled amendment 158 because we are concerned about that lack of future-proofing. The amendment edits the definition of “content”. The current definition of “content” says basically anything online, and it includes a list of stuff. We have suggested that it should say “including but not limited to”, on the basis that we do not know what the internet will look like in two years’ time, let alone what it will look like in 20 years’ time. If this Bill is to stand the test of time, it needs to be clear that that list is not exhaustive. It needs to be clear that, when we are getting into virtual reality metaverses where people are meeting each other, that counts as well. It needs to be clear that the sex dungeon that exists in the child’s game Roblox is an issue—that that content is an issue no matter whether it fits the definition of “content” or whether it fits the fact that it is written communication, images or whatever. It does not need to fit any of that. If it is anything harmful that children can find on the internet, it should be included in that definition of “content”, no matter whether it fits any of those specific categories. We just do not know what the internet is going to look like.

I have one other specific thing in relation to the issues of content and pornography. One of the biggest concerns that we heard is the massive increase in the number of self-generated child sexual abuse images. A significant number of new images of child sexual abuse are self-generated. Everybody has a camera phone these days—kids included. They have much more potential to get themselves into really uncomfortable and difficult situations than when most of us were younger. There is so much potential for that to be manipulated unless we get this right.

Online Safety Bill (Fifteenth sitting)

Kirsty Blackman Excerpts
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

The provisions in clauses 170 to 172, as the Minister has said, repeal or amend existing laws for the purposes of the Bill. As Labour supports the need to legislate on the issue of online safety, we will not oppose the clauses. However, I want to note that the entire process, up until the final abandonment of part 3 of the Digital Economy Act under clause 171, appears shambolic. It has been five years now since that part of the Act could have been implemented—five years during which children could have been better protected from the harms of pornographic content.

When the Government eventually admitted that part 3 was being ditched, the Minister at the time, the hon. Member for Boston and Skegness (Matt Warman), said that the Government would seek to take action on pornography more quickly than on other parts of the online harms regime. Stakeholders and charities have expressed concerns that we could now see a delay to the implementation of the duties on pornographic content providers, which is similar to the postponement and eventual abandonment of part 3 of the Digital Economy Act. I appreciate that the Minister gave some reassurance of his

“desire to get this done as quickly as possible”—[Official Report, Online Safety Bill Committee, 9 June 2022; c. 308.]

in our debate on clauses 31 to 33, but would it not be better to set out timeframes in the Bill?

Under clause 193, it appears that the only clauses in part 5 to be enacted once the Bill receives Royal Assent will be the definitions—clause 66 and clause 67(4)—and not the duties. That is because Ofcom is expected to issue a call for evidence, after which draft proposals for consultation are published, which then need to be agreed by the Secretary of State and laid before Parliament. There are opportunities there for delays and objections at any stage and, typically, enforcement will be implemented only in a staged fashion, from monitoring to supervision. The consultations and safeguarding processes are necessary to make the guidance robust; we understand that. However, children cannot wait another three years for protections, having been promised protection under part 3 of the Digital Economy Act five years ago, which, as I have said, was never implemented.

The provisions on pornography in part 5 of the Bill require no secondary legislation so they should be implemented as quickly as possible to minimise the amount of time children continue to be exposed to harmful content. It would be irresponsible to wait any longer than absolutely necessary, given the harms already caused by this drawn-out process.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - -

Thank you, Sir Roger, for chairing this meeting this morning. I want to agree with the Opposition’s points about the timing issue. If one Act is to repeal another, we need to make sure that there is no gap in the middle: if the repeal takes effect on a given day, the Bill’s corresponding provisions should be in force and working on that same day, rather than leaving a potential set-up gap.

On clause 170 and repealing the part of the Communications Act 2003 on video-sharing platform services, some concerns have been raised that the requirements in the Online Safety Bill do not exactly mirror the provisions in the video-sharing platform rules. I am not saying necessarily or categorically that the Online Safety Bill is less strong than the video-sharing platform rules currently in place. However, if the legislation on video-sharing platform services is repealed, the Online Safety Act, as it will be, will become the main way of regulating video-sharing platforms, and there could be a degradation in the protections provided on those platforms and an increase in some of the issues and concerns we have seen raised. Will the Minister keep that under review and consider how that could be improved? We do not want to see this getting worse simply because one regime has been switched for another that, as the Minister said, is broader and has stronger protections. Will he keep under review whether that turns out to be the case when the Act has bedded in and Ofcom has the ability to take action and properly regulate—particularly, in this case, video-sharing platforms?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I agree with the hon. Member for Worsley and Eccles South that we want to see these provisions brought into force as quickly as possible, for the reasons that she set out. We are actively thinking about ways of ensuring that these provisions are brought into force as fast as possible. It is something that we have been actively discussing with Ofcom, and that, I hope, will be reflected in the road map that it intends to publish before the summer. That will of course remain an area of close working between the Department for Digital, Culture, Media and Sport and Ofcom, ensuring that these provisions come into force as quickly as possible. Of course, the illegal duties will be brought into force more quickly. That includes the CSEA offences set out in schedule 6.

The hon. Member for Aberdeen North raised questions in relation to the repeal of part 3 of the Digital Economy Act. Although that is on the statute book, it was never commenced. When it is repealed, we will not be removing from force something that is applied at the moment, because the statutory instrument to commence it was never laid. So the point she raised about whether the Bill would come into force the day after the Digital Economy Act is repealed does not apply; but the point she raised about bringing this legislation into force quickly is reasonable and right, and we will work on that.

The hon. Lady asked about the differences in scope between the video-sharing platform and the online safety regime. As I said, the online safety regime does have an increased scope compared with the VSP regime, but I think it is reasonable to keep an eye on that as she suggested, and keep it under review. There is of course a formal review mechanism in clause 149, but I think that more informally, it is reasonable that as the transition is made we keep an eye on it, as a Government and as parliamentarians, to ensure that nothing gets missed out.

I would add that, separately from the Bill, the online advertising programme is taking a holistic look at online advertising in general, and it will look at matters that may also touch on the VSPs and what they regulate.

Question put and agreed to.

Clause 170 accordingly ordered to stand part of the Bill.

Clauses 171 and 172 ordered to stand part of the Bill.

Clause 173

Powers to amend section 36

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - - - Excerpts

Good morning, Sir Roger. As the Minister has outlined, clause 173 gives the Secretary of State the power to amend the list of fraud offences in what will be section 36 in relation to the duties about fraudulent advertising. Although we recognise that this power is subject to some constraints, Labour has concerns about what we consider to be an unnecessary power given to the Secretary of State to amend duties about fraudulent advertising on category 1 services.

We welcome the provisions outlined in clause 173(2), which lists the criteria that any new offences must meet before the Secretary of State may include them in the list of fraud offences in section 36. The Minister outlined some of those. Along the same lines, the provision in clause 173(3) to further limit the Secretary of State’s power to include new fraud offences—it lists types of offences that may not be added to section 36—is a positive step.

However, we firmly believe that delegated law making of this nature, even when there are these minor constraints in place, is a worrying course for the Government to pursue when we have already strongly verbalised our concerns about Ofcom’s independence. Can the Minister alleviate our concerns by clarifying exactly how this process will work in practice? He must agree with the points that colleagues from across the House have made about the importance of Ofcom being truly independent and free from any political persuasion, influence or control. We all want to see the Bill change things for the better so I am keen to hear from the Minister the specific reasoning behind giving the Secretary of State the power to amend this important legislation through what will seemingly be a simple process.

As we all know, clause 174 allows the Secretary of State to make regulations to amend or repeal provisions relating to exempt content or services. Regulations made under this clause can be used to exempt certain content or services from the scope of the regulatory regime, or to bring them into scope. It will come as no surprise to the Minister that we have genuine concerns about the clause, given that it gives the Secretary of State of the day the power to amend the substantive scope of the regulatory regime. In layman’s terms, we see this clause as essentially giving the Secretary of State the power to, through regulations, exempt certain content and services from the scope of the Bill, or bring them into scope. Although we agree with the Minister that a degree of flexibility is crucial to the Bill’s success and we have indeed raised concerns throughout the Bill’s proceedings about the need to future-proof the Bill, it is a fine balance, and we feel that these powers in this clause are in excess of what is required. I will therefore be grateful to the Minister if he confirms exactly why this legislation has been drafted in a way that will essentially give the Secretary of State free rein on these important regulations.

Clauses 175 and 176 seek to give the Secretary of State additional powers, and again Labour has concerns. Clause 175 gives the Secretary of State the power to amend the list in part 2 of schedule 1, specifically paragraph 10. That list sets out descriptions of education and childcare relating to England; it is for the relevant devolved Ministers to amend the list in their respective areas. Although we welcome the fact that certain criteria must be met before the amendments can be made, this measure once again gives the Secretary of State of the day the ability substantively to amend the scope of the regime more broadly.

Those concerns are felt even more strongly when we consider clause 176, which gives the Secretary of State the power to amend three key areas in the Bill—schedules 5, 6 and 7, which relate to terrorism offences, to child sexual exploitation and abuse content offences—except those extending to Scotland—and to priority offences in some circumstances. Alongside stakeholders, including Carnegie, we strongly feel that the Secretary of State should not be able to amend the substantive scope of the regime at this level, unless moves have been initiated by Ofcom and followed by effective parliamentary oversight and scrutiny. Parliament should have a say in this. There should be no room for this level of interference in a regulatory regime, and the Minister knows that these powers are at risk of being abused by a bad actor, whoever the Secretary of State of the day may be. I must, once again, press the Minister to specifically address the concerns that Labour colleagues and I have repeatedly raised, both during these debates and on Second Reading.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I have a couple of questions, particularly on clause 176 and the powers to amend schedules 6 and 7. I understand the logic for schedule 5 being different—in that terrorism offences are a wholly reserved matter—and therefore why only the Secretary of State would be making any changes.

My question is on the difference in the ways to amend schedules 6 and 7—I am assuming that Government amendment 126, which asks the Secretary of State to consult Scottish Ministers and the Department of Justice in Northern Ireland, and which we have already discussed, will be voted on and approved before we come to clause 176. I do not understand the logic for having different procedures to amend the child sexual exploitation and abuse offences and the priority offences. Why have the Government chosen two different procedures for amending the two schedules?

I understand why that might not be a terribly easy question to answer today, and I would be happy for the Minister to get in touch afterwards with the rationale. It seems to me that both areas are very important, and I do not quite understand why the difference is there.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Let me start by addressing the questions the shadow Minister raised about these powers. She used the phrase “free rein” in her speech, but I would not exactly describe it as free rein. If we turn to clause 179, which we will come to in a moment or two, and subsection (1)(d), (e), (f) and (g), we see that all the regulations made under clauses 173 to 176, which we are debating, require an SI under the affirmative procedure. Parliament will therefore get a chance to have its say, to object and indeed to vote down a provision if it wishes to. It is not that the Secretary of State can act alone; changes are subject to the affirmative SI procedure.

It is reasonable to have a mechanism to change the lists of priority offences and so on by affirmative SI, because the landscape will change and new offences will emerge, and it is important that we keep up to date. The only alternative is primary legislation, and a slot for a new Act of Parliament does not come along all that often—perhaps once every few years for any given topic. I think that would lead to long delays—potentially years—before the various exemptions, lists of priority offences and so on could be updated. I doubt that it is Parliament’s intention, and it would not be good for the public if we had to wait for primary legislation to change the lists. The proposed mechanism is the only sensible and proportionate way to do it, and it is subject to a parliamentary vote.

A comment was made about Ofcom’s independence. The way the offences are defined has no impact on Ofcom’s operational independence. That is about how Ofcom applies the rules; this is about what the rules themselves are. It is right that we are able to update them relatively nimbly by affirmative SI.

The hon. Member for Aberdeen North asked about the differences in the way schedules 6 and 7 can be updated. I will happily drop her a line with further thoughts if she wants me to, but in essence we are happy to get the Scottish child sexual exploitation and abuse offences, set out in part 2 of schedule 6, adopted as soon as Scottish Ministers want. We do not want to delay any measures on child exploitation and abuse, and that is why it is done automatically. Schedule 7, which sets out the other priority offences, could cover any topic at all—any criminal offence could fall under that schedule—whereas schedule 6 is only about child sexual exploitation and abuse. Given that the scope of schedule 7 takes in any criminal offence, it is important to consult Scottish Ministers if it is a Scottish offence but then use the statutory instrument procedure, which applies it to the entire UK internet. Does the hon. Lady want me to write to her, or does that answer her question?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

That is actually incredibly helpful. I do not need a further letter, thanks.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am grateful to the hon. Lady for saving DCMS officials a little ink, and electricity for an email.

I hope I have addressed the points raised in the debate, and I commend the clause to the Committee.

Question put and agreed to.

Clause 173 accordingly ordered to stand part of the Bill.

Clauses 174 and 175 ordered to stand part of the Bill.

Clause 176

Powers to amend Schedules 5, 6 and 7

Amendment made: 126, in clause 176, page 145, line 4, at end insert—

“(5A) The Secretary of State must consult the Scottish Ministers before making regulations under subsection (3) which—

(a) add an offence that extends only to Scotland, or

(b) amend or remove an entry specifying an offence that extends only to Scotland.

(5B) The Secretary of State must consult the Department of Justice in Northern Ireland before making regulations under subsection (3) which—

(a) add an offence that extends only to Northern Ireland, or

(b) amend or remove an entry specifying an offence that extends only to Northern Ireland.”—(Chris Philp.)

This amendment ensures that the Secretary of State must consult the Scottish Ministers or the Department of Justice in Northern Ireland before making regulations which amend Schedule 7 in connection with an offence which extends to Scotland or Northern Ireland only.

Clause 176, as amended, ordered to stand part of the Bill.

Clause 177

Power to make consequential provision

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Not really. If the hon. Lady has finished with her own amendments, we should, as a courtesy, allow the SNP spokesperson to speak to her amendment first.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Thank you, Sir Roger. I thank the shadow Minister for running through some of our shared concerns about the clauses. Similarly, I will talk first about some of the issues and questions that I have about the clauses, and then I will speak to amendment 76. Confusingly, amendment 76 was tabled to clause 189, which we are not discussing right now. I should have raised that when I saw the provisional selection of amendments. I will do my best not to stray too far into clause 189 while discussing the amendment.

I have raised before with the Minister some of the questions and issues that I have. Looking specifically at clause 181, I very much appreciate the clarification that he has given us about users, what the clause actually means, and how the definition of “user” works. To be fair, I agree with the way the definition of “user” is written. My slight concern is that, in measuring the number of users, platforms might find it difficult to measure the number of unregistered users and the number of users who are accessing the content through another means.

Let us say, for example, that someone is sent a WhatsApp message with a TikTok link and they click on that. I do not know whether TikTok has the ability to work out who is watching the content, or how many people are watching it. Therefore, I think that TikTok might have a difficulty when it comes to the child safety duties and working out the percentage or number of children who are accessing the service, because it will not know who is accessing it through a secondary means.

I am not trying to give anyone a get-out clause. I am trying to ensure that Ofcom can make sure that platforms that have a significant number of children accessing them through secondary means are still subject to the child safety duties, even though there may not be a high number of children accessing the platform or the provider directly. My major concern is how we assess whether they are subject to the child safety duties laid out in the Bill.

I will move straight on to our amendment 76, which would amend the definition of “content” in clause 189. I have raised this issue with the Minister already. The clause, as amended, would state that

“‘content’ means anything communicated by means of an internet service, whether publicly or privately, including but not limited to”—

and then a list. The reason I suggest that we should add those words “but not limited to” is that if we are to have a list, we should either make an exhaustive list or be clear that there are other things that may not be on it.

I understand that it could be argued that the word “including” suggests that the provision actually goes much wider than what is in the list—that may well be the argument the Minister makes—but can we have some more clarity from him? If he is not willing to accept the amendment but is willing to be very clear that the provision does include things that we have not thought of and that do not currently exist—that it genuinely includes anything communicated by means of an internet service—that will be very helpful.

I think that the amendment would add something positive to the Bill. It is potentially the most important amendment that I have tabled in relation to future-proofing the Bill, because it does feel as though the definition of “content”, even though it says “including”, is unnecessarily restrictive and could be open to challenge should someone invent something that is not on the list and say, “Well, it’s not mentioned, so I am not going to have to regulate this in the way we have to regulate other types of content.”

I have other questions about the same provision in clause 189, but I will hold on to those until we come to the next grouping.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I rise briefly to support amendment 76, in the name of the hon. Member for Aberdeen North. Labour supports broadening the definition of “content” in this way. I refer the Minister to our earlier contributions about the importance of including newspaper comments, for example, in the scope of the Bill. This is a clear example of a key loophole in the Bill. We believe that a broadened definition of “content” would be a positive step, future-proofing the Bill and preventing unnecessary harm from forms of content that do not yet exist.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

As we know, the clause sets out the meanings of various terms used in the Bill. Throughout our Committee debates, Labour has raised fundamental concerns on a number of points where we feel the interpretation of the Bill requires clarification. We raised concerns as early as clause 8, when we considered the Bill’s ability to capture harm in relation to newly produced CSEA content and livestreaming. The Minister may feel he has sufficiently reassured us, but I am afraid that simply is not the case. Labour has no specific issues with the interpretations listed in clause 189, but we will likely seek to table further amendments on Report in the areas that we feel require clarification.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

In one of our earlier debates, I asked the Minister about the difference between “oral” and “aural”, and I did not get a very satisfactory answer. I know the difference in their dictionary definition—I understand that they are different, although the words sound the same. I am confused that clause 189 uses “oral” as part of the definition of content, but clause 49 refers to

“one-to-one live aural communications”

in defining things that are excluded.

I do not understand why the Government have chosen to use those two different words in different places in the Bill. It strikes me that, potentially, we mean one or the other. If they do mean two different things, why has one thing been chosen for clause 49 and another thing for clause 189? Why has the choice been made that clause 49 relates to communications that are heard, but clause 189 relates to communications that are said? I do not quite get the Government’s logic in using those two different words.

I know this is a picky point, but in order to have good legislation, we want it to make sense, for there to be a good rationale for everything that is in it and for people to be able to understand it. At the moment, I do not properly understand why the choice has been made to use two different words.

More generally, the definitions in clause 189 seem pretty sensible, notwithstanding what I said in the previous debate in respect of amendment 76, which, with your permission, Sir Roger, I intend to move when we reach the appropriate point.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

As the hon. Member for Pontypridd said, clause 189 sets out various points of definition and interpretation necessary for the Bill to be understood and applied.

I turn to the question raised by the hon. Member for Aberdeen North. First, I strongly commend and congratulate her on having noticed the use of the two words. Anyone who thinks that legislation does not get properly scrutinised by Parliament has only to look to the fact that she spotted this difference, 110 pages apart, in two different clauses—clauses 49 and 189. That shows that these things do get properly looked at. I strongly congratulate her on that.

I think the best way of addressing her question is probably to follow up with her after the sitting. Clause 49 relates to regulated user-to-user content. We are in clause 49(2)—is that right?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Subsection (5).

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

It is cross-referenced in subsection (5). The use of the term “aural” in that subsection refers to sound only—what might typically be considered telephony services. “Oral” is taken to cover livestreaming, which includes pictures and voice. That is the intention behind the use of the two different words. If that is not sufficient to explain the point—it may not be—I would be happy to expand in writing.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

That would be helpful, in the light of the concerns I raised and what the hon. Member for Pontypridd mentioned about gaming, and how those communications work on a one-to-one basis. Having clarity in writing on whether clause 49 relates specifically to telephony-type services would be helpful, because that is not exactly how I read it.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Given that the hon. Lady has raised the point, it is reasonable that she requires more detail. I will follow up in writing on that point.

Amendment proposed: 76, in clause 189, page 154, line 34, after “including” insert “but not limited to”.—(Kirsty Blackman.)

This amendment clarifies the definition of “content” in the Bill so that anything communicated by means of an internet service is considered content, not only the examples listed.

Question put, That the amendment be made.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

The Opposition spokesperson has said it all.

Amendment 141 agreed to.

Clause 192, as amended, ordered to stand part of the Bill.

Clause 193

Commencement and transitional provision

None Portrait The Chair
- Hansard -

Amendment 139 was tabled by a Member who is not a member of the Committee, and nobody has claimed it, so we come to amendment 49.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I beg to move amendment 49, in clause 193, page 161, line 1, leave out subsection (2) and insert—

“(2) Subject to subsection (2A) below, the other provisions of this Act come into force on such day as the Secretary of State may by regulations appoint.

(2A) The provisions of Part 5 shall come into force at the end of the period of three months beginning with the day on which this Act is passed.”

This amendment would bring Part 5 into force three months after the Act is passed.

We all understand the need for the Bill, which is why we have been generally supportive in Committee. I hope we can also agree that the measures that the Bill introduces must come into force as soon as is reasonably possible. That is particularly important for the clauses introducing protections for children, who have been subject to the harms of the online world for far too long already. I was glad to hear the Minister say in our discussions of clauses 31 to 33 that the Government share the desire to get such protections in place quickly.

My hon. Friend the Member for Worsley and Eccles South also spoke about our concerns about the commencement and transitional provisions when speaking to clauses 170 to 172. We fundamentally believe that the provisions on pornography in part 5 cannot, and should not, be susceptible to further delay, because they require no secondary legislation. I will come to that point in my comments on the amendment. More broadly, I will touch briefly on the reasons why we cannot wait for the legislation and make reference to a specific case that I know colleagues across the House are aware of.

My hon. Friend the Member for Reading East (Matt Rodda) has been a powerful voice on behalf of his constituents Amanda and Stuart Stephens, whose beloved son Olly was tragically murdered in a field outside his home. A BBC “Panorama” programme, shown only a few days ago, investigated the role that social media played in Olly’s death. It specifically highlighted disturbing evidence that some social media algorithms may still promote violent content to vulnerable young people. That is another example of the urgent need for the Bill, along with a regulatory process to keep people safe online.

We also recognise, however, the important balance between the need for effective development of guidance by Ofcom, informed by consultation, and the need to get the duties up and running. In some cases, that will mean stipulating deadlines in the Bill; their absence at present is, we feel, a serious omission and oversight.

The amendment would bring part 5 of the Bill into force three months after it is enacted. The Minister knows how important part 5 is, so I do not need to repeat myself. The provisions of the amendment, including subsequent amendments that Labour and others will likely table down the line, are central to keeping people safe online. We have heard compelling evidence from experts and speeches from colleagues across the House that have highlighted how vital it is that the Bill goes further on pornographic content. The amendment is simple. It seeks to make real, meaningful change as soon as is practically possible. The Bill is long delayed, and providers and users are desperate for clarity and positive change, which is what led us to table the amendment.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

In the interests of not having to make a speech in this debate, I want to let the hon. Member know that I absolutely support the amendment. It is well balanced, brings the most important provisions into force as soon as possible, and allows the Secretary of State to appoint dates for the others.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I welcome the hon. Member’s intervention, and I am grateful for her and her party’s support for this important amendment.

It is also worth drawing colleagues’ attention to the history of these issues, which have been raised in this place before. We know there was reluctance on the part of Ministers, when the Digital Economy Act 2017 was on the parliamentary agenda, to commence the all-important part 3, which covered many of the provisions now in part 5. Ultimately, the empty promises made by the Minister’s former colleagues have led to huge, record failures, even though the industry is ready, having had years to prepare to implement the policy. I want to place on record my thanks to campaigning groups such as the Age Verification Providers Association and others, which have shown fierce commitment in getting us this far.

It might help if I cast colleagues’ minds back to the Digital Economy Act 2017, which received Royal Assent in April of that year. Following that, in November 2018, the then Minister of State for Digital and Creative Industries told the Science and Technology Committee that part 3 of the DEA would be in force “by Easter next year”. Then, in December 2018, both Houses of Parliament approved the necessary secondary legislation, the Online Pornography (Commercial Basis) Regulations 2018, and the required statutory guidance.

But shortly after, in April 2019, the first delay arose when the Government published an online press release stating that part 3 of the DEA would not come into force until 15 July 2019. That date never arrived. On 20 June 2019, weeks before part 3 was due to come into force, the then Under-Secretary of State told the House of Lords that the Government had failed to notify the European Commission of the statutory guidance, which would need to be done, and that that would result in a delay to the commencement of part 3

“in the region of six months”.—[Official Report, House of Lords, 20 June 2019; Vol. 798, c. 883.]

However, on 16 October 2019, the then Secretary of State announced via a written statement to Parliament that the Government

“will not be commencing part 3 of the Digital Economy Act 2017 concerning age verification for online pornography.”—[Official Report, 16 October 2019; Vol. 666, c. 17WS.]

A mere 13 days later, the Government called a snap general election. I am sure those are pretty staggering realities for the Minister to hear—and defend—but I am willing to listen to his defence. It really is not good enough. The industry is ready, the technology has been there for quite some time, and, given this Government’s fondness for a U-turn, there are concerns that part 5 of the Bill, which we have spent weeks deliberating, could be abandoned just as part 3 of the DEA was.

The Minister has failed to concede on any of the issues we have raised in Committee. It seems we are dealing with a Government who are ignoring the wide-ranging gaps and issues in the Bill. He has a relatively last-ditch opportunity to at least bring about some positive change, and to signify that he is willing to admit that the legislation as it stands is far from perfect. The provisions in part 5 are critical—they are probably the most important in the entire Bill—so I urge him to work with Labour to make sure they are put to good use in a more than reasonable timeframe.

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I do not think that the right hon. Lady has misunderstood what I said. I said that the new clause would allow the Secretary of State to appoint a new or existing body as the statutory user advocate, so it could very much be either.

New clause 3 would also rebalance the interests of children against the vocal and well-resourced regulated companies. I think that is a key argument for having an advocacy body. Without such a counterbalance, large tech companies could attempt to capture independent expert voices, fund highly selective research with the intent to skew the evidence base, and then challenge regulatory decisions with the evidence base they have created.

Those tactics are not new; similar tactics are used in other regulated sectors, such as the tobacco industry. In line with other sectors, the user advocacy body should be funded by a levy on regulated companies. That would be in line with the “polluter pays” principle in part 6 and would be neutral to the Exchequer—another reason to accept it. Compared with the significant benefits and improved outcomes it would create, the levy would represent only a minimal additional burden on companies.

There is strong support for the creation of a user advocate. Research by the NSPCC shows that 88% of UK adults who responded to a YouGov survey think that it is necessary for the Bill to introduce a requirement for an independent body that can protect the interests of children at risk of online harms, including grooming and child sexual abuse.

It is also a popular option among children. YoungMinds has said that young people do not feel they are being included enough in the drafting of the Bill. It evidenced that with research it had undertaken, which found that almost 80% of young people aged 11 to 25 surveyed had never even heard of the Bill.

A young woman told the NSPCC why she felt a children’s advocacy body is needed. She is a survivor of online grooming, and it is worth sharing what she said in full, because it is powerful and we have not shared the voices of young people enough. She said:

“When I was 13, a man in his 30s contacted me on Facebook. I added him because you just used to add anyone on Facebook. He started messaging me and I liked the attention. We’d speak every day, usually late at night for hours at a time…He started asking for photos, so I sent some. Then he asked for some explicit photos, so I did that too, and he reciprocated…In my eyes, telling anyone in my life about this man was not an option. We need to stop putting the responsibility on a vulnerable child to prevent crime and start living in a world which puts keeping children safe first. That means putting child safety at the heart of policy. I want a statutory child user advocacy body funded by the industry levy. This would play a vital role in advocating for children’s rights in regulatory debates. Being groomed made me feel incredibly vulnerable, isolated, and weak. I felt I had no one who was on my side. Having a body stand up for the rights of children in such a vulnerable position is invaluable…it is so rare that voices like mine have a chance to be heard by policy makers. Watching pre legislative debates I’ve been struck by how detached from my lived experience they can be”—

that is very much the point that my hon. Friend the Member for Batley and Spen made—

“and indeed the lived experiences of thousands of others. If we want to protect children, we need to understand and represent what they need.”

I hope that the Committee will recognise the bravery of that young woman in speaking about her experiences as a survivor of online grooming. I hope that the Minister will respect the insights she offers and consider the merits of having a user advocacy body to support children and young people experiencing harms online.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I read new clause 3 in conjunction with the starred new clause 44, because it makes sense to consider the funding of the advocacy body, and the benefits of that funding, when discussing the merits of such a body. That is partly because the funding of the advocacy body, and the fact that it needs funding at all, is key to its operation and a key reason why we need it.

--- Later in debate ---
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

The hon. Lady is making some excellent points. I wholeheartedly agree with her about funding for bodies that might be able to support the advocacy body or act as part of it. She makes a really important point, which we have not focused on enough during the debate, about the positive aspects of the internet. It is very easy to get bogged down in all the negative stuff, which a lot of the Bill focuses on, but she is right that the internet provides a safe space, particularly for young people, to seek out their own identity. Does she agree that the new clause is important because it specifically refers to protected characteristics and to the Equality Act 2010? I am not sure where else that appears in the Bill, but it is important that it should be there. We are thinking not just about age, but about gender, disability and sexual orientation, which is why this new clause could be really important.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I absolutely agree. I had not thought about it in those terms, but the hon. Member is right that the new clause gives greater importance to those protected characteristics and lays that out in the Bill.

I appreciate that, under the risk assessment duties set out in the Bill, organisations have to look at protected characteristics in groups and at individuals with those protected characteristics, which I welcome, but I also welcome the inclusion of protected characteristics in the new clause in relation to the duties of the advocacy body. I think that is really important, especially, as the hon. Member for Batley and Spen just said, in relation to the positive aspects of the internet. It is about protecting free speech for children and young people and enabling them to find community and enjoy life online and offline.

Will the Minister give serious consideration to the possibility of a user advocacy body? Third sector organisations are calling for that, and I do not think Ofcom could possibly have the expertise to match such a body.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I want briefly to interject to underline the point I made in my intervention on the hon. Member for Worsley and Eccles South. I welcome the discussion about victims’ support, which picks up on what we discussed on clause 110. At that point I mentioned the NSPCC evidence that talked about the importance of third party advocacy services, due to the lack of trust in the platforms, as well as for some of the other reasons that the hon. Members for Worsley and Eccles South, for Batley and Spen, and for Aberdeen North have raised.

When we discussed clause 110, the Minister undertook to think about the issue seriously and to talk to the Treasury about whether funding could be taken directly from fines rather than those all going into the Treasury coffers. I hope the debate on new clause 3 will serve to strengthen his resolve, given the strength of support for such a measure, whether that is through a formal user advocacy service or by using existing organisations. I hope he uses the debate to strengthen his arguments about such a measure with the Treasury.

I will not support the new clause tabled by the hon. Member for Worsley and Eccles South, because I think the Minister has already undertaken to look at this issue. As I say, I hope this discussion strengthens his resolve to do so.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

There have not been all that many times during the debate on the Bill when the Minister has so spectacularly missed the point as he has on this section. I understand everything he said about provisions already being in place to protect children and the provisions regarding the super-complaints, but the new clause is not intended to be a replacement for the super-complaints procedure, which we all support—in fact, we have tried to strengthen that procedure. The new clause is intended to be an addition—another, very important layer.

Unfortunately, I do not have at the front of my mind the legislation that set up the Children’s Commissioner for Scotland, or the one for England. The Minister talked through some of the provisions and phrasing in the Children Act 2004. He said that the role of the Children’s Commissioner for England is to encourage bodies to act positively on behalf of children—to encourage. There is no requirement for the body to act in the way the Children’s Commissioner says it should act. Changes have been made in Wales establishing the Future Generations Commissioner, who has far more power.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

As far as I can tell, the user advocacy body proposed in new clause 3 would not have the ability to compel Ofcom either.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

But it would be a statutory consultee that is specifically mentioned in this provision. I cannot find in the Bill a provision giving Ofcom a statutory duty to consult the four Children’s Commissioners. The new clause would make the children’s advocacy body a statutory consultee in decisions that affect children.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The Bill will require Ofcom to consult people who represent the interests of children. Although they are not named, it would be astonishing if the four Children’s Commissioners were not among the first people consulted when Ofcom develops the relevant codes of practice. The statutory obligation to consult those groups when developing codes of practice and, indeed, guidance is set out in clauses 37(6)(d) and 69(3)(d).

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

That is very helpful, but there are still shortcomings in what the Minister says. The Bill, as drafted, requires Ofcom to require things of other organisations. Some of the detail is in the Bill, some of the detail will come in secondary legislation and some of the detail will come in the codes of practice published by Ofcom. We broadly agree that the Bill will ensure people are safer on the internet than they currently are, but we do not have all the detail on the Government’s intent. We would like more detail on some things, but we are not saying, “We need every little bit of detail.” If we did, the Bill would not be future-proof. We would not be able to change and update the Bill if we required everything to be in the Bill.

The Bill is not a one-off; it will continually change and grow. Having a user advocacy body would mean that emerging threats can quickly be brought to Ofcom’s attention. Unlike the Children’s Commissioners, who have a hundred other things to do, the entire purpose of this body would be to advocate on behalf of children online. The Children’s Commissioners do an amazing job, but this is not their No. 1 priority. If the Minister wants this to be a world-leading Bill, its No. 1 priority should be to protect the human rights of children.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I think the hon. Lady is being a little unfair to the Children’s Commissioners. Dame Rachel de Souza is doing a fantastic job of advocating specifically in the digital sphere. She really is doing a fantastic job, and I say that as a Minister. I would not say she is leaving any gaps.

These digital children’s safety issues link to wider children’s safety issues that exist offline, such as sexual exploitation, grooming and so on, so it is useful that the same person advocates for children in both the offline and online worlds.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

The new clause asks for an additional body. It is not saying the Children’s Commissioners should be done away with. The Children’s Commissioners do an amazing job, as we have recognised, but the No. 1 priority, certainly for the Children’s Commissioner in Scotland, is to protect the human rights of children; it is not to protect children online, which is what the user advocacy body would do. The body would give the benefit of its experience and would use its resources, time and energy specifically to advocate between Ofcom, children and children’s organisations and groups.

The Minister is right that the Bill takes massive steps forward in protecting children online, and he is right that the Children’s Commissioners do a very good job. The work done by the Children’s Commissioners in giving us evidence on behalf of children and children’s organisations has been incredibly powerful and incredibly helpful, but there is still a layer missing. If this Bill is to be future-proof, if it is to work and if it is not to put an undue burden on charitable organisations, we need a user advocacy body. The Minister needs to consider that.

I appreciate that the Government provide money to victim support organisations, which is great, but I am also making a case about potential victims. If the money goes only to those who support people who have already been harmed, it will not allow them to advocate to ensure that more people are not harmed. It will allow them to advocate on behalf of those who have been harmed—absolutely—but it will not effectively tackle potential and emerging harms. That is a key area where the Bill falls short. I am quite disappointed that the Minister has not recognised that something may be lacking and is so keen to defend his position, because it seems to me that the position of the Opposition is so obviously the right one.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I wholeheartedly agree with what the hon. Member for Aberdeen North just said, but I wish to emphasise some elements because it seems to me that the Minister was not listening, although he has listened to much that has been said. I made some specific points, used quotes and brought forward some evidence. He feels that children have been consulted in the drafting of the Bill; I cited a YoungMinds survey that showed that that was very much not what young people feel. YoungMinds surveyed a large group of young people and a very large proportion of them had not even heard of the Bill.

The evidence of the young survivor of online grooming was very powerful. She very much wanted a user-advocacy body and spoke strongly about that. The Minister is getting it wrong if he thinks that somebody in that situation, who has been groomed, would go to a parent. The quote that I cited earlier was:

“Being groomed made me feel incredibly vulnerable, isolated, and weak. I felt I had no one who was on my side.”

There were clearly adults in her life she could have gone to, but she did not because she was in that vulnerable position—a position of weakness. That is why some kind of independent advocacy body for children is so important.

I do not think children and young people feel consulted about the Bill; the organisations and charities are telling us as much. I join all Opposition Members in supporting and paying tribute to the remarkable job that the Children’s Commissioner does. I quoted her setting out her worries about the Bill. I quoted her saying that

“the Bill does not do enough to respond to individual cases of abuse and that it needs to do more to understand issues and concerns directly from children.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]

That is what she said. She did not say, “I’m the person charged with doing this. I’m the person who has the resource and my office has the resource.”