All 9 Baroness Ritchie of Downpatrick contributions to the Online Safety Act 2023

Wed 1st Feb 2023: Online Safety Bill, Lords Chamber
Tue 25th Apr 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 2)
Thu 27th Apr 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 2)
Tue 2nd May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Tue 2nd May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 2)
Tue 23rd May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Thu 6th Jul 2023: Online Safety Bill, Lords Chamber (Report stage: Part 2)
Mon 10th Jul 2023: Online Safety Bill, Lords Chamber (Report stage: Part 1)
Wed 6th Sep 2023: Online Safety Bill, Lords Chamber

Online Safety Bill

Baroness Ritchie of Downpatrick Excerpts
Baroness Ritchie of Downpatrick (Lab)

My Lords, it is a pleasure to follow other noble Lords on this issue. This legislation is undoubtedly long overdue. Without doubt, the internet has changed the way in which we live our lives. For many this change has been positive. However, the internet, in particular social media, has created a toxic online world. We have only to listen to the noble Baroness, Lady Kidron, and my noble friend Lady Anderson to realise that. As a result, the internet has become abusive, misogynistic and dangerous. Many noble Lords from across the House have personal experience of this toxic world of online abuse. Any measures that seek to place curbs and limits on that type of content are to be welcomed.

While it is important to protect adults from abuse online, it is more important that we get the Bill’s protections right for children. I welcome its provisions in respect of age verification, but for many across the House it is a surprise that we are even debating age verification. Legislation was passed in 2017 but inexplicably not implemented by the Government. That legislation would have ensured that age verification was in place to protect children over five years ago. While the Bill includes age assurance measures, it is disappointing that its provisions are not as robust as those passed in 2017. It is also concerning that age verification is not applied uniformly across Parts 3 and 5. Will the Minister and his colleagues bring forward government amendments on this issue in Committee?

As this Bill makes progress through this House, it will be important to ensure that age verification is robust and consistent, but we must also ensure that what happened to the Digital Economy Act cannot be allowed to happen to this legislation. The Government cannot be allowed to slow down or even abandon age verification measures. This Bill, while welcome, needs to be amended to ensure that age verification is actually implemented and enforced. This must happen as quickly as possible after the Bill becomes law. I believe that age verification should be in place no later than six months after this Bill is passed.

The need for robust age verification is beyond any reasonable argument. Children should be protected from viewing harmful content online. The law in this regard should be simple. If a platform contains pornographic content, children should be prevented from viewing it. More than that, pornography that is prohibited offline should be prohibited online. Reading the provisions of this Bill carefully, it is my belief that the Bill falls short in both regards.

I look forward to the passage of this Bill through the House. While it is very welcome that we are discussing this Bill, it is important that the provisions and clauses within it are strengthened.

Online Safety Bill

Baroness Ritchie of Downpatrick Excerpts
Moved by
12BA: Clause 6, page 5, line 16, at end insert—
“(g) the duties on regulated provider pornographic content set out in section 72.”

Member’s explanatory statement: This amendment requires user-to-user services to comply with duties under Part 5.
Baroness Ritchie of Downpatrick (Lab)

My Lords, I wish to speak to the amendments in this group, which are in my name and are also supported by the noble Lord, Lord Morrow. There are three interconnected issues raised by these amendments. First, there should be a level playing field across the Bill for regulating pornographic content. The same duties should apply to all content, whether it is found on social media or a user-to-user pornography site, which fall under Part 3, or on a commercial pornography site with producer content, which falls within Part 5 of the Bill. Secondly, these common duties in respect of pornography must come into effect at the same time. By requiring that the same duties under Clause 72 apply to both Part 3 and Part 5 services, my amendments ensure that they will be regulated for pornographic content at the same time, bringing uniformity across the Bill.

Thirdly, through Amendment 125A, I wish to probe how Part 5 will function more specifically. Will any website or platform actually be covered by Part 5 if these amendments are not made? I had the privilege of speaking earlier to the Minister on these issues, and one question I would pose at this stage is, how many sites are covered by Part 5? That is one of the questions to which your Lordships’ House requires an answer.

The issue of ensuring that pornography is named as a harm on the face of the Bill, and that all pornographic content is age-verified, is not new. Indeed, it has been raised from the outset of the Bill, including at Second Reading in your Lordships’ House. In pre-legislative scrutiny even, the Joint Committee on the draft Bill recommended that

“key, known risks of harm to children are set out on the face of the Bill. We would expect these to include (but not be limited to) access to or promotion of age-inappropriate material such as pornography”.

To partly address this issue, the Government added Part 5 to the Bill, which sought to ensure that any platform that was not in scope of Part 3 but which included pornographic content should be subject to age-verification measures. I know that other noble Lords have put forward amendments that would add to the list of online harms on the face of the Bill, which we will be debating later in group 10.

Other amendments add to the duties that platforms hosting pornographic content need to comply with. These include Amendment 184, in the name of the noble Baroness, Lady Kidron, which proposes that consent be obtained from performers, and Amendment 185, in the name of the noble Baroness, Lady Benjamin, which seeks to ensure parity between what is permissible offline and online. The amendments I propose in this group are, I believe, complementary to those amendments. My amendments seek to ensure that duties across Part 3 and Part 5 in respect of pornography are aligned. Therefore, those additional duties contained in other amendments would be aligned across the Bill as well. When we get to that stage in Committee, I will be supporting those amendments.

The harms of pornography are well known and I do not propose to go over those again. I do, however, want to highlight one issue raised in a recent report published by the Children’s Commissioner for England. Her report states:

“Pornography is not confined to dedicated adult sites. We found that Twitter was the online platform where young people were most likely to have seen pornography. Fellow mainstream social networking platforms Instagram and Snapchat rank closely after dedicated pornography sites.”


The report found that 41% of children encountered pornography on Twitter, 33% on Instagram and 32% on Snapchat, while only 37% of children encountered pornography on the main commercial pornography websites. This means that children are more likely to encounter pornographic content on social media. That is why we need to ensure that standards across all platforms are uniform. The same standards need to apply to social media as to commercial pornography websites.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

Perhaps I will speak to the noble Lord afterwards and make sure I have his question right before I do so.

I hope that answers the questions from the noble Baroness, Lady Ritchie, and that on that basis, she will be happy to withdraw her amendment.

Baroness Ritchie of Downpatrick (Lab)

My Lords, this has been a very wide-ranging debate, concentrating not only on the definition of pornography but on the views of noble Lords in relation to how it should be regulated: whether it should be regulated uniformly, as the noble Baroness, Lady Kidron, the noble Lords, Lord Bethell and Lord Browne, and I myself believe, or whether there should be a graduated response, which seems to be the view of the noble Lords, Lord Allan and Lord Clement-Jones.

I believe that all pornography should be treated the same; there should be no graduated response. It is something that is pernicious and leads to unintended consequences for many young people, so it needs to be regulated in all its forms. I think that is the point that the noble Lord, Lord Bethell, was making. I believe that these amendments should have been debated along with those of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, because then we could have had an even wider-ranging debate, and I look forward to that in the further groups in the days to come. The focus should be on the content, not on the platform, and the content in question is pornography.

I agree with the noble Baroness, Lady Kidron, that porn is not the only harm, and I will be supporting her amendments. I believe that they should be in the Bill because if we are serious about dealing with these issues, they have to be in there.

I do not think my amendments are suggesting that children will be removed from social media. I agree that it is a choice to remove pornography or to age-gate. Twitter is moving to subscriber content anyway, so it can do it; the technology is already available to do that. I believe you just age-gate the porn content, not the whole site. I agree with the noble Lord, Lord Clement-Jones, as I said. These amendments should have been debated in conjunction with those of the noble Lord, Lord Bethell, and the noble Baroness, Lady Kidron, as I believe that the amendments in this group are complementary to those, and I think I already said that in my original submission.

I found the Minister’s response interesting. Obviously, I would like time to read Hansard. I think certain undertakings were given, but I want to see clearly spelled out where they are and to discuss with colleagues across the House where we take these issues and what we come back with on Report.

I believe that these issues will be debated further in Committee when the amendments from the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, are debated. I hope that in the intervening period the Minister will have time to reflect on the issues raised today about Parts 3 and 5 and the issue of pornography, and that he will be able to help us in further sessions in assuaging the concerns that we have raised about pornography. There is no doubt that these issues will come back. The only way that these issues can be addressed, that pornography can be dealt with and that all our children throughout the UK can be protected is through proper regulation.

I think we all need further reflection. I will see, along with colleagues, whether it is possible to come back on Report. In the meantime, I beg leave to withdraw the amendment.

Amendment 12BA withdrawn.

Online Safety Bill

Baroness Ritchie of Downpatrick Excerpts
Finally, there is the question of simplicity and clarity. As we discussed on the first day of Committee, business wants clarity, campaigners want clarity, parents want clarity, and Ofcom could do with some clarity. If not the four Cs, my challenge to the Government is to deliver a schedule that has the clarity and simplicity of the amendments in front of us, in which harm is defined by category not by individual content measurements, so that it is flexible now and into the future, and foregrounds the specific role of the system design not only as an accomplice to the named harm but as a harm itself. I beg to move.
Baroness Ritchie of Downpatrick (Lab)

My Lords, it is a pleasure to follow the noble Baroness, Lady Kidron. I have listened intently today, and there is no doubt that this Bill not only presents many challenges but throws up the complexity of the whole situation. I think it was the noble Lord, Lord Kamall, in an earlier group who raised the issues of security, safety and freedom. I would add the issue of rights, because we are trying to balance all these issues and characterise them in statute, vis-à-vis the Bill.

On Tuesday, we spoke about one specific harm—pornography—on the group of amendments that I had brought forward. But I made clear at that time that I believe this is not the only harm, and I fully support the principles of the amendments from the noble Baroness, Lady Kidron. I would obviously like to get some clarity from her on the amendments, particularly as to how they relate to other clauses in the Bill.

The noble Baroness has been the pioneer in this field, and her expertise is well recognised across the House. I believe that these amendments really take us to the heart of the Bill and what we are trying to achieve—namely, to identify online harms to children, counteract them and provide a level of safety to young people.

As the noble Lord, Lord Clement-Jones, said on Tuesday,

“there is absolutely no doubt that across the Committee we all have the same intent; how we get there is the issue between us”.—[Official Report, 25/4/23; col. 1196.]

There is actually not that much between us. I fully agree with the principle of putting some of the known harms to children in the Bill. If we know the harms, there is little point in waiting for them to be defined in secondary legislation by Clause 54.

It is clear to me that there are harms to children that we know about, and those harms will not change. It would be best to name those harms clearly in the Bill when it leaves this House. That would allow content providers, search engines and websites in scope of the Bill to prepare to make any changes they need to keep children safe. Perhaps the Minister could comment on that aspect. We also know that parents will expect some harms to be in the Bill. The noble Baroness, Lady Kidron, laid out what they are, and I agree with her analysis. These issues are known and we should not wait for them to be named.

While known harms should be placed into the Bill, I know, understand and appreciate that the Government are concerned about future-proofing. However, I am of the view that a short list of key topics will not undermine that principle. Indeed, the Joint Committee’s report on the draft Bill stated,

“we recommend that key, known risks of harm to children are set out on the face of the Bill”.

In its report on the Bill, the DCMS Select Committee in the other place agreed, saying

“that age-inappropriate or otherwise inherently harmful content and activity (like pornography, violent material, gambling and content that promotes or is instructive in eating disorders, self-harm and suicide) should appear on the face of the Bill”.

Has there been any further progress in discussions on those issues?

At the beginning of the year, the Children’s Commissioner urged Parliamentarians

“to define pornography as a harm to children on the face of the … Bill, such that the regulator, Ofcom, may implement regulation of platforms hosting adult content as soon as possible following the passage of the Bill”.

I fully agree with the Children’s Commissioner. While the ways in which pornographic content is delivered will change over time, the fact that pornography is harmful to children will not change. With the speed of technology—something that the noble Lord, Lord Allan of Hallam, knows a lot more about than the rest of us, having worked in this field—delivery will no doubt change and we will be presented with new types of challenges.

I therefore urge the Government to support the principle that the key risks are in the Bill, and I thank the noble Baroness, Lady Kidron, for raising this important principle. However, I hope she will indulge me as I seek to probe some of the detail of her amendments and their interactions with the architecture of other parts of the Bill. As I said when speaking to Clause 49 on Tuesday, the devil is obviously in the detail.

First, Clause 54 defines what constitutes

“Content that is harmful to children”,

and Clause 205 defines harm, and Amendment 93 proposes an additional new list of harms. As I have already said, I fully support the principle of harms being in the Bill, but I raise a question for the noble Baroness. How does she see these three definitions working together? That might refer back to a preliminary discussion that we had in the tearoom earlier.

These definitions of harms are in addition to the content to be defined as primary priority content and priority content. Duties in Clauses 11 and 25 continue to refer to these two types of content for Part 3 services, but Amendments 20 and 74 would remove the need for risk assessments in Clauses 10 and 24 to address these two types of content. It seems that the amendments could create a tension in the Bill, and I am interested to ascertain how the noble Baroness, Lady Kidron, foresees that tension operating. Maybe she could give us some detail in her wind-up about that issue. An explanation of that point may bring some clarity to understanding how the new schedule that the noble Baroness proposes will work alongside the primary priority content and the priority content lists. Will the schedule complement primary priority content, or will it be an alternative?

Secondly, as I said, some harms are known but there are harms that are as yet unknown. Will the noble Baroness, Lady Kidron, consider a function to add to the list of content in her Amendment 93, in advance of us coming back on Report? There is no doubt that the online space is rapidly changing, as this debate has highlighted. I can foresee a time when other examples of harm should be added to the Bill. I accept that the drafting is clear that the list is not exclusive, but it is intended to be a significant guide to what matters to the public and Parliament. I also accept that Ofcom can provide guidance on other content under Amendment 123, but, without a regulatory power added to Amendment 93, it feels that we are perhaps missing a belt-and-braces approach to online harms to children. After all, our principal purpose here is to protect children from online harm.

I commend the noble Baroness, Lady Kidron, on putting these important amendments before the Committee, and I fully support the principle of what she seeks to achieve. But I hope that, on further reflection, she will look at the points I have suggested. Perhaps she might suggest other ideas in her wind-up, and we could have further discussions in advance of Report. I also look forward to the Minister’s comments on these issues.

The Lord Bishop of Oxford

My Lords, I support Amendments 20, 93 and 123, in my name and those of the noble Baroness, Lady Kidron, and the noble Lords, Lord Bethell and Lord Stevenson. I also support Amendment 74 in the name of the noble Baroness, Lady Kidron. I pay tribute to the courage of all noble Lords and their teams, and of the Minister and the Bill team, for their work on this part of the Bill. This work involves the courage to dare to look at some very difficult material that, sadly, shapes the everyday life of too many young people. This group of amendments is part of a package of measures to strengthen the protections for children in the Bill by introducing a new schedule of harms to children and plugging a chronological gap between Part 3 and Part 5 services, on when protection from pornography comes into effect.

Every so often in these debates, we have been reminded of the connection with real lives and people. Yesterday evening, I spent some time speaking on the telephone with Amanda and Stuart Stephens, the mum and dad of Olly Stephens, who lived in Reading, which is part of the diocese of Oxford. Noble Lords will remember that Olly was tragically murdered, aged 13, in a park near his home, by teenagers of a similar age. Social media played a significant part in the investigation and in the lives of Olly and his friends—specifically, social media posts normalising knife crime and violence, with such a deeply tragic outcome.

Online Safety Bill

Baroness Ritchie of Downpatrick Excerpts
Moved by
29: Clause 11, page 11, line 25, at end insert—
“, except for pornographic content where age verification must always be applied, notwithstanding section 3(3)(a) of the Communications Act 2003.”

Member’s explanatory statement: This amendment would require a user-to-user service to apply age verification for pornographic content regardless of its size or capacity.
Baroness Ritchie of Downpatrick (Lab)

My Lords, I am very happy to move Amendment 29 and to speak to Amendments 83 and 103, which are also in my name. We have just had a debate about the protection of children online, and this clearly follows on from that.

The intention of the Bill is to set general parameters through which different content types can be regulated. The problem with that approach, as the sheer number of amendments highlights, is this: not all content and users are the same, and therefore cannot be treated in the same way. Put simply, not all content online should be legislated for in the same way. That is why the amendments in this group are needed.

Pornography is a type of content that cannot be regulated in general terms; it needs specific provisions. I realise that some of these issues were raised in the debate last Tuesday on amendments in my name, and again on Thursday when we discussed harms to children. I recognise too that, during his response to Thursday’s debate, the Minister made a welcome announcement on primary priority content which I hope will be set out in the Bill, as we have been asking for during this debate. While we wait to see the detail of what that announcement means, I think it safe to assume that pornography will be one of the harms named on the Bill, which makes discussion of these amendments that bit more straightforward.

Given that context, in Clause 11(3), user-to-user services that fall under the scope of Part 3 of the Bill have a duty to prevent children from accessing primary priority content. This duty is repeated in Clause 25(3) for search services. That duty is, however, qualified by the words,

“using proportionate systems and processes”.

It is the word “proportionate” and how that would apply to the regulation of pornography that is at the heart of the issue.

Generally speaking, acting in a proportionate way is a sensible approach to legislation and regulation. For the most part, regulation and safeguards should ensure that a duty is not onerous or that it does not place a disproportionate cost on the service provider that may make their business unviable. While that is the general principle, proportionality is not an appropriate consideration for all policy decisions.

In the offline world, legislation and regulation are not always proportionate. This is even more stark when regulating for children. The noble Lord, Lord Bethell, raised the issue of the corner shop last Tuesday, and that example is apt to highlight my point today. We do not take a proportionate approach to the sale of alcohol or cigarettes. We do not treat a corner shop differently from a supermarket. It would be absurd if I were to suggest that a small shop should apply different age checks for children when selling alcohol, compared to the age checks we expect a large supermarket to apply. In the same way, we already do not apply proportionality to some online activities. For example, gambling is an activity that is age-verified for children. Indeed, gambling companies are not allowed to make their product attractive to children and must advertise in a regulated way to avoid harm to children and young people. The harm caused to children by gambling is significant, so the usual policy considerations of proportionality do not apply. Clearly, both online and offline, there are some goods and services to which a proportionality test is not applied; there is no subjectivity. A child cannot buy alcohol or gamble and should not be able to access pornography.

In the UK, there is a proliferation of online gambling sites. It would be absurd to argue that the size of a gambling company or the revenue that company makes should be a consideration in whether it should utilise age verification to prevent children placing a bet. In the same way, it would be absurd to argue that the size or revenue of a pornographic website could be used as an argument to override a duty to ensure that age verification is employed to ensure that children do not access that website.

This is not a grey area. It is beyond doubt that exposing children to pornography is damaging to their health and development. The Children’s Commissioner’s report from this year has been much quoted already in Committee but it is worth reminding your Lordships what she found: that pornography was “widespread and normalised”, to the extent that children cannot opt out. The average age at which children first see pornography is 13. By age nine, 10% had seen it, 27% had seen it by age 11 and half had seen it by age 13. The report found that frequent users of pornography are more likely to engage—unfortunately and sadly—in physically aggressive sex acts.

There is nothing proportionate about the damage of pornographic content. The size, number of visitors, financial budget or technical know-how must not be considerations as to whether or not to deploy age checks. If a platform is incapable for any reason of protecting children from harmful exposure to pornography, it must remove that content. The Bill should be clear: if there is pornography on a website, it must use age verification. We know that pornographic websites will do all they can to evade age verification. In France and Germany, which are ahead of us in passing legislation to protect minors from pornography, regulators are tangled up in court action as the pornographic sites they first targeted for enforcement action argue against the law.

We must also anticipate the response of websites that are not dedicated exclusively to pornography, especially social media—a point we touched on during Tuesday’s debate. Reuters reported last year that an internal Twitter presentation stated that 13% of tweets were pornographic. Indeed, the Children’s Commissioner has found that Twitter is the platform where young people are most likely to encounter pornographic content. I know that some of your Lordships are concerned about age-gating social media. No one is suggesting that social media should exclude children, a point that has been made already. What I am suggesting is that pornography on those platforms should be subject to age verification. The capabilities to do this already exist. New accounts on Twitter have to opt in to view pornographic content. Why can the opt-in function not be age-gated? Twitter is moving to subscription content. Why can it not make pornographic content subscription based, with the subscription being age-verified? The solutions exist.

The Minister may seek to reassure the House that the Bill as drafted would not allow any website or search facility regulated under Part 3 that hosts pornographic content to evade its duties because of size, capacity or cost. But, as we have seen in France, these terms will be subject to court action. I therefore trust that the Government will bring forward an amendment to ensure that any platform that hosts pornographic content will employ age verification, regardless of any other factors. Perhaps the Minister in his wind-up can provide us with some detail or a hint of a future amendment at Report. I look forward to hearing and considering the Minister’s response. I beg to move.

Baroness Benjamin (LD)

My Lords, I wish to speak in support of Amendments 29, 83 and 103 in the name of the noble Baroness, Lady Ritchie. I am extremely pleased that the Minister said last Tuesday that pornography will be within primary priority content; he then committed on Thursday to naming primary priority content in the Bill. This is good news. We also know that pornography will come within the child safety duties in Clause 11. This makes me very happy.

In the document produced for the Government in January 2021, the BBFC said that there were millions of pornographic websites—I repeat, millions—and many of these will come within Part 3 of the Bill because they allow users to upload videos, make comments on content and chat with other users. Of course, some of these millions of websites will be very large, which means by definition that we expect them to come within the scope of the Bill. Under Clause 11(3) user-to-user services have a duty to prevent children accessing primary priority content. The duty is qualified by the phrase

“using proportionate systems and processes”.

The factors for deciding what is proportionate are set out in Clause 11(11): the potential harm of the content based on the children’s risk assessment, and the size and capacity of the provider of the service. Amendments 29, 83 and 103 tackle the issue of size and capacity.

Online Safety Bill

Baroness Ritchie of Downpatrick Excerpts
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I think those last two comments were what are known in court as leading questions.

As the noble Baroness, Lady Ritchie of Downpatrick, said herself, some of the ground covered in this short debate was covered in previous groups, and I am conscious that we have a later grouping where we will cover it again, including some of the points that were made just now. I therefore hope that noble Lords will understand if I restrict myself at this point to Amendments 29, 83 and 103, tabled by the noble Baroness, Lady Ritchie.

These amendments seek to mandate age verification for pornographic content on a user-to-user or search service, regardless of the size and capacity of a service provider. The amendments also seek to remove the requirement on Ofcom to have regard to proportionality and technical feasibility when setting out measures for providers on pornographic content in codes of practice. While keeping children safe online is the top priority for the Online Safety Bill, the principle of proportionate, risk-based regulation is also fundamental to the Bill’s framework. It is the Government’s considered opinion that the Bill as drafted already strikes the correct balance between these two.

The provisions in the Bill on proportionality are important to ensure that the requirements in the child-safety duties are tailored to the size and capacity of providers. It is also essential that measures in codes of practice are technically feasible. This will ensure that the regulatory framework as a whole is workable for service providers and enforceable by Ofcom. I reassure your Lordships that the smaller providers or providers with less capacity are still required to meet the child safety duties where their services pose a risk to children. They will need to put in place sufficiently stringent systems and processes that reflect the level of risk on their services, and will need to make sure that these systems and processes achieve the required outcomes of the child safety duty. Wherever in the Bill they are regulated, companies will need to take steps to ensure that they cannot offer pornographic content online to those who should not see it. Ofcom will set out in its code of practice the steps that companies in the scope of Part 3 can take to comply with their duties under the Bill, and will take a robust approach to sites that pose the greatest risk of harm to children, including sites hosting online pornography.

The passage of the Bill should be taken as a clear message to providers that they need to begin preparing for regulation now—indeed, many are. Responsible providers should already be factoring in regulatory compliance as part of their business costs. Ofcom will continue to work with providers to ensure that the transition to the new regulatory framework will be as smooth as possible.

The Government expect companies to use age-verification technologies to prevent children accessing services that pose the highest risk of harm to children, such as online pornography. The Bill will not mandate that companies use specific technologies to comply with new duties because, as noble Lords have heard me say before, what is most effective in preventing children accessing pornography today might not be equally effective in future. In addition, age verification might not always be the most appropriate or effective approach for user-to-user companies to comply with their duties. For instance, if a user-to-user service, such as a particular social medium, does not allow pornography under its terms of service, measures such as strengthening content moderation and user reporting would be more appropriate and effective for protecting children than age verification. This would allow content to be better detected and taken down, instead of restricting children from seeing content which is not allowed on the service in the first place. Companies may also use another approach if it is proportionate to the findings of the child safety risk assessment and a provider’s size and capacity. This is an important element to ensure that the regulatory framework remains risk-based and proportionate.

In addition, the amendments in the name of the noble Baroness, Lady Ritchie, risk inadvertently shutting children out of large swathes of the internet that are entirely appropriate for them to access. This is because it is impossible totally to eliminate the risk that a single piece of pornography or pornographic material might momentarily appear on a site, even if that site prohibits it and has effective systems in place to prevent it appearing. Her amendments would have the effect of essentially requiring every service to block children through the use of age verification.

Those are the reasons why the amendments before us are not ones that we can accept. Mindful of the fact that we will return to these issues in a future group, I invite the noble Baroness to withdraw her amendment.

Baroness Ritchie of Downpatrick (Lab)

My Lords, I thank all noble Lords who have participated in this wide-ranging debate, in which various issues have been raised.

The noble Baroness, Lady Benjamin, made the good point that there needs to be a level playing field between Parts 3 and 5, which I originally raised and which other noble Lords raised on Tuesday of last week. We keep coming back to this point, so I hope that the Minister will take note of it on further reflection before we reach Report. Pornography needs to be regulated on a consistent basis across the Bill.

Online Safety Bill

The Lord Bishop of Oxford

My Lords, it is such a privilege to follow the noble Baroness, Lady Benjamin. I pay tribute to her years of campaigning on this issue and the passion with which she spoke today. It is also a privilege to follow the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, in supporting all the amendments in this group. They are vital to this Bill, as all sides of this Committee agree. They all have my full support.

When I was a child, my grandparents’ home, like most homes, was heated by a coal fire. One of the most vital pieces of furniture in any house where there were children in those days was the fireguard. It was there to prevent children getting too near to the flame and the smoke, either by accident or by design. It needed to be robust, well secured and always in position, to prevent serious physical harm. You might have had to cut corners on various pieces of equipment for your house, but no sensible family would live without the best possible fireguard they could find.

We lack any kind of fireguard at present and the Bill currently proposes an inadequate fireguard for children. A really important point to grasp on this group of amendments is that children cannot be afforded the protections that the Bill gives them unless they are identified as children. Without that identification, the other protections fail. That is why age assurance is so foundational to the safety duties and mechanisms in the Bill. Surely, I hope, the Minister will acknowledge both that we have a problem and that the present proposals offer limited protection. We have a faulty fireguard.

These are some of the consequences. Three out of five 11 to 13 year-olds have unintentionally viewed pornography online. That is most of them. Four out of five 12 to 15 year-olds say they have had a potentially harmful experience online. That is almost universal. Children as young as seven are accessing pornographic content and three out of five eight to 11 year-olds—you might want to picture a nine year-old you know—have a social media profile, when they should not access those sites before the age of 13. That profile enables them to view adult content. The nation’s children are too close to the fire and are being harmed.

There is much confusion about what age assurance is. As the noble Baroness, Lady Kidron, has said, put simply it is the ability to estimate or verify an individual’s age. There are many different types of age assurance, from facial recognition to age verification, which all require different levels of information and can give varying levels of assurance. At its core, age assurance is a tool which allows services to offer age-appropriate experiences to their users. The principle is important, as what might be appropriate for a 16 year-old might be inappropriate for a 13 year-old. That age assurance is absolutely necessary to give children the protections they deserve.

Ofcom’s research shows that more than seven out of 10 parents of children aged 13 to 17 were concerned about their children seeing age-inappropriate content or their child seeing adult or sexual content online. Every group I have spoken to about the Bill in recent months has shared this concern. Age assurance would enable services to create age-appropriate experiences for children online and can help prevent children’s exposure to this content. The best possible fireguard would be in place.

Different levels of age assurance are appropriate in different circumstances. Amendments 161 and 142 establish that services which use age assurance must do so in line with the basic rules of the road. They set out that age assurance must be proportionate to the level of risk of a service. For high-risk services, such as pornography, sites must establish the age of their users beyond reasonable doubt. Equally, a service which poses no risk may not need to use age assurance or may use a less robust form of age assurance to engage with children in an age-appropriate manner—for example, serving them the terms and conditions in a video format.

As has been said, age assurance must be privacy-preserving. It must not be used as an excuse for services to use the most intrusive technology for data-extractive purposes. These are such common-sense amendments, but vital. They will ensure that children are prevented from accessing the most high-risk sites, enable services to serve their users age-appropriate experiences, and ensure that age assurance is not used inappropriately in a way that contravenes a user’s right to privacy.

As has also been said, there is massive support for this more robust fireguard in the country at large, across this House and, I believe, in the other place. I have not yet been able to understand, or begin to understand, the Government’s reasons for not providing the best protection for our children, given the aim of the Bill. Better safeguards are technically possible and eminently achievable. I would be grateful if the Minister could attempt to explain what exactly he and the Government intend to do, given the arguments put forward today and the ongoing risks to children if these amendments are not adopted.

Baroness Ritchie of Downpatrick (Lab)

My Lords, it is a pleasure to follow the right reverend Prelate the Bishop of Oxford. He used an interesting analogy of the fireguard; what we want in this legislation is a strong fireguard to protect children.

Amendments 183ZA and 306 are in my name, but Amendment 306 also has the name of the noble Lord, Lord Morrow, on it. I want to speak in support of the general principles raised by the amendments in this group, which deal with five specific areas, namely: the definition of pornography; age verification; the consent of those participating in pornographic content; ensuring that content which is prohibited offline is also prohibited online; and the commencement of age verification. I will deal with each of these broad topics in turn, recognising that we have already dealt with many of the issues raised in this group during Committee.

As your Lordships are aware, the fight for age verification has been a long one. I will not relive that history but I remind the Committee that when the Government announced in 2019 that they would not implement age verification, the Minister said:

“I believe we can protect children better and more comprehensively through the online harms agenda”.—[Official Report, Commons, 17/10/19; col. 453.]


Four years later, the only definition for pornography in the Bill is found in Clause 70(2). It defines pornographic content as

“produced solely or principally for the purpose of sexual arousal”.

I remain to be convinced that this definition is more comprehensive than that in the Digital Economy Act 2017.

Amendment 183ZA is a shortened version of the 2017 definition. I know that the Digital Economy Act is out of vogue but it behoves us to have a debate about the definition, since what will be considered as pornography is paramount. If we get that wrong, age verification will be meaningless. Everything else about the protections we want to put in place relies on a common understanding of when age verification will be required. Put simply, we need to know what it is we are subjecting to age verification and it needs to be clear. The Minister stated at Second Reading that he believed the current definition is adequate. He suggested that it ensured alignment across different pieces of legislation and other regulatory frameworks. In reviewing other legislation, the only clear thing is this: there is no standard definition of pornography across the legislative framework.

For example, Section 63 of the Criminal Justice and Immigration Act 2008 uses the definition in the Bill, but it requires a further test to be applied: meeting the definition of “extreme” material. Section 368E of the Communications Act 2003 regulates online video on demand services. That definition uses the objective tests of “prohibited material”, meaning material too extreme to be classified by the British Board of Film Classification, and “specially restricted material”, covering R18 material, while also using a subjective test that covers material that

“might impair the physical, mental or moral development”

of under-18s.

Online Safety Bill

Amendment 237 in my name introduces a delegated power to update and amend these lists. This is essential for ensuring that the legislation remains flexible to change and that new and emerging risks of harm can be captured swiftly. Amendment 238, also in my name, ensures that the draft affirmative procedure will apply except in cases where there is an urgent need to update the lists, when the affirmative procedure can be used. This ensures that Parliament will retain the appropriate degree of oversight over any changes. I beg to move.
Baroness Ritchie of Downpatrick (Lab)

My Lords, we spent a lot of time in Committee raising concerns about how pornography and age verification were going to operate across all parts of the Bill. I have heard what the Minister has said in relation to this group, priority harms to children, which I believe is one of the most important groups under discussion in the Bill. I agree that children must be protected from the most harmful content online and offline.

I am grateful to the Government for having listened carefully to the arguments put forward by the House in this regard and commend the Minister for all the work he and his team have done since then. I also commend the noble Lord, Lord Bethell. He and I have been in some discussion between Committee and now in relation to these amendments.

In Committee, I argued for several changes to the Bill which span three groups of amendments. One of my concerns was that pornography should be named as a harm in the Bill. I welcome the Government’s Amendment 171, which names pornography as a primary priority content. I also support Amendment 174 in the name of the noble Baroness, Lady Kidron. She is absolutely right that sexualised content can be harmful to children if not age appropriate and, in that regard, before she even speaks, I ask the Minister to reconsider his views on this amendment and to accept it.

Within this group are the amendments which move the definition of “pornographic content” from Part 5 to Clause 211. In that context, I welcome the Government’s announcement on Monday about a review of the regulation, legislation and enforcement of pornography offences.

In Committee, your Lordships were very clear that there needed to be a consistent approach across the Bill to the regulation of pornography. I am in agreement with the amendments tabled in Committee to ensure that consistency applies across all media. In this regard, I thank the noble Baroness, Lady Benjamin, for her persistence in raising this issue. I also thank my colleagues on the Opposition Front Bench, the noble Lord, Lord Stevenson, and the noble Baroness, Lady Merron.

I appreciate that the Government made this announcement only three days ago, but I hope the Minister will set out a timetable for publishing the terms of reference and details of how this review will take place. The review is too important to disappear into the long grass over the Summer Recess, never to be heard of again, so if he is unable to answer my question today, will he commit to writing to your Lordships with the timeframe before the House rises for the summer? Will he consider the active involvement of external groups in this review, as much expertise lies outside government in this area? In that regard, I commend CARE, CEASE and Barnardo’s for all their input into the debates on the Bill.

Lord Parkinson of Whitley Bay (Con)

My Lords, I think the noble Baroness’s comments relate to the next group of amendments, on pornography. She might have skipped ahead, but I am grateful for the additional thinking time to respond to her questions. Perhaps she will save the rest of her remarks for that group.

Baroness Ritchie of Downpatrick (Lab)

I thank the Minister for that. In conclusion, I hope he will reflect on those issues and come back, maybe at the end of the next group. I remind the House that in February the APPG on Commercial Sexual Exploitation, in its inquiry on pornography, recommended that the regulation of pornography should be consistent across all online platforms and between the online and offline spheres. I hope we can incorporate the voices I have already mentioned in the NGO sphere in order to assist the Government and both Houses in ensuring that we regulate the online platforms and that children are protected from any harms that may arise.

Baroness Kidron (CB)

My Lords, I shall speak briefly to Amendment 174 in my name and then more broadly to this group—I note that the Minister got his defence in early.

On the question of misinformation and disinformation, I recognise what he said and I suppose that, in my delight at hearing the words “misinformation and disinformation”, I misunderstood to some degree what he was offering at the Dispatch Box, but I make the point that this poses an enormous risk to children. As an example, children are the fastest-growing group of far-right believers/activists online, and there are many areas in which we are going to see an exponential growth in misinformation and disinformation as large language models become the norm. So I ask him, in a tentative manner, to look at that.

On the other issue, I have to push back at the Minister’s explanation. Content classification around sexual content is a well-established norm. The BBFC does it and has done it for a very long time. There is an absolute understanding that what is suitable for a U, a PG, a 12 or a 12A are different things, and that as children’s capacities evolve, as they get older, there are things that are more suitable for older children, including, indeed, stronger portrayals of sexual behaviour as the age category rises. So I cannot accept that this opens a new can of worms: this is something that we have been doing for many, many years.

I think it is a bit wrongheaded to imagine that if we “solve” the porn problem, we have solved the problem—because there is still sexualisation and the commercialisation of sex. Now, if you say something about feet to a child, they start to giggle uproariously because, in internet language, you get paid for taking pictures of feet and giving them to strange people. There are such detailed and different areas that companies should be looking at. This amendment in my name and the names of the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford, should be taken very seriously. It is not new ground, so I would ask the Minister to reconsider it.

More broadly, the Minister will have noticed that I liberally added my name to the amendments he has brought forward to meet some of the issues we raised in Committee, and I have not added my name to the schedule of harms. I want to be nuanced about this and say I am grateful to the Government for putting them in the Bill, I am grateful that the content harms have been discussed in this Chamber and not left for secondary legislation, and I am grateful for all the conversations around this. However, harm cannot be defined only as content, and the last grouping got to the core of the issue in the House. Even when the Minister was setting out this amendment, he acknowledged that the increase in harm to users may be systemic and by design. In his explanation, he used the word “harm”; in the Bill, it always manifests as “harmful content”.

While the systemic risk of increasing the presence of harmful content is consistently within the Bill, which is excellent, the concept that the design of service may in and of itself be harmful is absent. In failing to do that, the Government, and therefore the Bill, have missed the bull’s-eye. The bull’s-eye is what is particular about this method of communication that creates harm—and what is particular are the features, functionalities and design. I draw noble Lords back to the debate about Wikipedia. It is not that we all love Wikipedia adoringly; it is that it does not pursue a system of design for commercial purposes that entraps people within its grasp. Those are the harms we are trying to get at. I am grateful for the conversations I have had, and I look forward to some more. I have laid down some other amendments for Monday and beyond that would, I hope, deal with this—but until that time, I am afraid this is an incomplete picture.

Online Safety Bill

With a final word of optimism, I ask my noble friend the Minister what work will be done to bring alignment with other jurisdictions and to promote Britain as a well-regulated destination for investment, much as we do for life sciences.
Baroness Ritchie of Downpatrick (Lab)

My Lords, I reiterate what the noble Lord, Lord Bethell, has said and thank him for our various discussions between Committee and Report, particularly on this set of amendments to do with age verification. I also welcome the Government’s responsiveness to the concerns raised in Committee. I welcome these amendments, which are a step forward.

In Committee, I was arguing that there should be a level playing field for regulating any online platform with pornographic content, whether it falls under Part 3 or Part 5 of the Bill. I welcome the Government’s significant changes to Clauses 11 and 72 to ensure that robust age verification or estimation must be used and that standards are consistent across the Bill.

I have a few minor concerns that I wish to highlight. I am thoughtful about whether enough is required of search services in preventing young people from accessing pornography in Clause 25. I recognise that the Government believe they have satisfied the need. I fear that they may have done enough only in the short term; there is a real concern that this clause is not sufficiently future-proofed. Of course, only time will tell. Maybe the Minister could advise us further in that particular regard.

In Committee, I also argued that the duties in respect of pornography in Parts 3 and 5 must come into effect at the same time. I welcome the government commitment to placing a timeframe for the codes of practice and guidance on the face of the Bill through amendments including Amendment 230. I hope that the Minister will reassure us today that it is the Government’s intention that the duties in Clauses 11 and 72 will come into effect at the same time. Subsection (3) of the new clause proposed in Amendment 271 specifically states that the duties could come into effect at different times, which leaves a loophole for pornography to be regulated differently, even if only for a short time, between Part 3 and Part 5 services. This would be extremely regrettable.

I would also like to reiterate what I said last Thursday, in case the Minister missed my contribution when he intervened on me. I say once again that I commend the Minister for the announcement of the review of the regulation, legislation and enforcement of pornography offences, which I think was this time last week. I once again ask the Minister: will he set out a timetable for publishing the terms of reference and details of how this review will take place? If he cannot set out that timetable today, will he write to your Lordships setting out the timetable before the Recess, and ensure a copy is placed in the Library?

Finally, all of us across the House have benefited from the expertise of expert organisations as we have considered this Bill. I repeat my request to the Minister that he consider establishing an external reference group to support the review, consisting of those NGOs with particular and dedicated expertise. Such groups would have much to add to the process—they have much learning and advice, and there is much assistance there to the Government in that regard.

Once again, I thank the Minister for listening and responding. I look forward to seeing the protections for children set out in these amendments implemented. I shall watch implementation very closely, and I trust and hope that the regulator will take robust action once the codes of practice and guidance are published. Children above all will benefit from a safer internet.

Baroness Kidron (CB)

My Lords, I welcome the government amendments in this group, which set out the important role that age assurance will play in the online safety regime. I particularly welcome Amendment 210, which states that companies must employ systems that are “highly effective” at correctly determining whether a particular user is a child to prevent access to pornography, and Amendment 124, which sets out in a code of practice principles which must be followed when implementing age assurance—principles that ensure alignment of standards and protections with the ICO’s age appropriate design code and include, among other things, that age assurance systems should be easy to use, proportionate to the risk and easy to understand, including to those with protected characteristics, as well as aiming to be interoperable. The code is a first step from current practice, in which age verification is opaque, used to further profile children and related adults and highly ineffective, to a world in which children are offered age-appropriate services by design and default.

I pay tribute again to the noble Lord, Lord Bethell, and the noble Baroness, Lady Benjamin, and I associate myself with the broad set of thanks that the noble Lord, Lord Bethell, gave in his opening speech. I also thank colleagues across your Lordships’ House and the other place for supporting this cause with such clarity of purpose. On this matter, I believe that the UK is world-beating, and it will be a testament to all those involved to see the UK’s age verification and estimation laws built on a foundation of transparency and trust so that those impacted feel confident in using them—and we ensure their role in delivering the online world that children and young people deserve.

I have a number of specific questions about government Amendment 38 and Amendment 39. I would be grateful if the Minister were able to answer them from the Dispatch Box and in doing so give a clear sign of the Government’s intent. I will also speak briefly to Amendments 125 and 217 in my name and those of the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford, as well as Amendment 184 in the names of the noble Baroness, Lady Fox, and the noble Lord, Lord Moylan. All three amendments address privacy.

Government Amendment 38, to which I have added my name, offers exemptions in new subsections (3A) and (3B) that mean that a regulated company need not use age verification or estimation to prevent access to primary priority content if they already prevent it by means of its terms of service. First, I ask the Minister to confirm that these exemptions apply only if a service effectively upholds its terms of service on a routine basis, and that failure to do so would trigger enforcement action and/or an instruction from Ofcom to apply age assurance.

--- Later in debate ---
Baroness Ritchie of Downpatrick (Lab)

My Lords, the final issue I raised in Committee is dealt with in this group on so-called proportionality. I tabled amendments in Committee to ensure that under Part 3 no website or social media service with pornographic content could argue that it should be exempt from implementing age verification under Clause 11 because to do so would be disproportionate based on its size and capacity. I am pleased today to be a co-signatory to Amendment 39 tabled by the noble Lord, Lord Bethell, to do just that.

The noble Lord, Lord Russell, and the noble Baroness, Lady Kidron, have also tabled amendments which raise similar points. I am disappointed that despite all the amendments tabled by the Minister, the issue of proportionality has not been addressed; maybe he will give us some good news on that this evening. It feels like the job is not quite finished and leaves an unnecessary and unhelpful loophole.

I will not repeat all the arguments I made in Committee in depth but will briefly recap that we all know that in the offline world, we expect consistent regulation regardless of size when it comes to protecting children. We do not allow a small corner shop to act differently from a large supermarket on the sale of alcohol or cigarettes. In a similar online scenario, we do not expect small or large gambling websites to regulate children’s access to gambling in a different way.

We know that the impact of pornographic content on children is the same whether it is accessed on a large pornographic website or a small social media platform. We know from the experience of France and Germany that pornographic websites will do all they can to evade age verification. As the noble Lord, Lord Stevenson, said on the eighth day of Committee, whether pornography

“comes through a Part 3 or Part 5 service, or accidently through a blog or some other piece of information, it has to be stopped. We do not want our children to receive it. That must be at the heart of what we are about, and not just something we think about as we go along”.—[Official Report, 23/5/23; col. 821.]

By not shutting off the proportionality argument, the Government are allowing different-sized online services to act differently on pornography and all the other primary priority content, as I raised in Committee. At that stage, the noble Baroness, Lady Kidron, said,

“we do not need to take a proportionate approach to pornography”.—[Official Report, 2/5/23; col. 1481.]

Amendment 39 would ensure that pornographic content is treated as a separate case with no loopholes for implementing age verification based on size and capacity. I urge the Minister to reflect on how best we can close this potential loophole, and I look forward to his concluding remarks.

Lord Russell of Liverpool (CB)

My Lords, I will briefly address Amendments 43 and 87 in my name. I thank the noble Baronesses, Lady Harding and Lady Kidron, and the noble Lord, Lord Knight, for adding their names to these amendments. They are complementary to the others in this group, on which the noble Lord, Lord Bethell, and the noble Baroness, Lady Ritchie, have spoken.

In Committee the Minister argued that it would be unfair to place the same child safety duties across all platforms. He said:

“This provision recognises that what it is proportionate to require of providers at either end of that scale will be different”.—[Official Report, 2/5/23; col. 1443.]


Think back to the previous group of amendments we debated. We talked about functionality and the way in which algorithms drive these systems. They drive you in all directions—to a large platform with every bell and whistle you might anticipate because it complies with the legislation, but also, willy-nilly, without any conscious thought because that is how it is designed, to a much smaller site. If we do not amend the legislation as it stands, they will take you to smaller sites that do not require the same level of safety duties, particularly towards children. I think we all fail to understand the logic behind that argument.

Online Safety Bill

Baroness Stowell of Beeston (Con)

My Lords, I am very concerned to hear the contribution from the noble Lord, Lord Rooker. I certainly look forward to hearing what the Minister says in reply. I confess that I was not aware of the Delegated Powers and Regulatory Reform Committee’s report to which he referred, and I wish to make myself familiar with it. I hope that he gets a suitable response from the Minister when he comes to wind up.

I am very grateful to the Minister for the amendments he tabled to Clause 44—Amendments 1 and 2. As he said, they ensure that there is transparency in the way that the Secretary of State exercises her power to issue a direction to Ofcom over its codes of practice. I remind the House—I will not detain your Lordships for very long—that the Communications and Digital Select Committee, which I have the privilege to chair, was concerned with the original Clause 39 for three main reasons: first, as it stood, the Bill handed the Secretary of State unprecedented powers to direct the regulator on pretty much anything; secondly, those directions could be made without Parliament knowing; and, thirdly, the process of direction could involve a form of ping-pong between government and regulator that could go on indefinitely.

However, over the course of the Bill’s passage, and as a result of our debates, I am pleased to say that, taken as a package, the various amendments tabled by the Government—not just today but at earlier stages, including on Report—mean that our concerns have been met. The areas where the Secretary of State can issue a direction now follow the precedent set by the Communications Act 2003, and the test for issuing them is much higher. As of today, via these amendments, the directions must be published and laid before Parliament. That is critical and is what we asked for on Report. Also, via these amendments, if the Secretary of State has good reason not to publish—namely, if it could present a risk to national security—she will still be required to inform Parliament that the direction has been made and of the reasons for not publishing. Once the code is finalised and laid before Parliament for approval, Ofcom must publish what has changed as a result of the directions. I would have liked to see a further amendment limiting the number of exchanges, so that there is no danger of infinite ping-pong between government and regulator, but I am satisfied that, taken together, these amendments make that much less likely, and the transparency we have achieved means that Parliament can intervene.

Finally, at the moment, the platforms and social media companies have a huge amount of unaccountable power. As I have said many times, for me, the Bill is about ensuring greater accountability to the public, but that cannot be achieved by simply shifting power from the platforms to a regulator. Proper accountability to the public means ensuring a proper balance of power between the corporations, the regulator, government and Parliament. The changes we have made to the Bill ensure the balance is now much better between government and the regulator. Where I still think we have work to do is on parliamentary oversight of the regulator, in which so much power is being invested. Parliamentary oversight is not a matter for legislation, but it is something we will need to return to. In the meantime, I once again thank the Minister and his officials for their engagement and for the amendments that have been made.

Baroness Ritchie of Downpatrick (Lab)

My Lords, I, too, thank the Minister for his engagement and for the amendments he has tabled at various stages throughout the passage of the Bill.

Amendment 15 provides a definition:

““age assurance” means age verification or age estimation”.

When the Minister winds up, could he provide details of the framework or timetable for its implementation? While we all respect that implementation must be delivered quickly, age verification provisions will be worthless unless there is swift enforcement action against those who transgress the Bill’s provisions. Will the Minister comment on enforcement and an implementation framework with direct reference to Amendment 15?

Lord Allan of Hallam (LD)

My Lords, as this is a new stage of the Bill, I need to refer again to my entry in the register of interests. I have no current financial interest in any of the regulated companies for which I used to work, in one of which I held a senior role for a decade.

I welcome Amendment 7 and those following from it which change the remote access provision. The change from “remote access” to “view remotely” is quite significant. I appreciate the Minister’s willingness to consider it and particularly the Bill team’s creativity in coming up with this new phrasing. It is much simpler and clearer than the phrasing we had before. We all understand what “view remotely” means. “Access” could have been argued over endlessly. I congratulate the Minister and the team for simplifying the Bill. It again demonstrates the value of some of the scrutiny we carried out on Report.

It is certainly rational to enable some form of viewing in some circumstances, not least where the operations of the regulated entities are outside the United Kingdom and where Ofcom has a legitimate interest in observing tests that are being carried out. The remote access, or the remote viewing facility as it now is, will mean it can do this without necessarily sending teams overseas. This is more efficient, as the Minister said. As this entire regime is going to be paid for by the regulated entities, they have an interest in finding cheaper and more efficient methods of carrying out the supervision than teams going from London to potentially lots of overseas destinations. Agreement between the provider and Ofcom that this form of remote viewing is the most efficient will be welcomed by everybody. It is certainly better than the other option of taking data off-site. I am glad to see that, through the provisions we have in place, we will minimise the instances where Ofcom feels it needs data from providers to be taken off-site to some other facility, which is where a lot of the privacy risks come from.

Can the Minister give some additional assurances at some stage either in his closing remarks or through any follow-up correspondence? First, the notion of proportionality is implicit, but it would help for it to be made explicit. Whenever Ofcom is using the information notices, it should always use the least intrusive method. Yes, it may need to view some tests remotely, but only where the information could not have been provided in written form, for example, or sent as a document. We should not immediately escalate to remote viewing if we have not tried less intrusive methods. I hope that notion of proportionality and least intrusion is implicit within it.

Secondly, concerns remain around live user data. I heard the Minister say that the intention is to use test data sets. That needs to be really clear. It is natural for people to be concerned that their live user data might be exposed to anyone, be it a regulator or otherwise. Of course, we expect Ofcom staff to behave with propriety, but there have sadly been instances where individuals who observed data in the course of their duties, whether working for the police, the NHS or any other entity, went on to abuse it. The strongest safeguard is for there to be no access to live user data at all. I hope the Minister will go as far as he can in saying that that is not the intention.