Baroness Fox of Buckley (Non-Afl)

My Lords, this group of amendments looks at the treatment of legal content accessed by adults. The very fact that Parliament feels that legislation has a place in policing access to legal material is itself worrying. This door was opened by the Government in the initial draft Bill, but, as we have already heard, after a widespread civil liberties backlash against the legal but harmful clauses, we are left with Clause 65. As has been mentioned, I am worried that this clause, and some of the amendments, might well bring back legal but harmful for adults by the back door. One of the weasel words here is “harmful”. As I have indicated before, it is difficult to work out from the groupings when to raise which point, so I will keep that for your Lordships until later and simply note now that the word makes me rather nervous.

Like many of us, I cheered at the removal of the legal but harmful provisions, but I have serious reservations about their replacement with further duties via terms of service, which impose a duty on category 1 services to have systems and processes in place to take down or restrict access to content, and to ban or suspend users, in accordance with their terms of service, as the noble Lord, Lord Moylan, explained. It is one of the reasons I support his amendment. It seems to me that the state is outsourcing the grubby job of censorship to private multinational companies with little regard for UK law.

I put my name to Amendment 155 in the name of the noble Lord, Lord Moylan, because I wanted to probe the Government’s attitude to companies’ terms of service. Platforms have no obligation to align their terms of service with freedom of expression under UK law. It is up to them. I am not trying to impose on them what they do with their service users. If a particular platform wishes to say, “We don’t want these types of views on our platform”, fine, that is its choice. But when major platforms’ terms of service, which are extensive, become the basis on which UK law enforces speech, I get nervous. State regulators are to be given the role of ensuring that all types of lawful speech are suppressed online, because the duty applies to all terms of service, whatever they are, regarding the platforms’ policies on speech suppression, censorship, user suspension, bans and so on. This duty is not restricted to so-called harmful content; it is whatever content the platform wishes to censor.

What is more, Clause 65 asks Ofcom to ensure that individuals who express lawful speech are suspended or banned from platforms if in breach of the platforms’ Ts & Cs, and that means limiting those individuals from expressing themselves more widely, beyond the specific speech in question. That is a huge green light to interfere in UK citizens’ freedom of expression, in my opinion.

I stress that I am not interested in interfering in the terms and conditions of private companies, although your Lordships will see later that I have an amendment demanding that they introduce free-speech clauses. That is because of the way we seem to be enacting the law via the terms of service of private companies. They should of course be free to dictate their own terms of service, and it is reasonable that members of the public should know what they are and expect them to be upheld. But that does not justify the transformation of these private agreements into statutory duties—that is my concern.

So, why are we allowing this Bill to ask companies to enforce censorship policies in the virtual public square that do not exist in UK law? When companies’ terms of service permit the suppression of speech, that is up to them, but when they suppress speech far beyond the limitations on speech in UK law and are forced to do so by a government regulator such as Ofcom, are we not in trouble? It means that corporate terms of service, which are designed to protect platforms’ business interests, are trumping case law on free speech that has evolved over many years.

Those terms of service are also frequently in flux, according to fashion or ownership; one only has to look at the endless arguments, which I have yet to understand, about Twitter’s changing terms of service after the Elon Musk takeover. Is Ofcom’s job to follow Elon Musk’s ever-changing terms of service and enforce them on the British public as if they are law?

The terms and conditions are therefore no longer simply a contract between a company and the user; their being brought under statute means that big tech will be exercising public law functions, with Ofcom as the enforcer, ensuring that lawful speech is suppressed constantly, in line with private companies’ terms of service. This is an utter mess and not in any way adequate to protect free speech. It is a fudge by the Government: they were unpopular on “lawful but harmful”, so they have outsourced it to someone else to do the dirty work.

Lord Clement-Jones (LD)

My Lords, it has been interesting to hear so many noble Lords singing from the same hymn sheet—especially after this weekend. My noble friend Lord McNally opened this group by giving us his wise perspective on the regulation of new technology. Back in 2003, as he mentioned, the internet was not even mentioned in the Communications Act. He explained how regulation struggles to keep up and how quantum leaps come with a potential social cost; all that describes the importance of risk assessment of these novel technologies.

As we have heard from many noble Lords today, on Report in the Commons the Government decided to remove the adult safety duties—the so-called “legal but harmful” aspect of the Bill. I agree with the many noble Lords who have said that this has significantly weakened the protection for adults under the Bill, and I share the scepticism many expressed about the triple shield.

Right across the board, this group of amendments, with one or two exceptions, rightly aims to strengthen the terms of service and user empowerment duties in the Bill in order to provide a greater baseline of protection for adults, without impinging on others’ freedom of speech, and to reintroduce some risk-assessment requirement on companies. The new duties will clearly make the largest and riskiest companies expend more effort on enforcing their terms of service for UK users. However, the Government have not yet presented any modelling on what effect this will have on companies’ terms of service. I have some sympathy with what the noble Lord, Lord Moylan, said: the new duties could mean that terms of service become much longer and lawyered. This might have an adverse effect on freedom of expression, leading to the use of excessive takedown measures rather than looking at other more systemic interventions to control content such as service design. We heard much the same argument from the noble Baroness, Lady Fox. They both made a very good case for some of the amendments I will be speaking to this afternoon.

On the other hand, companies that choose to do nothing will have an easier life under this regime. Faced with stringent application of the duties, companies might make their terms of service shorter, cutting out harms that are hard to deal with because of the risk of being hit with enforcement measures if they do not. Therefore, far from strengthening protections via this component of the triple shield, the Bill risks weakening them, with particular risks for vulnerable adults. As a result, I strongly support Amendments 33B and 43ZA, which my noble friend Lord McNally spoke to last week at the beginning of the debate on this group.

Like the noble Baroness, Lady Kidron, I strongly support Amendments 154, 218 and 160, tabled by the noble Lord, Lord Stevenson, which would require regulated services to maintain “adequate and appropriate” terms of service, including provisions covering the matters listed in Clause 12. Amendment 44, tabled by the right reverend Prelate the Bishop of Oxford and me, inserts a requirement that services to which the user empowerment duties apply

“must make a suitable and sufficient assessment of the extent to which they have carried out the duties in this section including in each assessment material changes from the previous assessment such as new or removed user empowerment features”.

The noble Viscount, Lord Colville, spoke very well to that amendment, as did the noble Baronesses, Lady Fraser and Lady Kidron.

Amendment 158, also tabled by me and the right reverend Prelate, inserts a requirement that services

“must carry out a suitable and sufficient assessment of the extent to which they have carried out the duties under sections 64 and 65 ensuring that assessment reflects any material changes to terms of service”.

That is a very good way of meeting some of the objections that we have heard to Clause 65 today.

These two amendments focus on risk assessment because the new duties do not have an assessment regime to work out whether they work, unlike the illegal content and children’s duties, as we have heard. Risk assessments are vital to understanding the environment in which the services are operating. A risk assessment can reduce bureaucracy by allowing companies to rule out risks which are not relevant to them, and it can increase user safety by revealing new risks and future-proofing a regime.

The Government have not yet provided, in the Commons or in meetings with Ministers, any proper explanation of why risk assessment duties have been removed along with the previous adult safety duties, and they have not explained in detail why undertaking a risk assessment is in any way a threat to free speech. They are currently expecting adults to manage their own risks, without giving them the information they need to do so. Depriving users of basic information about the nature of harms on a service prevents them taking informed decisions as to whether they want to be on it at all.

Without these amendments, the Bill cannot be said to be a complete risk management regime. There will be no requirement to explain to Ofcom or to users of a company’s service the true nature of the harms that occur on its service, nor the rationale behind the decisions made in these two fundamental parts of the service. This is a real weakness in the Bill, and I very much hope that the Minister will listen to the arguments being made this afternoon.

Baroness Merron (Lab)

My Lords, I thank noble Lords from all sides of the House for their contributions and for shining a light on the point the noble Lord, Lord Clement-Jones, made near the end of his remarks about the need to equip adults with the tools to protect themselves.

It is helpful to have these amendments, because they give the Minister the opportunity to accept—as I hope he will—a number of the points raised. It seems a long time since the noble Lord, Lord McNally, introduced this group, but clearly it has given us all much time to reflect. I am sure we will see the benefits of that in the response from the Minister. Much of the debate on the Bill has focused on child safety and general practicalities, but this group helpfully allows us to focus on adults and the operation of the Government’s replacement for the legal but harmful section of the Bill. As the noble Baroness, Lady Fraser, rightly said, perhaps some tightening up of the legislation before us would be helpful. These amendments give us that chance.

--- Later in debate ---
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

I am very grateful to the noble Lords who have spoken on the amendments in this group, both this afternoon and last Tuesday evening. As this is a continuation of that debate, I think my noble friend Lord Moylan is technically correct still to wish the noble Baroness, Lady Kidron, a happy birthday, at least in procedural terms.

We have had a very valuable debate over both days on the Bill’s approach to holding platforms accountable to their users. Amendments 33B, 41A, 43ZA, 138A and 194A in the names of the noble Lords, Lord Lipsey and Lord McNally, and Amendment 154 in the name of the noble Lord, Lord Stevenson of Balmacara, seek to bring back the concept of legal but harmful content and related adult risk assessments. They reintroduce obligations for companies to consider the risk of harm associated with legal content accessed by adults. As noble Lords have noted, the provisions in the Bill to this effect were removed in another place, after careful consideration, to protect freedom of expression online. In particular, the Government listened to concerns that the previous legal but harmful provisions could create incentives for companies to remove legal content from their services.

In place of adult risk assessments, we introduced new duties on category 1 services to enable users themselves to understand how these platforms treat different types of content, as set out in Clauses 64 and 65. In particular, this will allow Ofcom to hold them to account when they do not follow through on their promises regarding content they say that they prohibit or to which they say that they restrict access. Major platforms already prohibit much of the content listed in Clause 12, but these terms of service are often opaque and not consistently enforced. The Bill will address and change that.

I would also like to respond to concerns raised through Amendments 41A and 43ZA, which seek to ensure that the user empowerment categories cover the most harmful categories of content to adults. I reassure noble Lords that the user empowerment list reflects input from a wide range of interested parties about the areas of greatest concern to users. Platforms already have strong commercial incentives to tackle harmful content. The major technology companies already prohibit most types of harmful and abusive content. It is clear that most users do not want to see that sort of content and most advertisers do not want their products advertised alongside it. Clause 12 sets out that providers must offer user empowerment tools with a specified list of content to the extent that it is proportionate to do so. This will be based on the size or capacity of the service as well as the likelihood that adult users will encounter the listed content. Providers will therefore need internally to assess the likelihood that users will encounter the content. If Ofcom disagrees with the assessment that a provider has made, it will have the ability to request information from providers for the purpose of assessing compliance.

Amendments 44 and 158, tabled by the right reverend Prelate the Bishop of Oxford, seek to place new duties on providers of category 1 services to produce an assessment of their compliance with the transparency, accountability, freedom of expression and user empowerment duties as set out in Clauses 12, 64 and 65 and to share their assessments with Ofcom. I am sympathetic to the aim of ensuring that Ofcom can effectively assess companies’ compliance with these duties. But these amendments would enable providers to mark their own homework when it comes to their compliance with the duties in question. The Bill has been designed to ensure that Ofcom has responsibility for assessing compliance and that it can obtain sufficient information from all regulated services to make judgments about compliance with their duties. The noble Baroness, Lady Kidron, asked about this—and I think the noble Lord, Lord Clement-Jones, is about to.

Lord Clement-Jones (LD)

I hope the Minister will forgive me for interrupting, but would it not be much easier for Ofcom to assess compliance if a risk assessment had been carried out?

Lord Parkinson of Whitley Bay (Con)

I will come on to say a bit more about how Ofcom goes about that work.

The Bill will ensure that providers have the information they need to understand whether they are in compliance with their duties under the Bill. Ofcom will set out how providers can comply in codes of practice and guidance that it publishes. That information will help providers to comply, although they can take alternative action if they wish to do so.

The right reverend Prelate’s amendments also seek to provide greater transparency to Ofcom. The Bill’s existing duties already account for this. Indeed, the transparency reporting duties set out in Schedule 8 already enable Ofcom to require category 1, 2A and 2B services to publish annual transparency reports with relevant information, including about the effectiveness of the user empowerment tools, as well as detailed information about any content that platforms prohibit or restrict, and the application of their terms of service.

Amendments 159, 160 and 218, tabled by the noble Lord, Lord Stevenson, seek to require user-to-user services to create and abide by minimum terms of service recommended by Ofcom. The Bill already sets detailed and binding requirements on companies to achieve certain outcomes. Ofcom will set out more detail in codes of practice about the steps providers can take to comply with their safety duties. Platforms’ terms of service will need to provide information to users about how they are protecting users from illegal content, and children from harmful content.

These duties, and Ofcom’s codes of practice, ensure that providers take action to protect users from illegal content and content that is harmful to children. As such, an additional duty to have adequate and appropriate terms of service, as envisaged in the amendments, is not necessary and may undermine the illegal and child safety duties.

I have previously set out why we do not agree with requiring platforms to set terms of service for legal content. In addition, it would be inappropriate to delegate this much power to Ofcom, which would in effect be able to decide what legal content adult users can and cannot see.

Amendment 155, tabled by my noble friend Lord Moylan, seeks to clarify whether and how the Bill makes the terms of service of foreign-run platforms enforceable by Ofcom. Platforms’ duties under Clause 65 apply only to the design, operation and use of the service in the United Kingdom and to UK users, as set out in Clause 65(11). Parts or versions of the service which are used in foreign jurisdictions—

--- Later in debate ---
Baroness Harding of Winscombe (Con)

My Lords, I had not intended to speak in this debate because I now need to declare an unusual interest, in that Amendment 38A has been widely supported outside this Chamber by my husband, the Member of Parliament for Weston-super-Mare. I am not intending to speak on that amendment but, none the less, I mention it just in case.

I rise to speak because I have been so moved by the speeches, not least the right reverend Prelate’s speech. I would like just to briefly address the “default on” amendments and add my support. Like others, on balance I favour the amendments in the name of the noble Lord, Lord Clement-Jones, but would willingly throw my support behind my noble friend Lady Morgan were that the preferred choice in the Chamber.

I would like to simply add two additional reasons why I ask my noble friend the Minister to really reflect hard on this debate. The first is that children become teenagers, who become young adults, and it is a gradual transition—goodness, do I feel it as the mother of a 16-year-old and a 17-year-old. The idea that on one day all the protections just disappear completely and we require our 18-year-olds to immediately reconfigure their use of all digital tools just does not seem a sensible transition to adulthood to me, whereas the ability to switch off user empowerment tools as you mature as an adult seems a very sensible transition.

Secondly, I respect very much the free speech arguments that the noble Baroness, Lady Fox, made but I do not think this is a debate about the importance of free speech. It is actually about how effective the user empowerment tools are. If they are so hard for non-vulnerable adults to turn off, what hope have vulnerable adults to be able to turn them on? For the triple shield to work and the three-legged stool to be effective, the onus needs to be on the tech companies to make these user empowerment tools really easy to turn on and turn off. Then “default on” is not a restriction on freedom of speech at all; it is simply a means of protecting our most vulnerable.

Lord Clement-Jones (LD)

My Lords, this has been a very thoughtful and thought-provoking debate. I start very much from the point of view expressed by the noble Baroness, Lady Kidron, and this brings the noble Baroness, Lady Buscombe, into agreement—it is not about the content; this is about features. The noble Baroness, Lady Harding, made exactly the same point, as did the noble Baroness, Lady Healy—this is not about restriction on freedom of speech but about a design feature in the Bill which is of crucial importance.

When I was putting together the two amendments that I have tabled, I was very much taken by what Parent Zone said in a recent paper. It described user empowerment tools as “a false hope”, and rightly had a number of concerns about undue reliance on tools. It said:

“There is a real danger of users being overwhelmed and bewildered”.


It goes on to say that

“tools cannot do all the work, because so many other factors are in play—parental styles, media literacy and technological confidence, different levels of vulnerability and, crucially, trust”.

The real question—this is why I thought we should look at it from the other side of things in terms of default—is about how we mandate the use of these user empowerment tools in the Bill for both children and adults. In a sense, my concerns are exactly the opposite of those of the noble Baroness, Lady Fox—for some strange, unaccountable reason.

The noble Baroness, Lady Morgan, the noble Lord, Lord Griffiths, the right reverend Prelate and, notably, my noble friend Lady Parminter have made a brilliant case for their amendment, and it is notable that these amendments are supported by a massive range of organisations. They are all in this area of vulnerable adults: the Mental Health Foundation, Mind, the eating disorder charity Beat, the Royal College of Psychiatrists, the British Psychological Society, Rethink Mental Illness, Mental Health UK, and so on. It is not a coincidence that all these organisations are discussing this “feature”. This is a crucial aspect of the Bill.

Again, I was very much taken by some of the descriptions used by noble Lords during the debate. The right reverend Prelate the Bishop of Oxford said that young people do not suddenly become impervious to content when they reach 18, and he particularly described the pressures as the use of AI only increases. I thought the way the noble Baroness, Lady Harding, described the progression from teenagehood to adulthood was extremely important. There is not some sort of point where somebody suddenly reaches the age of 18 and has full adulthood which enables them to deal with all this content.

Under the Bill as it stands, adult users could still see and be served some of the most dangerous content online. As we have heard, this includes pro-suicide, pro-anorexia and pro-bulimia content. One has only to listen to what my noble friend Lady Parminter had to say to really be affected by the operation, if you like, of social media in those circumstances. This is all about the vulnerable. Of course, we know that anorexia has the highest mortality rate of any mental health problem; the NHS is struggling to provide specialist treatment to those who need it. Meanwhile, suicide and self-harm-related content remains common and is repeatedly implicated in deaths. All Members here who were members of the Joint Committee remember the evidence of Ian Russell about his daughter Molly. I think that affected us all hugely.

We believe now you can pay your money and take your choice of whichever amendment seems appropriate. Changing the user empowerment provisions to require category 1 providers to have either the safest options as default for users or the terms of my two amendments is surely a straightforward way of protecting the vast majority of internet users who do not want this material served to them.

You could argue that the new offence of encouragement to serious self-harm, which the Government have committed to introducing, might form part of the solution here, but you cannot criminalise all the legal content that treads the line between glorification and outright encouragement. Of course, we know the way the Bill has been changed. No similar power is proposed, for instance, to address eating disorder content.

The noble Baroness, Lady Healy, quoted our own Communications and Digital Committee and its recommendations about a comprehensive toolkit of settings overseen by Ofcom, allowing users to decide what types of content they see and from whom. I am very supportive of Amendment 38A from the noble Lord, Lord Knight, which gives a greater degree of granularity, in a sense, about the kinds of user who can communicate with other users.

Modesty means that of course I prefer my own amendments and I agree with the noble Baronesses, Lady Fraser, Lady Bull and Lady Harding, and I am very grateful for their support. But we are all heading in the same direction. We are all arguing for a broader “by default” approach. The onus should not be on these vulnerable adults in particular to switch them on, as the noble Baroness, Lady Bull, said. It is all about those vulnerable adults and we must, as my noble friend Lady Burt said, have their best interests at heart, and that is why we have tabled these amendments.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

That is right. Platforms are not in the public sector, so the public sector equality duty does not apply to them. However, that duty applies to Ofcom, taking into account the ways in which people with certain characteristics can be affected through the codes of practice and the user empowerment duties that it is enforcing. So it suffuses the thinking there, but the duty is on Ofcom as a public sector body.

We talk later in Clause 12(11) of some of the characteristics that are similar in approach to the protected characteristics in the Equality Act 2010. I will come to that again shortly in response to points made by noble Lords.

I want to say a bit about the idea of there being a cliff edge at the age of 18. This was raised by a number of noble Lords, including the noble Lord, Lord Griffiths, my noble friends Lady Morgan and Lady Harding and the noble Baroness, Lady Kidron. The Bill’s protections recognise that, in law, people become adults when they turn 18—but it is not right to say that there are no protections for young adults. As noble Lords know, the Bill will provide a triple shield of protection, of which the user empowerment duties are the final element.

The Bill already protects young adults from illegal content and content that is prohibited in terms and conditions. As we discussed in the last group, platforms have strong commercial incentives to prohibit content that the majority of their users do not want to see. Our terms of service duties will make sure that they are transparent about and accountable for how they treat this type of content.

Lord Clement-Jones (LD)

My Lords, what distinguishes young adults from older adults in what the Minister is saying?

Lord Parkinson of Whitley Bay (Con)

In law, there is nothing. I am engaging with the point that there is no cliff edge. There are protections for people once they turn 18. People’s tastes and risk appetites may change over time, but there are protections in the Bill for people of all ages.

Lord Clement-Jones (LD)

Surely, this is precisely the point that the noble Baroness, Lady Kidron, was making. As soon as you reach 18, there is no graduation at all. There is no accounting for vulnerable adults.

Lord Parkinson of Whitley Bay (Con)

There is not this cliff edge which noble Lords have feared—that there are protections for children and then, at 18, a free-for-all. There are protections for adult users—young adults, older adults, adults of any age—through the means which I have just set out: namely, the triple shield and the illegal content provisions. I may have confused the noble Lord in my attempt to address the point. The protections are there.

Lord Clement-Jones (LD)

There is an element of circularity to what the Minister is saying. This is precisely why we are arguing for the default option. It allows this vulnerability to be taken account of.

Lord Knight of Weymouth (Lab)

Perhaps it would help if the Minister wanted to just set out the difference for us. Clearly, this Committee has spent some time debating the protection for children, which has a higher bar than protection for adults. It is not possible to argue that there will be no difference at the age of 18, however effective the first two elements of the triple shield are. Maybe the Minister needs to think about coming at it from the point of view of a child becoming an adult, and talk us through what the difference will be.

Lord Knight of Weymouth (Lab)

I understand all of that—I think—but that is not the regime being applied to children. It is really clear that children have a safer, better experience. The difference between those experiences suddenly happening on an 18th birthday is what we are concerned about.

Lord Clement-Jones (LD)

Before the Minister stands up—a new phrase—can he confirm that it is perfectly valid to have a choice to lift the user empowerment tool, just as it is to impose it? Choice would still be there if our amendments were accepted.

Lord Parkinson of Whitley Bay (Con)

It would be, but we fear the chilling effect of having the choice imposed on people. As the noble Baroness, Lady Fox, rightly put it, one does not know what one has not encountered until one has engaged with the idea. At the age of 18, people are given the choice to decide what they encounter online. They are given the tools to ensure that they do not encounter it if they do not wish to do so. As the noble Lord has heard me say many times, the strongest protections in the Bill are for children. We have been very clear that the Bill has extra protections for people under the age of 18, and it preserves choice and freedom of expression online for adult users—young and old adults.

My noble friend Lady Buscombe asked about the list in Clause 12(11). We will keep it under constant review and may consider updating it should compelling evidence emerge. As the list covers content that is legal and designed for adults, it is right that it should be updated by primary legislation after a period of parliamentary scrutiny.

Amendments 42 and 38A, tabled by the noble Lords, Lord Stevenson of Balmacara and Lord Knight of Weymouth, respectively, seek to change the scope of user empowerment content features. Amendment 38A seeks to expand the user empowerment content features to include the restriction of content the provenance of which cannot be authenticated. Amendment 42 would apply features to content that is abusive on the basis of characteristics protected under the Equality Act 2010.

The user empowerment content list reflects areas where there is the greatest need for users to be offered choice about reducing their exposure to types of content. While I am sympathetic to the intention behind the amendments, I fear they risk unintended consequences for users’ rights online. The Government’s approach recognises the importance of having clear, enforceable and technically feasible duties that do not infringe users’ rights to free expression. These amendments risk undermining this. For instance, Amendment 38A would require the authentication of the provenance of every piece of content present on a service. This could have severe implications for freedom of expression, given its all-encompassing scope. Companies may choose not to have anything at all.