Online Safety Bill Debate
Baroness Fraser of Craigmaddie (Conservative - Life peer)
(1 year, 10 months ago)
Lords Chamber
An offence in this Bill is an offence under the law of any part of the UK. There is a complex interplay between online safety, which is reserved, and devolved matters such as child and adult protection, education, justice and policing. I realise that the legislative differences between Scotland and England are quite topical. For example, the offence protecting people with epilepsy does not cover Scotland, as Scottish law already covers this behaviour; the same is true of the new cyberflashing offence.
However, the Bill does give Scottish Ministers the powers to amend regulations relating to priority offences in Part 2 of Schedule 6. I think government amendments in the other place mean that Scotland’s hate crime Act will not affect what people can and cannot say online in the rest of the UK, since it was passed by a devolved authority without the Government’s consent. But I believe a loophole remains whereby a future Government could simply approve that or any other law that has been passed in Holyrood, so Nicola Sturgeon could still become the content moderator for the whole of the UK. How should online providers therefore respond where there are differences in legislation across the four nations?
Access to data is clearly essential to ensure that the dynamic landscape of online harms is understood in the Scottish context. I am thinking of issues for rural and remote communities, how online platforms respond to sectarian content, or understanding the online experiences of people with drug or gambling addictions. Are there any differences across the UK? In terms of the transparency reports required by the Bill, will Ofcom be able to see that data in a nation-specific way?
Scotland has a thriving gaming industry, but it is unclear whether there is industry awareness of, or involvement in, this Bill and its implications for gaming platforms. I declare an interest as a board member of Creative Scotland. Will the Minister elaborate on what consultation there has been with gaming companies across the UK, including in Scotland?
The Bill rightly recognises that children are a vulnerable group, but has thought been given to the definition of a child throughout the United Kingdom, because in Scotland it varies. The 2014 Act includes all children up to the age of 18, but there are instances where someone aged 16 may legally be treated as an adult, and other circumstances where disabled or care-experienced children can be included in children’s services until their 26th birthday. As other noble Lords have mentioned, people with physical disabilities, learning disabilities or mental health issues, people in care, people with addictions and many more of all ages could be classed as being vulnerable online. What does the data tell us when we look at online harms purely from an age perspective?
I note that there is an obligation to consult disabled people on decision-making, but should not all those within the CRPD definition of disabled be within the scope of the consultation requirements of the Bill? I would like to see the consultation duties under Clauses 36 and 69 strengthened. I also support calls from other noble Lords for requirements to be placed on providers to risk-assess their customer base, and to provide basic safety settings set to “on” by default.
However, I do welcome the Bill. It is, as others have said, a landmark piece of legislation. We will be far better off with it on the statute book than we are now, but I hope we can get some of the details right as it makes its way through your Lordships’ House.
Online Safety Bill Debate
Baroness Fraser of Craigmaddie (Conservative - Life peer)
(1 year, 7 months ago)
Lords Chamber
My Lords, I support the amendments in this group that, with regard to safety by design, will address functionality and harms—whatever exactly we mean by that—as well as child safety duties and codes of practice. The noble Lord, Lord Russell, and the noble Baronesses, Lady Harding and Lady Kidron, have laid things out very clearly, and I wish the noble Baroness, Lady Kidron, a happy birthday.
I also support Amendment 261 in the name of my right reverend friend the Bishop of Oxford and supported by the noble Lord, Lord Clement-Jones, and the noble Viscount, Lord Colville. This amendment would allow the Secretary of State to consider safety by design, and not just content, when reviewing the regime.
As we have heard, a number of the amendments would amend the safety duties to children to consider all harms, not just harmful content, and we have begun to have a very interesting debate on that. We know that service features create and amplify harms to children. These harms are not limited to spreading harmful content; features in and of themselves may cause harm—for example, beautifying filters, which can create unrealistic body ideals and pressure on children to look a certain way. In all of this, I want us to listen much more to the voices of children and young people—they understand this issue.
Last week, as part of my ongoing campaign on body image, including how social media can promote body image anxiety, I met a group of young people from two Gloucestershire secondary schools. They were very good at saying what the positives are, but noble Lords will also be very familiar with many of the negative issues that were on their minds, which I will not repeat here. While they were very much alive to harmful content and the messages it gives them, they were keen to talk about the need to address algorithms and filters that they say feed them strong messages and skew the content they see, which might not look harmful but, because of design, accentuates their exposure to issues and themes about which they are already anxious. Suffice to say that underpinning most of what they said to me was a sense of powerlessness and anxiety when navigating the online world that is part of their daily lives.
The current definition of content does not include design features. Building in a safety by design principle from the outset would reduce harms in a systematic way, and the amendments in this group would address that need.
My Lords, I support this group of amendments. Last week, I was lucky—that is not necessarily the right word—to participate in a briefing organised by the noble Lord, Lord Russell of Liverpool, with the 5Rights Foundation on its recent research, which the noble Lord referred to. As the mother of a 13 year-old boy, I came away wondering why on earth you would not want to ensure safety by design for children.
I know from my work with disabled children, and Ofcom knows from its own research, that children—or indeed anyone with a long-term health impact or a disability—are far more likely to encounter and suffer harm online. As I say, I struggle to see why you would not want to have safety by design.
This issue must be seen in the round. In that briefing we were taken through how quickly you could get from searching for something such as “slime” to extremely graphic pornographic content. As your Lordships can imagine, I went straight back to my 13 year-old son and said, “Do you know about slime, and where have you seen it?” He said, “Yes, Mum, I’ve watched it on YouTube”. That echoes the point made by the noble Baroness, Lady Kidron—to whom I add my birthday wishes—that these issues have to be seen in the round, because you do not just consume content; you can search on YouTube, shop on Google, search on Amazon and all the rest of it. I support this group of amendments.
Online Safety Bill Debate
Baroness Fraser of Craigmaddie (Conservative - Life peer)
(1 year, 6 months ago)
Lords Chamber
My Lords, I thank the noble Lord, Lord Moylan, for his words—I thought I was experiencing time travel there—and am sympathetic to many of the issues that he has raised, although I think that some of the other amendments in the group tackle those issues in a slightly different way.
I support Amendments 44 and 158 in the name of the right reverend Prelate the Bishop of Oxford. Requiring a post-rollout assessment to ensure that the triple shield acts as we are told it will seems to be a classic part of any regulatory regime that is fit for purpose: it needs to assess whether the system is indeed working. The triple shield is an entirely new concept, and none of the burgeoning regulatory systems around the world is taking this approach, so I hope that both the Government and Ofcom welcome this very targeted and important addition to the Bill.
I will also say a few words about Amendments 154 and 218. It seems to me that, in moving away from legal but harmful—which, as a member of the pre-legislative committee, I supported, under certain conditions that have not been met, but none the less I did support it—not enough time and thought have been given to the implications of that. I do not understand, and would be grateful to the Minister if he could help me understand, how Ofcom is to determine whether a company has met its own terms and conditions—by any means, not only by means of a risk assessment.
I want to make a point that the noble Baroness, Lady Healy, made the other day—but I want to make it again. Taking legal but harmful out and having no assessment of whether a company has met its general safety duties leaves the child safety duties as an island. They used to be something that was added on to a general system of safety; now they are the first and only port of call. Again, because of the way that legal but harmful fell out of the Bill, I am not sure whether we have totally understood how the child risk assessments sit without a generally cleaned up or risk-assessed digital environment.
Finally, I will speak in support of Amendment 160, which would have Ofcom say what “adequate and appropriate” terms are. To a large degree, that is my approach to the problem that the noble Lord, Lord Moylan, spoke about: let Parliament and the regulator determine what we want to see—as was said on the data protection system, that is how it is—and let us have minimum standards that we can rightly expect, based on UK law, as the noble Lord suggested.
I am not against the triple shield per se, but it radically replaced an entire regime of assessment, enforcement and review. I think that some of the provisions in this group really beg the Government’s attention, in order to make sure that there are no gaping holes in the regime.
My Lords, I will speak to Amendments 44 and 158 in the name of the right reverend Prelate the Bishop of Oxford. I also note my support for the amendments in the name of the noble Lord, Lord Stevenson of Balmacara, to ensure the minimum standard for a platform’s terms of service. My noble friend Lord Moylan has just given an excellent speech on the reasons why these amendments should be considered.
I am aware that the next group of amendments relates to the so-called user empowerment tools, so it seems slightly bizarre to be speaking to Amendment 44, which seeks to ensure that these user empowerment tools actually work as the Government hope they will, and Amendment 158, which seeks to risk assess whether providers’ terms of service duties do what they say and report this to Ofcom. Now that the Government have watered down the clauses that deal with protection for adults, I am, like other noble Lords, not necessarily against the Government’s replacement—the triple shield—but I believe that it needs a little tightening up to ensure that it works properly. These amendments seem a reasonable way of doing just that. They would ensure greater protection for adults without impinging on others’ freedom of expression.
The triple shield relies heavily on companies’ enforcement of their terms of service and on other vaguely worded duties—such as the requirement, which the noble Viscount mentioned, that user empowerment tools be “easily accessible” and “effective”, whatever that means. Unlike with other duties in the Bill, such as those on illegal content and children’s duties, there is no mechanism to assess whether these new measures are working; whether the way companies are carrying out these duties is in accordance with the criteria set out; and whether they are indeed infringing freedom of expression. Risk assessments are vital to doing just that, because they are vital to understanding the environment in which services operate. They can reduce bureaucracy by allowing companies to rule out risks which are not relevant to them, and they can increase user safety by revealing new risks, thereby enabling the future-proofing of the regime. Can the Minister give us an answer today as to why risk assessment duties on these two strands of the triple shield—terms of service and user empowerment tools—were removed? If freedom of speech played a part in this, perhaps he could elaborate on why he thinks undertaking a risk assessment is in any way a threat.
Without these amendments, the Bill cannot be said to be a complete risk management regime. Companies will, in effect, be marking their own homework when designing their terms of service and putting their finger in the air when it comes to user empowerment tools. There will be no requirement for them to explain either to Ofcom or indeed to service users the true nature of the harms that occur on their service, nor the rationale behind any decisions they might make in these two fundamental parts of their service.
Since the Government are relying so heavily on their triple shield to ensure protection for adults, to me, not reviewing two of the three strands that make up the triple shield seems like fashioning a three-legged stool with completely uneven legs: a stool that will not stand up to the slightest pressure when used. Therefore, I urge the Minister to look again and consider reinstating these protections in the Bill.
My Lords, I contribute to this debate on the basis of my interests as laid out in the register: as chief executive of Cerebral Palsy Scotland; my work with the Scottish Government on people with neurological conditions; and as a trustee of the Neurological Alliance of Scotland. It is an honour to follow the right reverend Prelate, whose point about the inequality people experience in the online world is well made. I want to be clear that when I talk about ensuring online protection for people with disabilities, I do not assume that all adults with disabilities are unable to protect themselves. As the right reverend Prelate and the noble Lord, Lord Griffiths of Burry Port, pointed out, survey after survey demonstrates how offline vulnerabilities translate into the online world, and Ofcom’s own evidence suggests that people with physical disabilities, learning disabilities, autism, mental health issues and others can be classed as being especially vulnerable online.
The Government recognise that vulnerable groups are at greater risk online, because in its previous incarnations, this Bill included greater protection for such groups. We spoke in a previous debate about the removal of the “legal but harmful” provisions and the imposition of the triple shield. The question remains from that debate: does the triple shield provide sufficient protection for these vulnerable groups?
As I have said previously this afternoon, user empowerment tools are the third leg of the triple shield, but they put all the onus on users and no responsibility on the platforms to prevent individuals’ exposure to harm. Amendments 36, 37 and 38A, in the name of the noble Lord, Lord Clement-Jones, seek simply to make the default setting for the proposed user empowerment tools to be “on”. I do not pretend to understand how, technically, this will happen, but it clearly can, because the Bill requires platforms to ensure that this is the default position to ensure protection for children. The default position in those amendments protects all vulnerable people, and that is why I support them—unlike, I fear, Amendment 34 from my noble friend Lady Morgan, which lists specific categories of vulnerable adults. I would prefer that all vulnerable people be protected from being exposed to harm in the first place.
Nobody’s freedom of expression is affected in any way by this default setting, but the overall impact on vulnerable individuals in the online environment would, I assure your Lordships, be significant. Nobody’s ability to explore the internet or to go into those strange rooms at the back of bookshops that the noble Baroness, Lady Fox, was talking about would be curtailed. The Government have already stated that individuals will have the capacity to seek out these tools and turn them on and off, and that they must be easily accessible. So individuals with capacity will be able to find the settings and set them to explore whatever legal content they choose.
However, is it not our duty to remember those who do not have capacity? What about adults with learning difficulties and people at a point of crisis—the noble Baroness, Lady Parminter, movingly spoke about people with eating disorders—who might not be able to turn to those tools due to their affected mental state, or who may not realise that what they are seeing is intended to manipulate? Protecting those users from encountering such content in the first place surely tips the balance in favour of turning the tools on by default.
I am very sad that the noble Baroness, Lady Campbell of Surbiton, cannot be here, because her contribution to this debate would be powerful. But, from her enormous experience of work with disabled people, this is her top priority for the Bill.
In preparing to speak to these amendments, I looked back to the inquiry in the other place into online abuse and the experience of disabled people that was prompted by Katie Price’s petition after the shocking abuse directed at her disabled son Harvey. In April 2019 the Government responded to that inquiry by saying that they were
“aware of the disproportionate abuse experienced by disabled people online and the damage such abuse can have on people’s lives, career and health”—
and the Government pledged to act.
The internet is a really important place for disabled people, and I urge the Government to ensure that it remains a safe place for all of us and to accept these amendments, which would ensure that the default settings are set to “on”.
My Lords, I rise to support the amendments in the name of the noble Baroness, Lady Morgan. I do so somewhat reluctantly, not because I disagree with anything that she said but because I would not necessarily start from here. I want to briefly say three very quick things about that and then move on to Amendments 42 and 45, which are also in this group.
We already have default settings, and we are pretending that this is a zero-sum game. The default settings at the moment are profiling us, filtering us and rewarding us; and, as the right reverend Prelate said in his immensely powerful speech, we are not starting at zero. So I do share the concerns of the noble Baroness, Lady Fox, about who gets to choose—some of us on this side of the debate are saying, “Can we define who gets to choose? Can Parliament choose? Can Ofcom choose? Can we not leave this in the hands of tech companies?” So on that I fully agree. But we do have default settings already, and this is a question of looking at some of the features as well as the content. It is a weakness of the Government’s argument that it keeps coming back to the content rather than the features, which are the main driver of what we see.
The second thing I want to say—this is where I am anxious about the triple shield—is: does not knowing you are being abused mean that you are not abused? I say that as someone who has experienced considerable personal abuse. I have my filter on and I am not on social media, but my children, my colleagues and some of the people I work with around the world do see what is said about me—it is a reputational thing, and for some of them it is a hurtful thing, and that is why I am reluctant in my support. However, I do agree with all the speakers who have said that our duty is to start with those people who are most vulnerable.
I want to mention the words of one of the 5Rights advisers—a 17 year-old girl—who, when invited to identify changes and redesign the internet, said, “Couldn’t we do all the kind things first and gradually get to the horrible ones?” I think that this could be a model for us in this Chamber. So, I do support the noble Baroness.
I want to move briefly to Amendment 42, which would see an arbitrary list of protected characteristics replaced by the Equality Act 2010. This has a lot to do with a previous discussion we had about human rights, and I want to say urgently to the Minister that the offer of the Online Safety Bill is not to downgrade human rights, children’s rights and UK law, but rather to bring forward a smart and comprehensive regime to hold companies accountable for human rights, children’s rights and UK law. We do not want to have a little list of some of our children’s rights or of some of our legislation; we would like our legislation and our rights embedded in the Bill.
I have to speak for Amendment 45. I express my gratitude to the noble Lord, Lord Stevenson, for tabling it. It would require Ofcom, six months after the event, to ask whether children need these user empowerment tools. It is hugely important. I remind the Committee that children have not only rights but an evolving capacity to be out there in the world. As I said earlier, the children’s safety duties have a cliff-edge feel to them. As children go out into the world on the cusp of adulthood, maybe they would like to have some of these user empowerment tools.
Online Safety Bill Debate
Baroness Fraser of Craigmaddie (Conservative - Life peer)
(1 year, 6 months ago)
Lords Chamber
I will not detain noble Lords very long either. Two things have motivated me to be involved in this Bill. One is protection for vulnerable adults and the second is looking at this legislation with my Scottish head on, because nobody else seems to be looking at it from the perspective of the devolved Administrations.
First, on protection for vulnerable adults, we have already debated the fact that in an earlier iteration of this Bill, there were protections. These have been watered down and we now have the triple shield. Whether they fit here, with the amendment from my noble friend Lady Stowell, or fit earlier, what we are all asking for is the reinstatement of risk assessments. I come at this from a protection of vulnerable groups perspective, but I recognise that others come at it from a freedom of expression perspective. I do not think the Minister has answered my earlier questions. Why have risk assessments been taken out and why are they any threat? It seems to be the will of the debate today that they do nothing but strengthen the transparency and safety aspects of the Bill, wherever they might be put.
I speak with trepidation to Amendment 63 in the name of the noble and learned Lord, Lord Hope of Craighead. I flatter myself that his amendment and mine are trying to do a similar thing. I will speak to my amendment when we come to the group on devolved issues, but I think what both of us are trying to establish is, given that the Bill is relatively quiet on how freedom of expression is defined, how do platforms balance competing rights, particularly in the light of the differences between the devolved Administrations?
The Minister will know that the Hate Crime and Public Order (Scotland) Act 2021 made my brain hurt when trying to work out how this Bill affects it, or how it affects the Bill. What is definitely clear is that there are differences between the devolved Administrations in how freedom of expression is interpreted. I will study the noble and learned Lord’s remarks very carefully in Hansard; I need a little time to think about them. I will listen very carefully to the Minister’s response and I look forward to the later group.
My Lords, I too will be very brief. As a member of the Communications and Digital Committee, I just wanted to speak in support of my noble friend Lady Stowell of Beeston and her extremely powerful speech, which seems like it was quite a long time ago now, but it was not that long. I want to highlight two things. I do not understand how, as a number of noble Lords have said, having risk assessments is a threat to freedom of expression. I think the absolute opposite is the case. They would enhance all the things the noble Baroness, Lady Fox, is looking to see in the Bill, just as much as they would enhance the protections that my noble friend, whom I always seem to follow in this debate, is looking for.
Like my noble friend, I ask the Minister: why not? When the Government announced the removal of legal but harmful and the creation of user empowerment tools, I remember thinking—in the midst of being quite busy with Covid—“What are user empowerment tools and what are they going to empower me to do?” Without a risk assessment, I do not know how we answer that question. The risk is that we are throwing that question straight to the tech companies to decide for themselves. A risk assessment provides the framework that would enable user empowerment tools to do what I think the Government intend.
Finally, I too will speak against my noble friend Lord Moylan’s Amendment 294 on psychological harm. It is well documented that tech platforms are designed to drive addiction. Addiction can be physiological and psychological. We ignore that at our peril.
Online Safety Bill Debate
Baroness Fraser of Craigmaddie (Conservative - Life peer)
(1 year, 6 months ago)
Lords Chamber
My Lords, I am delighted to propose this group of amendments on devolution issues. I am always delighted to see the Committee so full to talk about devolution issues. I will speak particularly to Amendments 58, 136, 225A and 228 in this group, all in my name. I am very grateful to the noble Lord, Lord Foulkes of Cumnock, for supporting them.
As I have said before in Committee, I have looked at the entire Bill from the perspective of a devolved nation, in particular at the discrepancies and overlaps of Scots law, UK law and ECHR jurisprudence that I was concerned had not been taken into account or addressed by the Bill as it stands. Many have said that they are not lawyers; I am also not. I am therefore very grateful to the Law Society of Scotland, members of Ofcom’s Advisory Committee for Scotland, and other organisations such as the Carnegie Trust and Legal to Say, Legal to Type, which have helped formulate my thinking. I also thank the Minister and the Bill team for their willingness to discuss these issues in advance with me.
When the first proposed Marshalled List for this Committee was sent round, my amendments were dotted all over the place. When I explained to the Whips that they were all connected to devolved issues and asked that they be grouped together, that must have prompted the Bill team to go and look again; the next thing I know, there is a whole raft of government amendments in this group referring to Wales, Northern Ireland, the Bailiwick of Guernsey and the Isle of Man—though not Scotland, I noted. These government amendments are very welcome; if nothing else, I am grateful to have achieved that second look from the devolved perspective.
In the previous group, we heard how long the Bill had been in gestation. I have the impression that, because online safety decision-making is a centralised and reserved matter, the regions are overlooked and engaged only at a late stage. The original internet safety Green Paper made no reference to Scotland at all; it included a section on education describing only the English education system and an annexe of legislation that did not include Scottish legislation. Thankfully, this oversight was recognised by the White Paper, two years later, which included a section on territorial scope. Following this, the draft Bill included a need for platforms to recognise the differences in legislation across the UK, but this was subsequently dropped.
I remain concerned that the particular unintended consequences of the Bill for the devolved Administrations have not been fully appreciated or explored. While online safety is a reserved issue, many of the matters that it deals with—such as justice, the police or education—are devolved, and, as many in this House appreciate, Scots law is different.
At the moment, the Bill is relatively quiet on how freedom of expression is defined; how it applies to the providers of user-to-user services and their duties to protect users’ rights to freedom of expression; and how platforms balance those competing rights when adjudicating on content removal. My Amendment 58 has similarities to Amendment 63 in the name of the noble and learned Lord, Lord Hope of Craighead. It seeks to ensure that phrases such as “freedom of expression” are understood in the same way across the United Kingdom. As the noble and learned Lord pointed out when speaking to his Amendment 63 in a previous group, words matter, and I will therefore be careful to refer to “freedom of expression” rather than “freedom of speech” throughout my remarks.
Amendment 58 asks the Government to state explicitly which standards of speech platforms apply in each of the jurisdictions of the UK, because at this moment there is a difference. I accept that the Human Rights Act is a UK statute already, but, under Article 10—as we have heard—freedom of expression is not an absolute right and may be subject to such formalities, conditions, restrictions or penalties as are prescribed by law.
The noble Lord, Lord Moylan, argued last week that the balance between freedom of expression and any condition or restriction was not an equal one but was weighted in favour of freedom of expression. I take this opportunity to take some issue with my noble friend, who is not in his place, on this. According to the Equality and Human Rights Commission, the British Institute of Human Rights and Supreme Court judgments, human rights are equal and indivisible, neither have automatic priority, and how they are balanced depends on the context and the particular facts.
In Scotland, the Scottish Government believe that they are protecting freedom of expression, but the Hate Crime and Public Order (Scotland) Act 2021 criminalises speech that is not illegal elsewhere in the UK. Examples from the Scottish Government’s own information note state that it is now an offence in Scotland
“if the urging of people to cease practising their religion is done in a threatening or abusive manner or, alternatively, … if a person were to urge people not to engage in same-sex sexual activity while making abusive comments about people who identify as lesbian, gay or bisexual”.
The Lord Advocate’s guidance to the police says that
“an incident must be investigated as a hate crime if it is perceived, by the victim or any other person, to be aggravated by prejudice”.
I stress that I make absolutely no comment about the merits, or otherwise, of the Hate Crime and Public Order (Scotland) Act. I accept that it is yet to be commenced. However, commencement is in the hands of the Scottish Parliament, not the Minister and his team, and I highlight it here as an illustration of the divergence of interpretation that is happening between the devolved nations now, and as an example of what could happen in the future.
So, I would have thought that we would want to take a belt-and-braces approach to ensuring that there cannot be any differences in interpretation of what we mean by freedom of expression, and I hope that the Minister will accept my amendment for the sake of clarity. Ofcom is looking for clarity wherever possible, and clarity will be essential for platforms. Amendment 58 would allow platforms to interpret freedom of expression as a legal principle, rather than having to adapt considerations for Scotland, and it would also help prevent Scottish users’ content being censored more than that of English users, as platforms could rely on a legally certain basis for decision-making.
The hate crime Act was also the motivation for my Amendment 136, which asks why the Government did not include it on the list of priority offences in Schedule 7. I understand that the Scottish Government did not ask for it to be included, but since when did His Majesty’s Government do what the Scottish Government ask of them?
I have assumed that the Scottish Government did not ask for it because the hate crime Act is yet to be commenced in Scotland and there are, I suspect, multiple issues to be worked out with Police Scotland and others before it can be. I stress again that it is not my intention that the Hate Crime and Public Order (Scotland) Act should dictate the threshold for illegal and priority illegal content in this Bill—Amendment 136 is a probing amendment—but the omission of the hate crime Act does raise the question of a devolution deficit because, while the definition of “illegal content” varies, people in areas of the UK with more sensitive thresholds would have to rely on the police to enforce some national laws online rather than benefiting from the additional protections of the Ofcom regime.
Clause 53(5)(c) of this Bill states that
“the offence is created by this Act or, before or after this Act is passed, by”—
this is in sub-paragraph (iv)—
“devolved subordinate legislation made by a devolved authority with the consent of the Secretary of State or other Minister of the Crown”.
How would this consent be granted? How would it involve this Parliament? What consultation should be required, and with whom—particularly since the devolved offence might change the thresholds for the offence across the whole of the UK? The phrase “consent of the Secretary of State” implies that a devolved authority would apply to seek consent. Should not this application process be set out in the Bill? What should the consultation process with devolved authorities and Ofcom be if the Secretary of State wishes to initiate the inclusion of devolved subordinate legislation? Do we not need a formal framework for parliamentary scrutiny—an equivalent of the Grimstone process, perhaps? I would be very happy to work with the Minister and his team on a Parkinson process between now and Report.
Amendments 225A and 228 seek to ensure that there is an analysis of users’ online experiences in the different nations of the UK. Amendment 225A would require Ofcom to ensure that its research into online experiences was analysed in a nation-specific way while Amendment 228 would require Ofcom’s transparency reporting to be reported via each nation. The fact is that, at this moment in time, we do not know whether there is a difference in the online experience across the four nations. For example, are rural or remote communities at greater risk of online harm because they have a greater dependence on online services? How would online platforms respond to harmful sectarian content? What role do communication technologies play in relation to offline violence, such as knife crime?
We can compare other data by nation, for example on drug use or gambling addiction. Research and transparency reporting are key to understanding nation-specific harms online, but I fear that Ofcom will have limited powers in this area if they are not specified in the Bill. Ofcom has good working relationships from the centre with the regions, and part of this stems from the fact that legislation in other sectors, such as broadcasting, requires it to have advisory committees in each of the nations to ensure that English, Scottish, Northern Irish and Welsh matters are considered properly. Notably, those measures do not exist in this Bill.
The interplay between the high-level and reserved nature of internet services and online safety will require Ofcom to develop a range of new, wider partnerships in Scotland—for example with Police Scotland—and to collaborate closely at a working level with a wide range of interests within the Scottish Government, where such interests will be split across a range of ministerial portfolios. In other areas of its regulatory responsibility, Ofcom’s research publications provide a breakdown of data by nation. Given the legislative differences that already exist between the four nations, it is an omission that such a breakdown is not explicitly required in the Bill.
I have not touched—and I am not going to touch—on how this Bill might affect other devolved Administrations. The noble Baroness, Lady Foster of Aghadrumsee, apologises for being unable to be in the Chamber to lend her voice from a Northern Ireland perspective— I understand from her that the Justice (Sexual Offences and Trafficking Victims) Act (Northern Ireland) 2022 might be another example of this issue—but she has indicated her support here. As my noble friend Lady Morgan of Cotes said last Thursday:
“The Minister has done a very good job”
of
“batting away amendments”.—[Official Report, 11/5/23; col. 2043.]
However, I am in an optimistic mood this afternoon, because the Minister responded quite positively to the request from the noble and learned Lord, Lord Hope, that we should define “freedom of expression”. There is great benefit to be had from ensuring that this transparency of reporting and research can be broken down by nation. I am hopeful, therefore, that the Minister will take the points that I have raised through these amendments and that he will, as my noble friend Lady Morgan of Cotes hoped, respond by saying that he sees my points and will work with me to ensure that this legislation works as we all wish it to across the whole of the UK. I beg to move.
I put on record that the withdrawal of Part 3 of the Digital Economy Act 2017 will be greeted with happiness only should the full schedule of AV and harms be put into the Bill. I must say that because the noble Baroness, Lady Benjamin, is not in her place. She worked very hard for that piece of legislation.
My Lords, I thank the Minister for his response. I take it as a win that we have been offered a meeting and further discussion, and that the noble Lord, Lord Foulkes, agreed with every word I said. I hope we can continue in this happy vein in my time in this House.
The suggestion from the noble Lord, Lord Stevenson, of a table is a welcome one. Something that has interested me is that some of the offences the Minister mentioned were open goals: there were holes leaving something open in Northern Ireland and not in England and Wales, or the other way round. For example, epilepsy trolling is already a criminal offence in Scotland, but I am not sure that was appreciated when we started this discussion.
I look forward to the meeting and I thank the Minister for his response. I am still unconvinced that we have the right consultation process for any devolved authority wanting to apply for devolved subordinate legislation to be included under this regime.
It concerns me that the Minister talked about leaving it to Ofcom to request the data that it deems appropriate. The feeling on the ground is that Ofcom, which is based in London, may not understand what is or is not necessarily appropriate in the devolved Administrations. The fact that in other legislation—for example, on broadcasting—it is mandated that reporting is broken down nation by nation is really important. It is even more important because of the interplay between the devolved and the reserved matters. The fact that there is no equivalent Minister in the Scottish Government to talk about digital and online safety matters with means that a whole raft of different people who have not hitherto had relationships with Ofcom will need to have them.
I thank the Minister. On that note, I withdraw my amendment.
Online Safety Bill Debate
Baroness Fraser of Craigmaddie (Conservative - Life peer)
(1 year, 4 months ago)
Lords Chamber
My Lords, the business of the internet is data. Whether it is a retail business, a media business or any other kind of business, the internet is all about data. The chiefs of our internet companies know more about noble Lords than anyone else—more than any government agency, your doctor and almost anyone—because the number of data points that big internet companies have on people is absolutely enormous, and they use them to very great effect.
Some of those effects are entirely benign. I completely endorse what the noble Baroness, Lady Fox, said. As a champion of innovation and business, I totally recognise the good that is done by the world’s internet companies to make our lives richer, create jobs and improve the world, but some of what they do is not good. Either inadvertently or by being passive enablers of harm, internet companies have been responsible for huge societal harms. I do not want to go through the full list, but when I think about the mental health of our teenagers, the extremism in our politics, the availability of harmful information to terrorists and what have you, there is a long catalogue of harms to which internet companies have contributed. We would be naive if we did not recognise that.
However, almost uniquely among commercial businesses, internet companies guard access to that data incredibly jealously. They will not let you peek in and share their insights. I know from my experience in the health field that we work very closely with the pharmaceutical industry—there is a whole programme of pharmacovigilance that any pharma company has to participate in, in order to explain, measure and justify the benefits and disbenefits of its medicines. We have no programme similar to pharmacovigilance for the tech industry. Instead, we are completely blind. Policy makers, the police and citizens are flying blind when it comes to the data that is held on us on both an individual and a demographic basis. That is extremely unusual.
That is why I really welcome my noble friend’s amendments that give Ofcom what seems to me to be extremely proportionate and thoughtful powers in order to look into this data, because without it, we do not know what is going on in this incredibly important part of our lives.
The role that researchers, including academic, civil society and campaigning researchers, play in helping Ofcom, policymakers and politicians to arrive at sensible, thoughtful and proportionate policy is absolutely critical. I pay enormous tribute to them; I am grateful to those noble Lords who have also done so. I am extremely grateful to my noble friend the Minister for his amendments on this subject, Amendments 272B and 272C, which address the question of giving researchers better access to some of this data. They would reduce the timeline for the review on data from 24 months to 18 months, which would be extremely helpful, and would change “may” to “must”, which represents an emphatic commitment to the outcome of this review.
However, issues remain around the question of granting access to data for researchers. What happens to the insights from the promised review once it is delivered? Where are the powers to deliver the review’s recommendations? That gap is not currently served by the government amendments, which is why I and the noble Lord, Lord Clement-Jones, have tabled Amendments 237ZA, 237DB, 262AA and 272AB. Their purpose is to put in the Bill reasonable, proportionate powers to bring access to data for researchers along the lines that the research review will recommend.
The feelings on this matter are extremely strong because we all recognise the value here. We are concerned that any delay may completely undermine this sector. As we debated in Committee, there is a substantial and valuable UK sector in this research area that is likely to move lock, stock and barrel to other countries where these kinds of powers may be in place; for instance, in EU or US legislation. The absence of these powers will, I think, leave Britain in the dark and competitively behind other countries, which is why I want to push the Minister hard on these amendments. I am grateful for his insight that this matter is something that the Government may look to in future Bills, but those Bills are far off. I would like to hear from him what more he could do to try to smooth the journey from this Bill and this review to any future legislation that comes through this House in order to ensure that this important gap is closed.
My Lords, Amendments 270 and 272 are in my name; I thank the noble Lord, Lord Stevenson of Balmacara, for adding his name to them. They are the least controversial amendments in this group, I think. They are really simple. Amendment 270 would require Ofcom’s research about online interests and users’ experiences of regulated services under Clause 143 to be broken down by nation, while Amendment 272 relates to Clause 147 and would require Ofcom’s transparency reports also to be broken down in a nation-specific way.
These amendments follow on from our debates on devolution in Committee. Both seek to ensure that there is analysis of users’ online experiences in the different nations of the UK, which I continue to believe is essential to ensuring that the Bill works for the whole of the UK and is both future-proofed—a word we have all used lots—and able to adapt to different developments across each of the four nations. I have three reasons why I think these things are important. The first concerns the interplay between reserved and devolved matters. The second concerns the legal differences that already exist across the UK. The third concerns the role of Ofcom.
In his much-appreciated email to me last week, the Minister rightly highlighted that internet services are a reserved matter and I absolutely do not wish to impose different standards of regulation across the UK. Regarding priority offences, I completely support the Government’s stance that service providers must treat any content as priority illegal content where it amounts to a criminal offence anywhere in the UK regardless of where that act may have taken place or where the user is. However, my amendments are not about regulation; they are about research and transparency reporting, enabling us to understand the experience across the UK and to collect data—which we have just heard, so powerfully, will be more important as we continue.
I am afraid that leaving it to Ofcom’s discretion to understand the differences in the online experiences across the four nations over time is not quite good enough. Many of the matters we are dealing with in the online safety space—such as children, justice, police and education—are devolved. Government policy-making in devolved areas will increasingly rely on data about online behaviours, harms and outcomes. These days, I cannot imagine creating any kind of public policy without understanding the online dimension. There are areas where either the community experience and/or the policy approach is markedly different across the nations—take drug abuse, for example. No data means uninformed policy-making or flying blind, as my noble friend Lord Bethell has just said. But how easy will it be for the devolved nations to get this information if we do not specify it in the Bill?
In many of the debates, we have already heard of the legal differences across the four nations, and I am extremely grateful to the noble and learned Lord, Lord Hope of Craighead, who is not in his place, the noble Lord, Lord Stevenson of Balmacara, and the Minister for supporting my amendment last week when I could not be here. I am terribly sorry. I was sitting next to the noble Viscount, Lord Camrose, at the time. The amendment was to ensure that there is a legal definition of “freedom of expression” in the Bill that can be understood by devolved Administrations across the UK.
The more I look at this landscape, the more challenges arise. The creation of legislation around intimate abuse images is a good example. The original English legislation was focused on addressing the abusive sharing of intimate images after a relationship breakdown. It required the sharing to have been committed with the intent to cause harm, which has a very easy defence: “I did not mean to cause any harm”. The Scottish legislation, drafted slightly later, softened this to an intent to cause harm or being reckless as to whether harm was caused, which is a bit better because you do not need to prove intent. Now the English version is going to be updated in the Bill to create an offence simply by sharing, which is even better.
Other differences in legislation have been highlighted, such as on deepfakes and upskirting. On the first day of Report, the noble Baroness, Lady Kennedy of The Shaws, highlighted a difference in the way cyberflashing offences are understood in Northern Ireland. So the issue is nuanced, and the Government’s responses change as we learn about harmful behaviours in practice. Over time, we gradually see these offences refined as we learn more about how technology is used to abuse in practice. The question really is: what will such offences look like online in five years’ time? Will the user experience and government policy across the four nations be the same? I will not pretend to try to answer that, but to answer it we will need the data.
I am concerned that the unintended consequences of the Bill in the devolved Administrations have not been fully appreciated or explored. Therefore, I am proposing a belt and braces approach in the reporting regime. When we come to post-legislative scrutiny, with reports being laid before this Parliament and the devolved Administrations in Edinburgh, Cardiff and Belfast—if there is one—we will want to have the data to understand the online experiences of each nation. That is why my very little amendments are seeking to ensure that we capture this experience and that is why it is so important.
I am very happy to discuss this further with noble Lords, but I will reserve the right, pending that discussion, to decide whether we need to return to this at Third Reading.
Amendments 270 and 272, tabled by my noble friend Lady Fraser of Craigmaddie, to whom I am very grateful for her careful scrutiny of the devolved aspects of the Bill, seek to require Ofcom to include separate analyses of users’ online experiences in England, Wales, Scotland and Northern Ireland in the research about users’ experiences of regulated services and in Ofcom’s transparency reports. While I am sympathetic to her intention—we have corresponded on it, for which I am grateful—it is important that Ofcom has and retains the discretion to prioritise information requests that will best shed light on the experience of users across the UK.
My noble friend and other noble Lords should be reassured that Ofcom has a strong track record of using this discretion to produce data which are representative of people across the whole United Kingdom. Ofcom is committed to reflecting the online experiences of users across the UK and intends, wherever possible, to publish data at a national level. When conducting research, Ofcom seeks to gather views from a representative sample of the United Kingdom and seeks to set quotas that ensure an analysable sample within each of the home nations.
It is also worth noting the provisions in the Communications Act 2003 that require Ofcom to operate offices in each of the nations of the UK, to maintain advisory committees for each, and to ensure their representation on its various boards and panels—and, indeed, on the point raised by the noble Baroness, Lady Kidron, to capture the experiences of children and users of all ages. While we must give Ofcom the discretion it needs to ensure that the framework is flexible and remains future-proofed, I hope that I have reassured my noble friend that her point will indeed be captured, reported on and be able to be scrutinised, not just in this House but across the UK.
I am grateful to the Minister for giving way. My premise is that the reason Ofcom reports in a nation-specific way in broadcasting and in communications is that there is a high-level reference in both the Communications Act 2003 and the BBC charter that requires it to do so, because it feeds down into national quotas and so on. There is currently nothing equivalent in the Online Safety Bill. Therefore, we are relying on Ofcom’s discretion, whereas in the broadcasting and communications area we have a high-level reference insisting that there is a breakdown by nation.
Online Safety Bill Debate
Baroness Fraser of Craigmaddie (Conservative - Life peer)
(1 year, 4 months ago)
Lords Chamber
My Lords, I rise to support the amendment in the name of the noble Baroness, Lady Kidron. She has been such a forceful voice throughout the passage of this Bill, driven by her passion to protect children, and no more so than with the amendment in her name. That is why I feel compelled to speak up to support her. So far, we have all worked with the Government to see the safe passage of the Online Safety Bill, with strong protections for children. These amendments would be yet another excellent and unique opportunity to protect children. This is what we have been fighting for for years, and it is so uplifting that the Government have listened to us throughout the passage of this Bill—so why stop now? If the Government are saying that the Bill is clear about harms, they should have no objection to making it explicit.
These amendments press for safety by design to be embedded in later clauses of the Bill and go hand in hand with the earlier amendment that the House so clearly supported. It is clear that the design of services and algorithms is responsible for orchestrating and manipulating the behaviour, feelings, emotions and thoughts of children who, because they are at a vulnerable stage in their development, are easily influenced. We have all witnessed the disastrous impact of the new technology which is fast encroaching upon us, and our children will not be spared from it. So it is imperative that Ofcom have the tools with which to consider and interrogate system design separately from content because, as has been said, it is not only content that is harmful: design is too. We therefore need to take a holistic approach and leave nowhere to hide for the tech companies when it comes to harms affecting our children.
As I have said before, these amendments would send a loud and clear message to the industry that it is responsible for the design of its products and has to think of the consequences for our children’s mental health and well-being when considering design. What better way to do that than for the Government to accept these amendments, in order to show that they are on the side of our children, not the global tech companies, when it comes to protecting them from harm? They need to put measures in place to ensure that the way a service is designed is subject to the online safety regime we have all fought for over the years and during the passage of this Bill.
If the Government do not accept the amendment, perhaps the issue of harmful design could be included in the welcome proposed review of pornography. It would be good to hear the Minister’s thoughts on this idea—but I am not giving him a let-off. I hope he will listen to the strength of feeling and that the Government will reconsider their position, support the amendment and complete the one main task they set out to complete with this Bill, which is to protect children from harm no matter where it rears its ugly head online.
My Lords, I rise briefly to support my noble friend Lady Harding and to associate myself with everything she has just said. It strikes me that if we do not acknowledge that there is harm from functionality, not just content, we are not looking to the future, because addressing functionality protects vulnerable people before the harm has happened, whereas addressing content relies on us having to take it down afterwards. I want to stress that algorithms and functionality disproportionately harm not just vulnerable children but vulnerable adults as well. I do not understand why, since we agreed to safety by design at the beginning of the Bill, it is not running throughout it, rather than just in the introduction. I want to lend my support to these amendments this evening.