Online Safety Bill Debate (1 year, 10 months ago)
Lords Chamber

My Lords, I am humbled to speak in this debate among many noble Lords who have spent years involved in or campaigning for this landmark legislation. I salute all of them and their work.
Like many, I support some parts of this Bill and am sceptical about others. The tension between free speech, privacy and online safety is not an easy one to resolve. We all accept, however reluctantly, that one Bill cannot cure all social ills—indeed, neither should it try. In fact, when it comes to online regulation, this is not the only legislation that is urgent and necessary: the digital markets, competition and consumer Bill is a critical, yet still missing, piece of the jigsaw if we are to achieve a strong regulatory framework. I hope the Government will bring it forward swiftly.
As my noble friend Lord Vaizey has already said, I see this Bill as the beginning of online regulation and not the end. I see it as our opportunity to make a strong start. For me, the top priority is to get the regulatory fundamentals right and to ensure we can keep updating the regime as needed in the years ahead. With my chair of the Communications and Digital Committee hat on, I will focus on key changes we believe are needed to achieve that. As I cannot do that justice in the time available, I direct any keen readers to our committee’s website, where my letter to the Secretary of State is available.
First, the regulator’s independence is of fundamental importance, as the noble Baroness, Lady Merron, and others have already mentioned. The separation of powers between the Executive and the regulator is the cornerstone of media regulation in western Europe. Any government powers to direct or give guidance should be clearly defined, justified and limited in scope. The Online Safety Bill, as it stands, gives us the opposite. Future Governments will have sweeping powers to direct and interfere with Ofcom’s implementation of the regulations.
I will come, in a moment, to my noble friend the Minister’s proposed remedy, which he mentioned in his opening remarks, but I stress that this is not a general complaint from me or the committee about executive overreach. Many of the Bill’s executive powers are key to ensuring the regime is responsive to changing needs, but there are some powers that are excessive and troubling. Clause 39 allows the Secretary of State to direct Ofcom to change its codes of practice on regulating social media firms. That is not about setting priorities; it is direct and unnecessary interference. In our view, the Government’s proposed amendment to clarify this clause, as my noble friend described, remains inadequate and does not respect the regulator’s independence. Clause 39 also empowers the Secretary of State to direct Ofcom in a private form of ping-pong as it develops codes of practice. This process could in theory go on for ever before any parliamentary oversight comes into play. Other powers are equally unnecessary. Clause 157 contains unconstrained powers to give “guidance” to Ofcom about any part of its work, to which it must have regard. Again, I fail to see the need, especially since the Government can already set strategic priorities and write to Ofcom.
Moving on, my committee is also calling for risk assessments for adult users to be reinstated, and this has already been mentioned by other noble Lords. That would have value for both supporters and critics of “legal but harmful”, by requiring platforms to be transparent about striking the balance between allowing adult users to filter out harmful content and protecting freedom of speech and privacy.
Finally, given the novel nature of the Bill, I hope the Government will reconsider their unwillingness to support the setting up of a Joint Committee of Parliament to scrutinise digital regulation across the board. This would address many general and specific concerns about implementation and keeping pace with digital developments that have been raised recently. Parliament needs to properly discharge its responsibilities, and fragmented oversight via a range of committees will not be good enough in this new, modern world.
Overall, and with all that said, I commend my noble friend and his colleagues for getting us to this point. I look forward to, and will support him in, completing the passage of this legislation in good order.
Online Safety Bill Debate (1 year, 7 months ago)
Lords Chamber

My Lords, I have had a helpful reminder about declarations of interest. I once worked for Facebook; I divested myself of any financial interest back in 2020, but of course a person out there may think that what I say today is influenced by the fact that I previously took the Facebook shilling. I want that to be on record as we debate the Bill.
My Lords, I have not engaged with this amendment in any particular detail—until the last 24 hours, in fact. I thought that I would come to listen to the debate today and see if there was anything that I could usefully contribute. I have been interested in the different points that have been raised so far. I find myself agreeing with some points that are perhaps in tension or conflict with each other. I emphasise from the start, though, my complete respect for the Joint Committee and the work that it did in the pre-legislative scrutiny of the Bill. I cannot compare my knowledge and wisdom on the Bill with those who, as has already been said, have spent so much intensive time thinking about it in the way that they did at that stage.
Like my noble friend Lady Harding, I always have a desire for clarity of purpose. It is critical for the success of any organisation, or anything that we are trying to do. As a point of principle, I like the idea of setting out at the start of this Bill its purpose. When I looked through the Bill again over the last couple of weeks in preparation for Committee, it was striking just how complicated and disjointed a piece of work it is and so very difficult to follow.
There are many reasons why I am sympathetic towards the amendment. I can see why bringing together at the beginning of the Bill what are currently described as “Purposes” might help it to meet its overall aims. But that brings me to some of the points that the noble Baroness, Lady Fox, has just made. The Joint Committee’s report recommends that the objectives of the Bill
“should be that Ofcom should aim to improve online safety for UK citizens by ensuring that service providers”—
it then set out objectives aimed at Ofcom, rather than purposes of the Bill itself.
I was also struck by what the noble Lord, Lord Allan, said about what we are looking for. Are we looking for regulation of the type that we would expect of airlines, or of the kind we would expect from the car industry? If we are still asking that question, that is very worrying. I think we are looking for something akin to the car industry model as opposed to the airline model. I would be very grateful if my noble friend the Minister was at least able to give us some assurance on that point.
If I were to set out a purpose of the Bill at the beginning of the document, I would limit myself to what is currently in proposed new subsection (1)(g), which is
“to secure that regulated internet services operate with transparency and accountability in respect of online safety”.
That is all I would say, because that, to me, is what this Bill is trying to do.
The other thing that struck me when I looked at this—I know that there has been an approach to this legislation that sought to adopt regulation that applies to the broadcasting world—was the thought, “Somebody’s looked at the BBC charter and thought, well, they’ve got purposes and we might adopt a similar sort of approach here.” The BBC charter and the purposes set out in it are important and give structure to the way the BBC operates, but they do not give the kind of clarity of purpose that my noble friend Lady Harding is seeking—which I too very much support and want to see—because there is almost too much there. That is my view on what the place to start would be when setting out a very simple statement of purpose for this Bill.
My Lords, this day has not come early enough for me. I am pleased to join others in embarking on the Committee stage of the elusive Online Safety Bill, where we will be going on an intrepid journey, as we have heard so far. Twenty years ago, while I was on the Ofcom content board, I pleaded for the internet to be regulated, but was told that it was mission impossible. So this is a day I feared might not happen, and I thank the Government for making it possible.
I welcome Amendment 1, in the names of the noble Lords, Lord Stevenson, Lord Clement-Jones, and others. It does indeed encapsulate the overarching purpose of the Bill. But it also sets out the focus of what other amendments will be needed if the Bill is to achieve the purpose set out in that amendment.
The Bill offers a landmark opportunity to protect children online, and it is up to us to make sure that it is robust, effective and evolvable for years to come. In particular, I welcome subsection (1)(a) and (b) of the new clause proposed by Amendment 1. Those paragraphs highlight an omission in the Bill. If the purposes set out in them are to be met, the Bill needs to go much further than it currently does.
Yes, the Bill does not go far enough on pornography. The amendment sets out a critical purpose for the Bill: children need a “higher level of protection”. The impact that pornography has on children is known. It poses a serious risk to their mental health and their understanding of consent, healthy sex and relationships. We know that children as young as seven are accessing pornographic content. Their formative years are being influenced by hardcore, abusive pornography.
As I keep saying, childhood lasts a lifetime, so we need to put children first. This is why I have dedicated my life to the protection of children and their well-being. This includes protection from pornography, where I have spent over a decade campaigning to prevent children easily accessing online pornographic content.
I know that others have proposed amendments that will be debated in due course which meet this purpose. I particularly support the amendments in the names of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell. Those amendments meet the purpose of the Bill by ensuring that children are protected from pornographic content wherever it is found through robust, anonymous age verification that proves the user’s age beyond reasonable doubt.
Online pornographic content normalises abusive sexual acts, with the Government’s own research finding
“substantial evidence of an association between the use of pornography and harmful sexual attitudes and behaviours towards women”
and children. This problem is driven largely by the types of content that are easily available online. Pornography is no longer the stereotype that we might imagine from the 1970s and 1980s. It is now vicious, violent and pervasive. Content that would be prohibited offline is readily available online for free with just a few clicks. The Online Safety Bill comes at a crucial moment to regulate online pornography. That is why I welcome the amendment introducing a purpose to the Bill that ensures that internet companies “comply with UK law”.
We have the Obscene Publications Act 1959, and UK law does not allow the offline distribution of material that sexualises children—such as “barely legal” pornography, where petite-looking adult actors are made to look like children—content which depicts incest, and content which depicts sexual violence, including strangulation. That is why it is important that the Bill makes that type of material illegal online as well. Such content poses a high risk to children as well as women and girls. There is evidence that such content acts as a gateway to more hardcore material, including illegal child sexual abuse material. Some users spiral out of control, viewing content that is more and more extreme, until the next click is illegal child sexual abuse material, or even going on to contact and abuse children online and offline.
My amendment would require service providers to exclude from online video on-demand services any pornographic content that would be classified as more extreme than R18 and that would be prohibited offline. This would address the inconsistency between online and offline regulation of pornographic content—
If I may, I will prevail upon the noble Lord, Lord Clement-Jones, to wait just another few seconds before beginning his winding-up speech. I have found this an extremely interesting and worthwhile debate, and there seems to be an enormous amount of consensus that the amendment is a good thing to try to achieve. It is also true that this is a very complex Bill. My only point in rising is to say to the Minister—who is himself about to speak, telling us why the Government are not going to accept Amendment 1—that, as a result of the very long series of debates we are going to have on this Bill over a number of days, perhaps the Government might still be able, at the end of this very long process, to rethink the benefits of having an amendment of this kind at the beginning of the Bill. I hope that, just because he is going to ask that the amendment be withdrawn today, he will not lose sight of the benefits of such an amendment.
My Lords, just before the noble Lord, Lord Clement-Jones, gets to wind up, I wanted to ask a question and make a point of clarification. I am grateful for the contribution from the noble Baroness, Lady Chakrabarti; that was a helpful point to make.
My question, which I was going to direct to the noble Lord, Lord Stevenson—although it may be one that the noble Lord, Lord Clement-Jones, wants to respond to if the noble Lord, Lord Stevenson, is not coming back—is about the use of the word “purpose” versus “objective”. The point I was trying to make in referring to the Joint Committee’s report was that, when it set out the limbs of this amendment, it was referring to them as objectives for Ofcom. What we have here is an amendment that is talking about purposes of the Bill, and in the course of this debate we have been talking about the need for clarity of purpose. The point I was trying to make was not that I object to the contents of this amendment, but that if we are looking for clarity of purpose to inform the way we want people to behave as a result of this legislation, I would make it much shorter and simpler, which is why I pointed to subsection (g) of the proposed clause.
It may be that the content of this amendment—and this is where I pick up the point the noble Baroness, Lady Chakrabarti, was making—is not objectionable, although I take the point made by the noble Baroness, Lady Fox. However, the noble Baroness, Lady Chakrabarti, is right: at the moment, let us worry less about the specifics. Then, we can be clearer about what bits of the amendment are meant to be doing what, rather than trying to get all of them to offer clarity of purpose. That is my problem with it: there are purposes, which, as I say, are helpful structurally in terms of how an organisation might go about its work, and there is then the clarity of purpose that should be driving everything. The shorter, simpler and more to the point we can make that, the better.
My Lords, I thank the noble Baroness. I hope I have not appeared to rush the proceedings, but I am conscious that there are three Statements after the Bill. I thank the noble Lord, Lord Stevenson, for tabling this amendment, speaking so cogently to it and inspiring so many interesting and thoughtful speeches today. He and I have worked on many Bills together over the years, and it has been a real pleasure to see him back in harness on the Opposition Front Bench, both in the Joint Committee and on this Bill. Long may that last.
It has been quite some journey to get to this stage of the Bill; I think we have had four Digital Ministers and five Prime Ministers since we started. It is pretty clear that Bismarck never said, “Laws are like sausages: it’s best not to see them being made”, but whoever did say it still made a very good point. The process leading to today’s Bill has been particularly messy, with Green and White Papers; a draft Bill; reports from the Joint Committee and Lords and Commons Select Committees; several versions of the Bill itself; and several government amendments anticipated to come. Obviously, the fact that the Government chose to inflict last-minute radical surgery on the Bill to satisfy what I believe are the rather unjustified concerns of a small number in the Government’s own party made it even messier.
It is extremely refreshing, therefore, to start at first principles, as the noble Lord, Lord Stevenson, has done. He has outlined them and the context in which we should see them—namely, we should focus essentially on the systems, what is readily enforceable and where safety by design and transparency are absolutely the essence of the purpose of the Bill. I share his confidence in Ofcom and its ability to interpret those purposes. I say to the noble Baroness, Lady Stowell, that I am not going to dance on the heads of too many pins about the difference between “purpose” and “objective”. I think it is pretty clear what the amendment intends, but I do have a certain humility about drafting; the noble Baroness, Lady Chakrabarti, reminded us of that. Of course, one should always be open to change and condensation of wording if we need to do that. But we are only at Amendment 1 in Committee, so there is quite a lot of water to flow under the bridge.
It is very heartening that there is a great deal of cross-party agreement about how we must regulate social media going forward. These Benches—and others, I am sure—will examine the Bill extremely carefully and will do so in a cross-party spirit of constructive criticism, as we explained at Second Reading. Our Joint Committee on the draft Bill exemplified that cross-party spirit, and I am extremely pleased that all four signatories to this amendment served on the Joint Committee and readily signed up to its conclusions.
Right at the start of our report, we made a strong case for the Bill to set out these core objectives, as the noble Lord, Lord Stevenson, has explained, so as to provide clarity—that word has been used around the Committee this afternoon—for users and regulators about what the Bill is trying to achieve and to inform the detailed duties set out in the legislation. In fact, I believe that the noble Lord, Lord Stevenson, has improved on that wording by including a duty on the Secretary of State, as well as Ofcom, to have regard to the purposes.
We have heard some very passionate speeches around the Committee for proper regulation of harms on social media. The case for that was made eloquently to the Joint Committee by Ian Russell and by witnesses such as Edleen John of the FA and Frances Haugen, the Facebook whistleblower. A long line of reports by Select Committees and all-party groups have rightly concluded that regulation is absolutely necessary given the failure of the platforms even today to address the systemic issues inherent in their services and business models.
The introduction to our Joint Committee report makes it clear that without the original architecture of a duty of care, as the White Paper originally proposed, we need an explicit set of objectives to ensure clarity for Ofcom when drawing up the codes and when the provisions of the Bill are tested in court, as they inevitably will be. Indeed, in practice, the tests that many of us will use when judging whether to support amendments as the Bill passes through the House are inherently bound up with these purposes, several of which many of us mentioned at Second Reading. Decisions may need to be made on balancing some of these objectives and purposes, but that is the nature of regulation. I have considerable confidence, as I mentioned earlier, in Ofcom’s ability to do this, and those seven objectives—as the right reverend Prelate reminded us, the rule of seven is important in other contexts—set that out.
In their response to the report published more than a year ago, the Government repeated at least half of these objectives in stating their own intentions for the Bill. Indeed, they said:
“We are pleased to agree with the Joint Committee on the core objectives of the Bill”,
and, later:
“We agree with all of the objectives the Joint Committee has set out, and believe that the Bill already encapsulates and should achieve these objectives”.
That is exactly the point of dispute: we need this to be explicit, and the Government seem to believe that it is implicit. Despite agreeing with those objectives, at paragraph 21 of their response the Government say:
“In terms of the specific restructure that the Committee suggested, we believe that using these objectives as the basis for Ofcom’s regulation would delegate unprecedented power to a regulator. We do not believe that reformulating this regulatory framework in this way would be desirable or effective. In particular, the proposal would leave Ofcom with a series of high-level duties, which would likely create an uncertain and unclear operating environment”.
That is exactly the opposite of what most noble Lords have been saying today.
It has been an absolute pleasure to listen to so many noble Lords across the Committee set out their ambitions for the Bill and their support for this amendment. It started with the noble Baroness, Lady Kidron, talking about this set of purposes being the “North Star”. I pay tribute to her tireless work, which drove all of us in the Joint Committee on in an extremely positive way. I am not going to go through a summing-up process, but what my noble friend had to say about the nature of the risk we are undertaking and the fact that we need to be clear about it was very important. The whole question of clarity and certainty for business and the platforms, in terms of making sure that they understand the purpose of the Bill—as the noble Baroness, Lady Harding, and many other noble Lords mentioned—is utterly crucial.
If noble Lords look at the impact assessment, they will see that the Government seem to think the cost of compliance is a bagatelle—but, believe me, it will not be. It will be a pretty expensive undertaking to train people in those platforms, across social media start-ups and so on to understand the nature of their duties.
Online Safety Bill Debate (1 year, 7 months ago)
Lords Chamber

My Lords, it is a pleasure to follow the two noble Baronesses. I remind the Committee of my background as a board member of the Centre for Data Ethics and Innovation. I also declare an indirect interest, as my oldest son is the founder and studio head of Mediatonic, which is now part of Epic Games and is the maker of “Fall Guys”, which I am sure is familiar to your Lordships.
I speak today in support of Amendments 2 and 92 and the consequent amendments in this group. I also support the various app store amendments proposed by the noble Baroness, Lady Harding, but I will not address them directly in these remarks.
I was remarkably encouraged on Wednesday by the Minister’s reply to the debate on the purposes of the Bill, especially by the priority that he and the Government gave to the safety of children as its primary purpose. The Minister underlined this point in three different ways:
“The main purposes of the Bill are: to give the highest levels of protection to children … The Bill will require companies to take stringent measures to tackle illegal content and protect children, with the highest protections in the Bill devoted to protecting children … Children’s safety is prioritised throughout this Bill”.—[Official Report, 19/4/23; col. 724.]
The purpose of Amendments 2 and 92 and consequent amendments is to extend and deepen the provisions in the Bill to protect children against a range of harms. This is necessary for both the present and the future. It is necessary in the present because of the harms to which children are exposed through a broad range of services, many of which are not currently in the Bill’s scope. Amendment 2 expands the scope to include any internet service that meets the child user condition and enables or promotes harmful activity and content as set out in the schedule provided. Why would the Government not take this step, given the aims and purposes of the Bill to give the highest protection to children?
Every day, the diocese of Oxford educates some 60,000 children in our primary and secondary schools. Almost all of them have or will have access to a smartphone, either late in primary, hopefully, or early in secondary school. The smartphone is a wonderful tool to access educational content, entertainment and friendship networks, but it is also a potential gateway for companies, children and individuals to access children’s inner lives, in secret, in the dead of night and without robust regulation. It therefore exposes them to harm. Sometimes that harm is deliberate and sometimes unintentional. This power for harm will only increase in the coming years without these provisions.
The Committee needs to be alert to generational changes in technology. When I was 16 in secondary school in Halifax, I did a computer course in the sixth form. We had to take a long bus ride to the computer building in Huddersfield University. The computer filled several rooms in the basement. The class learned how to program using punch cards. The answers to our questions came back days later, on long screeds of printed paper.
When my own children were teenagers and my oldest was 16, we had one family computer in the main living room of the house. The family was able to monitor usage. Access to the internet was possible, but only through a dial-up modem. The oldest of my grandchildren is now seven, and many of his friends already have smartphones. In a few years, he will certainly carry a connected device in his pocket and, potentially, have access to the entire internet 24/7.
I want him and millions of other children to have the same protection online as he enjoys offline. That means recognising that harms come in a variety of shapes and sizes. Some are easy to spot, such as pornography. We know the terrible damage that porn inflicts on young lives. Some are more insidious and gradual: addictive behaviours, the promotion of gambling, the erosion of confidence, grooming, self-harm and suicidal thoughts, encouraging eating disorders, fostering addiction through algorithms and eroding the barriers of the person.
The NSPCC describes many harms to children on social networks that we are all now familiar with, but it also highlights online chat, comments on livestream sites, voice chat in games and private messaging among the vectors for harm. According to Ofcom, nine in 10 children in the UK play video games, and they do so on devices ranging from computers to mobile phones to consoles. Internet Matters says that most children’s first interaction with someone they do not know online is now more likely to be in a video game such as “Roblox” than anywhere else. It also found that parents underestimate the frequency with which their children are contacted by strangers online.
The Gambling Commission has estimated that 25,000 children in the UK aged between 11 and 16 are problem gamblers, with many of them introduced to betting via computer games and social media. Families have been left with bills, sometimes of more than £3,000, after uncontrolled spending on loot boxes.
Online companies, we know, design their products with psychological principles of engagement firmly in view, and then refine their products by scraping data from users. According to the Information Commissioner, more than 1 million underage children could have been exposed to age-inappropriate content on TikTok alone, with the platform collecting and using their personal data.
As the noble Baroness, Lady Kidron, has said, we already have robust and tested definitions of scope in the ICO’s age-appropriate design code—definitions increasingly taken up in other jurisdictions. To give the highest protection to children, we need to build on these secure definitions in this Bill and find the courage to extend robust protection across the internet now.
We also need to future-proof this Bill. These key amendments would ensure that any development, any new kind of service not yet imagined which meets the child user condition and enables or promotes harmful activity and content, would be in scope. This would give Ofcom the power to develop new guidance and accountabilities for the applications that are certain to come in the coming years.
We have an opportunity and a responsibility, as the Minister has said, to build the highest protection into this Bill. I support the key amendments standing in my name.
My Lords, first, I beg the indulgence of the Committee to speak briefly at this juncture. I know that no one from the Lib Dem or Labour Benches has spoken yet, but I need to dash over to the Moses Room to speak to some amendments I am moving on the Bill being considered there. Secondly, I also ask the Committee that, if I do not get back in time for the wind-ups, I be forgiven on this occasion.
I simply wanted to say something briefly in support of Amendments 19, 22, 298 and 299, to which I have added my name. My noble friend Lady Harding has already spoken to them comprehensively, so there is little I want to add; I just want to emphasise a couple of points. But first, if I may, I will pick up on something the right reverend Prelate said. I think I am right in saying that the most recent Ofcom research shows that 57% of 7 year-olds such as his grandchild have their own phone, and by the time children reach the age of 12 they pretty much all have their own phone. One can only imagine that the age at which children possess their own device is going to get lower.
Turning to app stores, with which these amendments are concerned, currently it is the responsibility of parents and developers to make sure that children are prevented from accessing inappropriate content. My noble friend’s amendments do not dilute in any way the responsibility that should be held by those two very important constituent groups. All we are seeking to do is ensure that app stores, which are currently completely unregulated, take their share of responsibility for making sure that those seeking to download and then use such apps are in the age group the apps are designed for.
As has already been very powerfully explained by my noble friend and by the noble Baroness, Lady Kidron, different age ratings are being given by the two different app stores right now. It is important for us to understand, in the context of the digital markets and competition Bill, which is being introduced to Parliament today—I cannot tell noble Lords how long we have waited for that legislation and how important it is, not least because it will open up competition, particularly in app stores—that the more competition there will be across app stores and the doorways through which children can go to purchase or download apps, the more important it is that there is consistency and some regulation. That is why I support my noble friend and was very happy to add my name to her amendments.
Online Safety Bill Debate (1 year, 7 months ago)
Lords Chamber

The noble Lord has concluded with my conclusion, which was to say that those services will be driven out, but not because they are irresponsible around horrible, dangerous messages. They do not read our messages because they are private. However, if we ever receive anything that makes us feel uncomfortable, they should be put under pressure to act. Many of them already do and are actually very responsible, but that is different from demanding that they scan our messages and we breach that privacy.
My Lords, that last exchange was incredibly helpful. I am grateful to the noble Lord, Lord Allan, for what he just said and the way in which he introduced this group. I want to make only a few brief remarks.
I have put my name to two amendments in this group: Amendment 202 in the name of the noble Lord, Lord Stevenson, which seeks to ensure that Ofcom will be subject to the same kind of requirements and controls as exist under the Regulation of Investigatory Powers Act before issuing a technology notice
“to a regulated service which offers private messaging with end-to-end encryption”;
and Amendment 285, also in the name of the noble Lord, Lord Stevenson, and that of the noble Lord, Lord Clement-Jones. This amendment would make sure that no social media platforms or private end-to-end messaging services have an obligation generally to monitor what is going on across their platforms. When I looked at this group and the various amendments in it, those were the two issues that I thought were critical. These two amendments seemed to approach them in the most simple and straightforward manner.
Like other noble Lords, my main concern is that I do not want search and social media platforms to have an obligation to become what we might describe as thought police. I do not want private messaging firms to start collecting and storing the content of our messages so that they have what we say ready to hand over in case they are required to do so. What the noble Lord, Lord Allan, just said is an important point to emphasise. Some of us heard from senior representatives from WhatsApp a few weeks ago. I was quite surprised to learn how much they are doing in this area to co-operate with the authorities; I felt very reassured to learn about that. I in no way want to discourage that because they are doing an awful lot of good stuff.
Basically, this is such a sensitive matter, as has been said, that it is important for the Government to be clear about what their policy intentions are by being clear in the Bill. If they do not intend to require general monitoring, that needs to be made explicit. It is also important that, if Ofcom is to be given new investigatory powers or powers to insist on things through these technology notices, it is clear that its powers do not go beyond those that are already set out in law. As we have heard from noble Lords, there is widespread concern about this matter not just from the social media platforms and search engines themselves but from news organisations, journalists and those lobby groups that often speak out on liberty-type matters. These topics go across a wide range of interest groups, so I very much hope that my noble friend the Minister will be able to respond constructively and open-mindedly on them.
My Lords, I was not intending to intervene on this group because my noble friend Lord Stevenson will address these amendments in their entirety, but listening in to this public conversation about this group of amendments has stimulated a question that I want both to put on the record and to give the Minister time to reflect on.
If we get the issues of privacy and encrypted messaging wrong, it will push more people into using VPN—virtual private network—services. I went into the app store on my phone to search for VPN software. There is nothing wrong with such software—our parliamentary devices have it to do general monitoring and make sure that we do not use services such as TikTok—but it is used to circumnavigate much of the regulatory regime that we are seeking to put together through this Bill. When I search for VPNs in the app store, the first one that comes up that is not a sponsored, promoted advertisement has an advisory age limit of four years old. Several of them are the same; some are 17-plus but most are four-plus. Clearly, the app promotes itself very much on the basis that it offers privacy and anonymity, which are the key features of a VPN. However, a review of it says, “I wouldn’t recommend people use this because it turns out that this company sends all its users’ data to China so that it can do general monitoring”.
I am not sure how VPNs are being addressed by the Bill, even though they seem really pertinent to the issues of privacy and encryption. I would be interested to hear whether—and if we are, how—we are bringing the regulation and misuse of VPNs into scope for regulation by Ofcom.
I will certainly take that point away and I understand, of course, that different Acts require different duties of the same platforms. I will take that away and discuss it with colleagues in other departments who lead on investigatory powers.
Before my noble friend moves on, when he is reviewing that back in the office, could he also satisfy himself that the concerns coming from the journalism and news organisations in the context of RIPA are also understood and have been addressed? That is another angle which, from what my noble friend has said so far, I am not sure has really been acknowledged. That is not a criticism but it is worth him satisfying himself on it.
Online Safety Bill Debate (1 year, 7 months ago)
Lords Chamber

I ask the Committee to have a level of imagination here because I have been asked to read the speech of the noble Viscount, Lord Colville—
I do not know who advised the noble Baroness—and forgive me for getting up and getting all former Leader on her—but this is a practice that we seem to have adopted in the last couple of years and that I find very odd. It is perfectly proper for the noble Baroness to deploy the noble Viscount’s arguments, but to read his speech is completely in contravention of our guidance.
I beg the pardon of the Committee. I asked about it and was misinformed; I will do as the noble Baroness says.
The noble Viscount, Lord Colville, is unable to be with us. He put his name to Amendments 273, 275, 277 and 280. His concern is that the Bill sets the threshold for illegality too low and that in spite of the direction provided by Clause 170, the standards for determining illegality are too vague.
I will make a couple of points on that thought. Clause 170(6) directs that a provider must have
“reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied”,
but that does not mean that the platform has to be certain that the content is illegal before it takes it down. This is concerning when you take it in combination with what or who will make judgments on illegality.
If a human moderator makes the decision, it will depend on the resources and time available to them as to how much information they gather in order to make that judgment. Unlike in a court case, when a wide range of information and context can be gathered, when it comes to decisions about content online, these resources are very rarely available to human moderators, who have a vast amount of content to get through.
If an automated system makes the judgment, it is very well established that algorithms are not good at context—the Communications and Digital Committee took evidence on this repeatedly when I was on it. AI simply uses the information available in the content itself to make a decision, which can lead to significant missteps. Clause 170(3) provides the requirement for the decision-makers to judge whether there is a defence for the content. In the context of algorithms, it is very unclear how they will come to such a judgment from the content itself.
I understand that these are probing amendments, but I think the concern is that the vagueness of the definition will lead to too much content being taken down. This concern was supported by Parliament’s Joint Committee on Human Rights, which wrote to the former Culture Secretary, Nadine Dorries, on that matter. I apologise again.
That is very helpful.
I am concerned that removing so-called illegal content for the purpose of complying with the regulatory system covers not only that which reaches conviction in a criminal court but possibly anything that a platform determines could be illegal, and therefore it undermines our own legal system. As I have said, that marks a significant departure from the rule of law. It seems that the state is asking or mandating private companies to make determinations about what constitutes illegality.
The obligations on a platform to determine what constitutes illegality could obviously become a real problem, particularly in relation to limitations on free expression. As we have already heard, the Public Order Act 1986 criminalises, for example, those who stir up hatred through the use of words, behaviour or written material. That is contentious in the law offline. By “contentious”, I mean that it is a matter of difficulty that requires the full rigour of the criminal justice system, understanding the whole history of established case law. That is all necessary to make a conviction under that law for offences of this nature.
Now we appear to be saying that, without any of that, social media companies should make the decision, which is a nerve-racking situation to be in. We have already heard the slippery phrase “reasonable grounds to infer”. If that was the basis on which you were sent to prison—if they did not have to prove that you were guilty but they had reasonable grounds to infer that you might be, without any evidence—I would be worried, yet reasonable grounds to infer that the content could be illegal is the basis on which we are asking for those decisions to be made. That is significantly below the ordinary burden of proof required to determine that an illegal act has been committed. Under this definition, I fear that platforms will be forced to overremove and censor what ultimately will be entirely lawful speech.
Can the Minister consider what competency social media companies have to determine what is lawful? We have heard some of the dilemmas from somebody who was in that position—let alone the international complications, as was indicated. Will all these big tech companies have to employ lots of ex-policemen and criminal lawyers? How will it work? It seems to me that there is a real lack of qualifications in that sphere—that is not a criticism, because those people decided to work in big tech, not in criminal law, and yet we are asking them to pursue this. That is a concern.
I will also make reference to what I think are the controversies around government Amendments 136A and 136B to indicate the difficulties of these provisions. They concern illegal activity—such as “assisting unlawful immigration”, illegal entry, human trafficking and similar offences—but I am unsure as to how this would operate. While it is the case that certain entrances to the UK are illegal, I suddenly envisage a situation where a perfectly legitimate political debate—for example, about the small boats controversy—would be taken down, and that people advocating for a position against the Government’s new Illegal Migration Bill could be accused of supporting illegality. What exactly will be made illegal in those amendments to the Online Safety Bill?
The noble Baroness, Lady Buscombe, made a fascinating speech about an interesting group of amendments. Because of the way the amendments are grouped, I feel that we have moved to a completely different debate, so I will not go into any detail on this subject. Anonymous trolling, Twitter storms and spreading false information are incredibly unpleasant. I am often the recipient of them—at least once a week—so I know personally that you feel frustrated that people tell lies and your reputation is sullied. However, I do not think that these amendments offer the basis on which that activity should be censored, and I will definitely argue against removing anonymity clauses—but that will be in another group. It is a real problem, but I do not think that the solution is contained in these amendments.
My Lords, my contribution will be less officious than my intervention earlier in this group. In the last couple of years since I returned to the House—as I describe it—having spent time at the Charity Commission, I have noticed a new practice emerging of noble Lords reading out other people’s speeches. Every time I had seen it happen before, I had not said anything, but today I thought, “I can’t sit here and not say anything again”. I apologise for my intervention.
I am grateful to my noble friend Lord Moylan for bringing forward his amendments and for introducing them in the incredibly clear way he did; they cover some very complex and diverse issues. I know that there are other amendments in the group which might be described as similar to his.
There are a couple of things I want to highlight. One interesting thing about the debate on this group is the absence of some of our legal friends—I apologise to my noble friend Lady Buscombe, who is of course a very distinguished lawyer. The point I am making is that we are so often enriched by a lot of legal advice and contributions on some of the more challenging legal issues that we grapple with, but we do not have that today, and this is a very difficult legal issue.
It is worth highlighting again, as has been touched on a little in some of the contributions, the concern, as I understand it, with how the Bill is drafted in relation to illegal content and the potential chilling effect of these clauses on social media platforms. As has already been said, there is a concern that it might lead them to take a safety-first approach in order to avoid breaking the law and incurring the sanctions and fines that come with the Bill, which Ofcom will have the power to apply. That is the point we are concerned with here. It is the way in which this is laid out, and people who are much better equipped than I am have already explained the difference between evidence versus reasonable grounds to infer.
What the noble Lord, Lord Allan, hit on in his contribution is also worth taking into account, and that is the role of Ofcom in this situation. One of the things I fear, as we move into an implementation phase and the consequences of the Bill start to impact on the social media firms, is the potential for the regulator to be weaponised in a battle on the cultural issues that people are becoming increasingly exercised about. I do not have an answer to this, but I think it is important to understand the danger of where we might get to in the expectations of the regulator if we create a situation where the social media platforms are acting in a way that means people are looking for recourse or a place to generate further an argument and a battle that will not be helpful at all.
I am not entirely sure, given my lack of legal expertise —this is why I would have been very grateful for some legal expertise on this group—whether what my noble friend is proposing in his amendments is the solution, but I think we need to be very clear that this is a genuine problem. I am not sure, as things stand in the Bill, that we should be comfortable that it is not going to create problems. We need to find a way to be satisfied that this has been dealt with properly.
It is a great honour to follow my noble friend. I completely agree with her that this is a powerful discussion and there are big problems in this area. I am grateful also to my noble friend Lord Moylan for raising this in the first place. It has been a very productive discussion.
I approach the matter from a slightly different angle. I will not talk about the fringe cases—the ones where there is ambiguity, difficulty of interpretation, or responsibility or regulatory override, all of which are very important issues. The bit I am concerned about is where primary priority content that clearly demonstrates some kind of priority offence is not followed up by the authorities at all.
The noble Lord, Lord Allan, referred to this point, although he did slightly glide over it, as though implying, if I understood him correctly, that this was not an area of concern because, if a crime had clearly been committed, it would be followed up on. My fear and anxiety is that the history of the internet over the last 25 years shows that crimes—overt and clear crimes that are there for us to see—are very often not followed up by the authorities. This is another egregious example of where the digital world is somehow exceptionalised and does not have real-world rules applied to it.
Online Safety Bill Debate (1 year, 7 months ago)
Lords Chamber

My Lords, this has been a really interesting debate. I started out thinking that we were developing quite a lot of clarity. The Government have moved quite a long way since we first started debating senior manager liability, but there is still a bit of fog that needs dispelling—the noble Baronesses, Lady Kidron and Lady Harding, have demonstrated that we are not there yet.
I started off by saying yes to this group, before I got to grips with the government amendments. I broadly thought that Amendment 33, tabled by the noble Lord, Lord Stevenson, and Amendment 182, tabled by the noble Lord, Lord Bethell, were heading in the right direction. However, I was stopped short by Trustpilot’s briefing, which talked about a stepped approach regarding breaches and so on—that is a very strong point. It says that it is important to recognise that not all breaches should carry the same weight. In fact, it is even more than that: certain things should not even be an offence, unless you have been persistent or negligent. We have to be quite mindful as to how you formulate criminal offences.
I very much liked what the noble Lord, Lord Bethell, had to say about the tech view of its own liability. We have all seen articles about tech exceptionalism, and, for some reason, that seems to have taken quite a hold—so we have to dispel that as well. That is why I very much liked what the noble Lord, Lord Curry, said. It seemed to me that that was very much part of a stepped approach, while also being transparent to the object of the exercise and the company involved. That fits very well with the architecture of the Bill.
The noble Baroness, Lady Harding, put her finger on it: the Bill is not absolutely clear. In the Government’s response to the Joint Committee’s report, we were promised that, within three to six months, we would get that senior manager liability. On reading the Bill, I am certainly still a bit foggy about it, and it is quite reassuring that the noble Baroness, Lady Harding, is foggy about it too. Is that senior manager liability definitely there? Will it be there?
The Joint Committee made two other recommendations which I thought made a lot of sense: the obligation to report on risk assessment to the main board of a company, and the appointment of a safety controller, which the noble Lord, Lord Knight, mentioned. Such a controller would make it very clear—as with GDPR, you would have a senior manager who you can fix the duty on.
Like the noble Baroness, Lady Harding, I would very much like to hear from the Minister on the question of personal liability, as well as about Ofcom. It is important that any criminal prosecution is mediated by Ofcom; that is cardinal. You cannot just create criminal offences where you can have a prosecution without the intervention of Ofcom. That is extraordinarily important.
I have just a couple of final points. The noble Baroness, Lady Fox, comes back quite often to this point about regulation being the enemy of innovation. It very much depends what kind of innovation we are talking about. Technology is not necessarily neutral. It depends how the humans who deploy it operate it. In circumstances such as this, where we are talking about children and about smaller platforms that can do harm, I have no qualms about having regulation or indeed criminal liability. That is a really important factor. We are talking about a really important area.
I very strongly support Amendment 219. It deals with a really important aspect which is completely missing from the Bill. I have a splendid briefing here, which I am not going to read out, but it is all about Mastodon being one example of a new style of federated platform in which the app or hub for a network may be category 1 owing to the size of its user base but individual subdomains or networks sitting below it could fall under category 2 status. I am very happy to give a copy of the briefing to the Minister; it is a really well-written brief, and demonstrates entirely some of the issues we are talking about here.
I reassure the noble Lord, Lord Knight, that I think the amendment is very well drafted. It is really quite cunning in the way that it is done.
My Lords, I wonder whether I can make a brief intervention—I am sorry to do so after the noble Lord, Lord Clement-Jones, but I want to intervene before my noble friend the Minister stands up, unless the Labour Benches are about to speak.
I have been pondering this debate and have had a couple of thoughts. Listening to the noble Lord, Lord Clement-Jones, I am reminded of something which was always very much a guiding light for me when I chaired the Charity Commission, and therefore working in a regulatory space: regulation is never an end in itself; you regulate for a reason.
I was struck by the first debate we had on day one of Committee about the purpose of the Bill. If noble Lords recall, I said in that debate that, for me, the Bill at its heart was about enhancing the accountability of the platforms and the social media businesses. I felt that the contribution from my noble friend Lady Harding was incredibly important. What we are trying to do here is to use enforcement to drive culture change—not to force the organisations never to think about profit, but to make them move away from focusing on profit-making towards focusing on child safety in the way in which they go about their work. That is really important when we start to consider the whole issue of enforcement.
It struck me at the start of this discussion that we have to be clear what our general approach and mindset is about this part of our economy that we are seeking to regulate. We have to be clear about the crimes we think are being committed or the offences that need to be dealt with. We need to make sure that Ofcom has the powers to tackle those offences and that it can do so in a way that meets Parliament’s and the public’s expectations of us having legislated to make things better.
I am really asking my noble friend the Minister, when he comes to respond on this, to give us a sense of clarity on the whole question of enforcement. At the moment, it is insufficiently clear. Even if we do not get that level of clarity today, when we come back later on and look at enforcement, it is really important that we know what we are trying to tackle here.
My Lords, I will endeavour to give that clarity, but it may be clearer still if I flesh some points out in writing in addition to what I say now.
My Lords, as a former Deputy Leader of this House, if I were sitting on the Front Bench, I would have more gumption than to try to start a debate only 10 minutes before closing time. But I realise that the wheels grind on—perhaps things are no longer as flexible as they were in my day—so noble Lords will get my speech. The noble Lord, Lord Grade, who is at his post—it is very encouraging to see the chair of Ofcom listening to this debate—and I share a love of music hall. He will remember Eric Morecambe saying that one slot was like the last slot at the Glasgow Empire on a Friday night. That is how I feel now.
A number of references have been made to those who served on the Joint Committee and what an important factor it has been in their thinking. I have said on many occasions that one of the most fulfilling times of my parliamentary life was serving on the Joint Committee for the Communications Act 2003. The interesting thing was that we had no real idea of what was coming down the track as far as the internet was concerned, but we did set up Ofcom. At that time, a lot of the pundits and observers were saying, “Murdoch’s lawyers will have these government regulators for breakfast”. Well, they did not. Ofcom has turned into a regulator to which—and at some stages this has slightly worried me—the Government turn with almost any problem, saying, “We’ll give it to Ofcom”. It has certainly proved that it can regulate across a vast area and with great skill. I have every confidence that the noble Lord, Lord Grade, will take that forward.
Perhaps it is to do with the generation I come from, but I do not have this fear of regulation or government intervention. In some ways, the story of my life is that of government intervention. If I am anybody’s child, I am Attlee’s child—not just because of the reforms of the Labour Party but also because of the reforms of the coalition Government, the Butler Education Act and the bringing in of the welfare state. So I am not afraid of government and Parliament taking responsibility in addressing real dangers.
In bringing forward this amendment, along with my colleague the noble Lord, Lord Lipsey, who cannot be here today, I am referring to legislation that is 20 years old. That is a warning to newcomers; it could be another 20 years before parliamentary time is found for a Bill of this complexity, so we want to be sure that we get its scope right.
The Minister said recently that the Bill is primarily a child safety Bill, but it did not start off that way. Five years ago, the online harms White Paper was seen as a pathfinder and trailblazer for broader legislation. Before we accept the argument that the Bill is now narrowed down to more specific terms, we should think about whether there are other areas that still need to be covered.
These amendments are in the same spirit as those in the names of the noble Baronesses, Lady Stowell, Lady Bull, and Lady Featherstone. We seek to reinstate an adult risk assessment duty because we fear that the change in title signals a reduction in scope and a retreat from the protections which earlier versions of the Bill intended to provide.
It was in this spirit, and to enable us to get ahead of the game, that in 2016 I proposed a Private Member’s Bill on this subject: the Online Harms Reduction Regulator (Report) Bill, which asked Ofcom to publish, in advance of the anticipated legislation, assessments of what action was needed to reduce harm to users and wider society from social networks. I think we can all agree that, if that work had been done in advance of the main legislation, such evidence would be very useful now.
I am well aware that there are those who, in the cause of some absolute concepts of freedom, believe that to seek to broaden the scope of the Bill takes us into the realms of the nanny state. But part of the social contract which enables us to survive in this increasingly complex world is that the ordinary citizen, who is busy struggling with the day-to-day challenges of normal life, does trust his Government and Parliament to keep an anticipatory weather eye on what is coming down the track and what dangers lie therein.
When there have been game-changing advances in technology in the past, it has often taken a long time for societies to adapt and adjust. The noble Lord, Lord Moylan, referred to the invention of the printing press. That caused the Reformation, the Industrial Revolution and around 300 years of war, so we have to be careful how we handle these technological changes. Instagram was founded in 2010, and the iPhone 4 was released then too. One eminent social psychologist wrote:
“The arrival of smartphones rewired social life.”
It is not surprising that liberal democracies, with their essentially 18th-century construct of democracy, struggle to keep up.
The record of big tech in the last 20 years has, yes, included an amazing leap in access to information. However, that quantum leap has come with a social cost in almost every aspect of our lives. Nevertheless, I refuse to accept the premise that these technologies are too global and too powerful in their operation for them not to come within the reach of any single jurisdiction or the rule of law. I am more impressed by efforts by big tech companies to identify and deal with real harms than I am by threats to quit this or that jurisdiction if they do not get the light-touch regulation they want so as to be able to maximise profits.
We know by their actions that some companies and individuals simply do not care about their social responsibilities or the impact of what they sell and how they sell it on individuals and society as a whole. That is why the social contract in our liberal democracies means a central role for Parliament and government in bringing order and accountability into what would otherwise become a jungle. That is why, over the last 200 years, Parliament has protected its citizens from the bad behaviour of employers, banks, loan sharks, dodgy salesmen, insanitary food, danger at work and so on. In this new age, we know that companies large and small, British and foreign, can, through negligence, indifference or malice, drive innocent people into harmful situations. The risks that people face are complex and interlocking; they cannot be reduced to a simple list, as the Government seek to do in Clause 12.
When I sat on the pre-legislative committee in 2003, we could be forgiven for not fully anticipating the tsunami of change that the internet, the world wide web and the iPhone were about to bring to our societies. That legislation did, as I said, establish Ofcom with a responsibility to promote media literacy, which it has only belatedly begun to take seriously. We now have no excuse for inaction or for drawing up legislation so narrowly that it fails to deal with the wide risks that might befall adults in the synthetic world of social media.
We have tabled our amendments not because they will solve every problem or avert every danger but because they would be a step in the right direction and so make this a better Bill.
I am very grateful to the noble Lord, Lord McNally, for namechecking me and the amendments I have tabled with the support of the noble Baronesses, Lady Featherstone and Lady Bull, although I regret to inform him that they are not in this group. I understand where the confusion has come from. They were originally in this group, but as it developed I felt that my amendments were no longer in the right place. They are now in the freedom of expression group, which we will get to next week. What he has just said has helped, because the amendments I am bringing forward are not similar to the ones he has tabled. They have a very different purpose. I will not pre-empt the debate we will have when we get to freedom of expression, but I think it is only proper that I make that clear. I am very grateful to the noble Lord for the trail.
Online Safety Bill Debate, Lords Chamber (1 year, 6 months ago)
My Lords, in introducing this group, I will speak directly to the three amendments in my name—Amendments 46, 47 and 64. I will also make some general remarks about the issue of freedom of speech and of expression, which is the theme of this group. I will come to these in a moment.
The noble Lord, Lord McNally, said earlier that I had taken my amendments out of a different group— I hope from my introductory remarks that it will be clear why—but, in doing so, I did not realise that I would end up opening on this group. I offer my apologies to the noble Lord, Lord Stevenson of Balmacara, for usurping his position in getting us started.
I am grateful to the noble Baronesses, Lady Bull and Lady Featherstone, for adding their names. The amendments represent the position of the Communications and Digital Select Committee of your Lordships’ House. In proposing them, I do so with that authority. My co-signatories are a recent and a current member. I should add sincere apologies from the noble Baroness, Lady Featherstone, for not being here this evening. If she is watching, I send her my very best wishes.
When my noble friend Lord Gilbert of Panteg was its chair, the committee carried out an inquiry into freedom of speech online. This has already been remarked on this evening. As part of that inquiry, the committee concluded that the Government’s proposals in the then draft Bill—which may have just been a White Paper at that time—for content described as legal but harmful were detrimental to freedom of speech. It called for changes. Since then, as we know, the Government have dropped legal but harmful and instead introduced new user empowerment tools for adults to filter out harmful content. As we heard in earlier groups this evening, these would allow people to turn off or on content about subjects such as eating disorders and self-harm.
Some members of our committee might favour enhanced protection for adults. Indeed, some of my colleagues have already spoken in support of amendments to this end in other groups. Earlier this year, when the committee looked at the Bill as it had been reintroduced to Parliament, we agreed that, as things stood, these new user empowerment tools were a threat to freedom of speech. Whatever one’s views, there is no way of judging their impact or effectiveness—whether good or bad.
As we have heard already this evening, the Government have dropped the requirement for platforms to provide a public risk assessment of how these tools would work and their impact on freedom of speech. To be clear, for these user empowerment tools to be effective, the platforms will have to identify the content that users can switch off. This gives the platforms great power over what is deemed harmful to adults. Amendments 46, 47 and 64 are about ensuring that tech platforms are transparent about how they balance the principles of privacy, safety and freedom of speech for adults. These amendments would require platforms to undertake a risk assessment and publish a summary in their terms of service. This would involve them being clear about the effect of user empowerment tools on the users’ freedom of expression. Without such assessments, there is a risk that platforms would do either too much or too little. It would be very difficult to find out how they are filtering content and on what basis, and how they are addressing the twin imperatives of ensuring online safety without unduly affecting free speech.
To be clear, these amendments, unlike amendments in earlier groups, are neither about seeking to provide greater protection to adults nor about trying to reopen or revisit the question of legal but harmful. They are about ensuring transparency to give all users confidence about how platforms are striking the right balance. While their purpose is to safeguard freedom of speech, they would also bring benefits to those adults who wanted to opt in to the user empowerment tool because they would be able to assess what it was they were choosing not to see.
It is because of their twin benefits—indeed, their benefit to everyone—that we decided formally, as a committee, to recommend these amendments to the Government and for debate by your Lordships’ House. That said, the debate earlier suggests support for a different approach to enhancing protection for adults, and we may discover through this debate a preference for other amendments in this group to protect freedom of speech—but that is why we have brought these amendments forward.
My Lords, your Lordships will want me to be brief, bearing in mind the time. I am very grateful for the support I received from my noble friends Lady Harding and Lady Fraser and the noble Baronesses, Lady Kidron and Lady Bull, for the amendments I tabled. I am particularly grateful to the noble Baroness, Lady Bull, for the detail she added to my description of the amendments. I can always rely on the noble Baroness to colour in my rather broad-brush approach to these sorts of things.
I am pleased that the noble Lord, Lord Stevenson, made his remarks at the beginning of the debate. That was very helpful in setting the context that followed. We have heard a basic theme come through from your Lordships: a lack of certainty that the Government have struck the right balance between privacy protection and freedom of expression. I never stop learning in your Lordships’ House. I was very pleased to learn from the new Milton—my noble friend Lord Moylan—that freedom of expression is a fundamental right. Therefore, the balance between that and the other things in the Bill needs to be considered in a way I had not thought of before.
What is clear is that there is a lack of confidence from all noble Lords—irrespective of the direction they are coming from in their contributions to this and earlier debates—either that the balance has been properly struck or that some of the clauses seeking to address freedom of speech in the Bill are doing so in a way that will deliver the outcome and overall purpose of this legislation as brought forward by the Government.
I will make a couple of other points. My noble friend Lord Moylan’s amendments about the power of Ofcom in this context were particularly interesting. I have some sympathy for what he was arguing. As I said earlier, the question of power, and how it is distributed between the various parties involved in this new regime, is certainly one we will look at in broad terms in later groups.
On the amendments of the noble Lord, Lord Stevenson, on Clauses 13, 14 and so on and the protections and provisions for news media, I tend towards the position of my noble friend Lord Black, against what the noble Lord, Lord Stevenson, argued. As I said at the beginning, I am concerned about the censorship of our news organisations by the tech firms. But I also see his argument, and that of the noble Viscount, Lord Colville, that it is not just our traditional legacy media that provides quality journalism now—that is an important issue for us to address.
I am grateful to my noble friend the Minister for his round-up and concluding remarks. Although it is heartening to hear that he and the Bill team will consider the amendment from the noble and learned Lord, Lord Hope, in this group, we are looking—in the various debates today, for sure—for a little more responsiveness and willingness to consider movement by the Government on various matters. I hope that he is able to give us more encouraging signs of this, as we proceed through Committee and before we get to further discussions with him—I hope—outside the Chamber before Report. With that, I of course withdraw my amendment.
Online Safety Bill Debate, Lords Chamber (1 year, 6 months ago)
My Lords, His Majesty’s Government are committed to defending the invaluable role of our free media. We are clear that our online safety legislation must protect the vital role of the press in providing people with reliable and accurate information. That is why this Bill includes strong protections for recognised news publishers. The Bill does not impose new duties on news publishers’ content, which is exempt from the Bill’s safety duties. In addition, the Bill includes strong safeguards for news publisher content, set out in Clause 14. In order to benefit from these protections, publishers will have to meet a set of stringent criteria, set out in Clause 50.
I am aware of concerns in your Lordships’ House and another place that the definition of news publishers is too broad and that these protections could therefore create a loophole to be exploited. That is why the Government are bringing forward amendments to the definition of “recognised news publisher” to ensure that sanctioned entities cannot benefit from these protections. I will shortly explain these protections in detail but I would like to be clear that narrowing the definition any further would pose a critical risk to our commitment to self-regulation of the press. We do not want to create requirements which would in effect put Ofcom in the position of a press regulator. We believe that the criteria set out in Clause 50 are already strong, and we have taken significant care to ensure that established news publishers are captured, while limiting the opportunity for bad actors to benefit.
Government Amendments 126A and 127A propose changes to the criteria for recognised news publishers. These criteria already exclude any entity that is a proscribed organisation under the Terrorism Act 2000 or the purpose of which is to support a proscribed organisation under that Act. We are clear that sanctioned news outlets such as RT, formerly Russia Today, must not benefit from these protections either. The amendments we are tabling today will therefore tighten the recognised news publisher criteria further by excluding entities that have been designated for sanctions imposed by both His Majesty’s Government and the United Nations Security Council. I hope noble Lords will accept these amendments, in order to ensure that content from publishers which pose a security threat to this country cannot benefit from protections designed to defend a free press.
In addition, the Government have also tabled Amendments 50B, 50C, 50D, 127B, 127C and 283A, which are aimed at ensuring that the protections for news publishers in Clause 14 are workable and do not have unforeseen consequences for the operation of category 1 services. Clause 14 gives category 1 platforms a duty to notify recognised news publishers and offer a right of appeal before taking action against any of their content or accounts.
Clause 14 sets out the circumstances in which companies must offer news publishers an appeal. As drafted, it states that platforms must offer this before they take down news publisher content, before they restrict users’ access to such content or where they propose to “take any other action” in relation to publisher content. Platforms must also offer an appeal if they propose to take action against a registered news publisher’s account by giving them a warning, suspending or banning them from using a service or in any way restricting their ability to use a service.
These amendments provide greater clarity about what constitutes “taking action” in relation to news publisher content, and therefore when category 1 services must offer an appeal. They make it clear that a platform must offer this before they take down such content, add a warning label or take any other action against content in line with any terms of service that allow or prohibit content. This will ensure that platforms are not required to offer publishers a right of appeal every time they propose to carry out routine content curation and similar routine actions. That would be unworkable for platforms and would be likely to inhibit the effectiveness of the appeal process.
As noble Lords know, the Bill has a strong focus on user empowerment and enabling users to take control of their online experience. The Government have therefore tabled amendments to Clause 52 to ensure that providers are required only to offer publishers a right of appeal in relation to their own moderation decisions, not where a user has voluntarily chosen not to view certain types of content. For example, if a user has epilepsy and has opted not to view photo-sensitive content, platforms will not be required to offer publishers a right of appeal before restricting that content for the user in question.
In addition, to ensure that the Bill maintains strong protections for children, the amendments make it clear that platforms are not required to offer news publishers an appeal before applying warning labels to content viewed by children. The amendments also make it clear that platforms would be in breach of the legislation if they applied warning labels to content encountered by adults without first offering news publishers an appeal; to maintain strong protections for children, however, that requirement does not apply to warning labels on content encountered by children. I beg to move.
My Lords, I welcome the amendments the Government have tabled, but I ask the Minister to clarify the effect of Amendment 50E. I declare an interest as chair of the Communications and Digital Select Committee, which has discussed Amendment 50E and the labelling of content for children with the news media organisations. This is a very technical issue, but from what my noble friend was just saying, it seems that content that would qualify for labelling for child protection purposes, and which therefore does not qualify for a right of appeal before the content is so labelled, is not content that would normally be encountered by adults but might happen to appeal to children. I would like to be clear that we are not giving the platforms scope for adding labels to content that they ought not to be adding labels to. That aside, as I say, I am grateful to my noble friend for these amendments.
My Lords, like the noble Baroness, Lady Stowell, I have no major objection and support the Government’s amendments. In a sense the Minister got his retaliation in first, because we will have a much more substantial debate on the scope of Clause 14. At this point I welcome any restriction on Clause 14 in the way that the Minister has stated.
Yet to come is the whole issue of whether a recognised news publisher that is effectively unregulated, because it sits outside the PRP’s arrangements, should be entitled to complete freedom in terms of below-the-line content, where there is no moderation and nothing that qualifies as independent regulation. Some debates are coming down the track and—just kicking the tyres on the Minister’s amendments—I think the noble Baroness, Lady Stowell, made a fair point, which I hope the Minister will answer.
My Lords, I am sorry; in my enthusiasm to get this day of Committee off to a swift start, I perhaps rattled through that rather quickly. On Amendment 50E, which my noble friend Lady Stowell asked about, I make clear that platforms will be in breach of their duties if, without applying the protection, they add warning labels to news publishers’ content that they know will be seen by adult users, regardless of whether that content particularly appeals to children.
As the noble Lord, Lord Clement-Jones, and others noted, we will return to some of the underlying principles later on, but the Government have laid these amendments to clarify category 1 platforms’ duties to protect recognised news publishers’ content. They take some publishers out of scope of the protections and make it clearer that category 1 platforms will have only to offer news publishers an appeal before taking punitive actions against their content.
The noble Baroness, Lady Fox, asked about how we define “recognised news publisher”. I am conscious that we will debate this more in later groups, but Clause 50 sets out a range of criteria that an organisation must meet to qualify as a recognised news publisher. These include the organisation’s “principal purpose” being the publication of news, it being subject to a “standards code” and its content being “created by different persons”. The protections for organisations are focused on publishers whose primary purpose is reporting on news and current affairs, recognising the importance of that in a democratic society. I am grateful to noble Lords for their support.
What my noble friend said is absolutely fine with me, and I thank him very much for it. It might be worth letting the noble Baroness, Lady Fox, know that Amendment 127 has now been moved to the group that the noble Lord, Lord Clement-Jones, referred to. I thought it was worth offering that comfort to the noble Baroness.
Online Safety Bill Debate, Lords Chamber (1 year, 6 months ago)
My Lords, I strongly support Amendment 97 in the name of the noble Baroness, Lady Morgan. We must strengthen the Bill by imposing an obligation on Ofcom to develop and issue a code of practice on violence against women and girls (VAWG). This will empower Ofcom and guide services in meeting their duties in regard to women and girls, and encourage them to recognise the many manifestations of online violence that disproportionately affect women and girls.
Refuge, the domestic abuse charity, has seen a growing number of cases of technology-facilitated domestic abuse in recent years. As other noble Lords have said, this tech abuse can take many forms but social media is a particularly powerful weapon for perpetrators, with one in three women experiencing online abuse, rising to almost two in three among young women. Yet the tech companies have been too slow to respond. Many survivors are left waiting weeks or months for a response when they report abusive content, if indeed they receive one at all. It appears that too many services do not understand the risks and nature of VAWG. They do not take complaints seriously and they think that this abuse does not breach community standards. A new code would address this with recommended measures and best practice on the appropriate prevention of and response to violence against women and girls. It would also support the delivery of existing duties set out in the Bill, such as those on illegal content, user empowerment and child safety.
I hope the Minister can accept this amendment, as it would be in keeping with other government policies, such as in the strategic policing requirement, which requires police forces to treat violence against women and girls as a national threat. Adding this code would help to meet the Government’s national and international commitments to tackling online VAWG, such as the tackling VAWG strategy and the Global Partnership for Action on Gender-Based Online Harassment and Abuse.
The Online Safety Bill is a chance to act on tackling the completely unacceptable levels of abuse of women and girls by making it clear through Ofcom that companies need to take this matter seriously and make systemic changes to the design and operation of their services to address VAWG. It would allow Ofcom to add this as a priority, as mandated in the Bill, rather than leave it as an optional extra to be tackled at a later date. The work to produce this code has already been done thanks to Refuge and other charities and academics who have produced a model that is freely available and has been shared with Ofcom. So it is not an extra burden and does not need to delay the implementation of the Bill; in fact, it will greatly aid Ofcom.
The Government are to be congratulated on their amendment to include controlling or coercive behaviour in their list of priority offences. I would like to congratulate them further if they can accept this valuable Amendment 97.
My Lords, I start by commending my noble friend Lady Morgan on her clear introduction to this group of amendments. I also commend the noble Baroness, Lady Kidron, on her powerful speech.
From those who have spoken so far, we have a clear picture of the widespread nature of some of the abuse and offences that women experience when they go online. I note from what my noble friend Lady Morgan said that there is widespread support from a range of organisations outside the Committee for this group of amendments. She also made an important and powerful point about the potential chilling effect of this kind of activity on women, including women in public life, being able to exercise their right to freedom of expression.
I feel it is important for me to make it clear that—this is an obvious thing—I very much support tough legal and criminal sanctions against any perpetrator of violence or sexual abuse against women. I really do understand and support this, and hear the scale of the problem that is being outlined in this group of amendments.
Mine is a dissenting voice, in that I am not persuaded by the proposed solution to the problem that has been described. I will not take up a lot of the Committee’s time, but any noble Lords who were in the House when we were discussing a group of amendments on another piece of legislation earlier this year may remember that I spoke against making misogyny a hate crime. The reason why I did that then is similar, in that I feel somewhat nervous about introducing a code of conduct which is directly relevant to women. I do not like the idea of trying to address some of these serious problems by separating women from men. Although I know it is not the intention of a code such as this or any such measures, I feel that it perpetuates a sense of division between men and women. I just do not like the idea that we live in a society where we try to address problems by isolating or categorising ourselves into different groups of people, emphasising the sense of weakness and being victims of any kind of attack or offence from another group, and assuming that everybody who is in the other group will be a perpetrator of some kind of attack, criticism or violence against us.
My view is that, in a world where we see some of this serious activity happening, we should do more to support young men and boys to understand the proper expectations of them. When we get to the groups of amendments on pornography and what more we can do to prevent children’s access to it, I will be much more sympathetic. Forgive me if this sounds like motherhood and apple pie, but I want us to try to generate a society where basic standards of behaviour and social norms are shared between men and women, young and old. I lament how so much of this has broken down, and a lot of the problems we see in society are the fault of political and—dare I say it?—religious leaders not doing more to promote some of those social norms in the past. As I said, I do not want us to respond to the situation we are in by perpetuating more divisions.
I look forward to hearing what my noble friend the Minister has to say, but I am nervous about the solution proposed in the amendments.
My Lords, it gives me great pleasure to follow the noble Baroness, Lady Stowell of Beeston, not least because she became a dissenting voice, and I was dreading that I might be the only one.
First, I think it important that we establish that those of us who have spent decades fighting violence against women and girls are not complacent about it. The question is whether the physical violence we describe in the Bill is the same as the abuse being described in the amendments. I worry about conflating online incivility, abuse and vile things said with physical violence, as is sometimes done.
I note that Refuge, an organisation I have a great deal of respect for, suggested that the user empowerment duties, which place the burden on women users to filter their own online experience, were the same as asking women to take responsibility for their own safety and protect themselves from violence offline. I thought that was unfair, because user empowerment duties, and deciding what you filter out, can be women exercising their agency.
Online Safety Bill Debate, Lords Chamber (1 year, 6 months ago)
My Lords, I have held back from contributing to this group, because it is not really my group and I have not really engaged in the topic at all. I have been waiting to see whether somebody who is engaged in it would raise this point.
The one factual piece of information that has not been raised in the debate is the fact that the IWF, of which I too am a huge admirer—I have huge respect for the work that it does; it does some fantastic work—is a registered charity. That may lead to some very proper questions about what its role should be in any kind of formal relationship with a statutory regulator. I noticed that no one is proposing in any of these amendments that it be put on the face of the Bill, which, searching back into my previous roles and experience, I think I am right to say would not be proper anyway. But even in the context of whatever role it might have along with Ofcom, I genuinely urge the DCMS and/or Ofcom to ensure that they consult the Charity Commission, not just the IWF, on what is being proposed so that it is compatible with its other legal obligations as a charity.
If I might follow up that comment, I agree entirely with what the noble Baroness has just said. It is very tricky for an independent charity to have the sort of relationship addressed in some of the language in this debate. Before the Minister completes his comments and sits down again, I ask him: if Ofcom were to negotiate a contracted set of duties with the IWF—indeed, with many other charities or others who are interested in assisting with this important work—could that be done directly by Ofcom, with powers that it already has? I think I am right to say that it would not require parliamentary approval. It is only if we are talking about co-regulation, which again raises other issues, that we would go through a process that requires what sounded like the affirmative procedure—the one that was used, for example, with the Advertising Standards Authority. Is that right?
Online Safety Bill Debate, Lords Chamber (1 year, 6 months ago)
My Lords, it is a great pleasure to follow the noble Lord, Lord Stevenson. I am grateful to him, the noble Lord, Lord Clement-Jones, and the noble Viscount, Lord Colville of Culross, for their support for my amendments, which I will come to in a moment. Before I do, I know that my noble friend Lord Moylan will be very disappointed not to be here for the start of this debate. From the conversation I had with him last week when we were deliberating the Bill, I know that he is detained on committee business away from the House. That is what is keeping him today; I hope he may join us a bit later.
Before I get into the detail of my amendments, I want to take a step back and look at the bigger picture. I remind noble Lords that on the first day in Committee, when we discussed the purpose of the Bill, one of the points I made was that, in my view, the Bill is about increasing big tech’s accountability to the public. For too long, and I am not saying anything that is new or novel here, it has enjoyed power beyond anything that other media organisations have enjoyed—including the broadcasters, which, as we know, have been subject to regulation for a long time now. I say that because, in my mind, the fundamental problem this legislation seeks to address is the lack of accountability of social media and tech platforms to citizens and users for the power and influence they have over our lives and society, as well as their economic impact. The latter will be addressed via the Digital Markets, Competition and Consumers Bill.
I emphasise the question of what the problem is because, when we talk about this part of the Bill and the amendments we have tabled, we have started—and I am as guilty of this as anyone else—to frame it very much as if the problem were the powers for the Secretary of State. In my view, we need to think about why those powers, in the way they are currently proposed, are not the right solution to the problem that I have outlined.
I do not think what we should be doing, as some of what is proposed in the Bill tends to do, is shift the democratic deficit from big tech to the regulator, although, of course, like all regulators, Ofcom must serve the public interest as a whole, which means taking everyone’s expectations seriously in the way in which it goes about its work.
That kind of analysis of the problem is probably behind some of what the Government are proposing by way of greater powers for the Secretary of State for oversight and direction of the regulator in what is, as we have heard, a novel regulatory space. I think that the problem with some, although not all, of the new powers proposed for the Secretary of State is that they would undermine the independence of Ofcom and therefore dilute the regulator’s authority over the social media and tech platforms, and that is in addition to what the noble Lord, Lord Stevenson, has already said, which is that there is a fundamental principle about the independence of media regulators in the western world that we also need to uphold and to which the Government have already subscribed.
If that is the bigger picture, my amendments would redress the balance between the regulator and the Executive, but there remains the vital role of Parliament, which I will come back to in a moment and which the noble Lord, Lord Stevenson, has already touched on, because that is where we need to beef up oversight of regulators.
Before I get into the detail, I should also add that my amendments have the full authority of your Lordships’ Communications and Digital Select Committee, which I have the great honour of chairing. In January, we took evidence from my noble friend the Minister and his colleague, Paul Scully, and our amendments are the result of that evidence. I have to say that my noble friend on the Front Bench is someone for whom I have huge respect and admiration, but on that day when the Ministers were before us, we found as a committee that the Government’s evidence in respect of the powers that they were proposing for the Secretary of State was not that convincing.
I shall outline the amendments, starting with Amendments 113, 114, and 115. I am grateful to other noble Lords who have signed them, which demonstrates support from around the House. The Bill allows the Secretary of State to direct Ofcom to change its codes of practice on regulating social media firms for reasons of public policy. While it is legitimate for the Government to set strategic direction, this goes further and amounts to direct and unnecessary interference. The Government have suggested clarifying this clause, as we have heard, with a list of issues such as security, foreign policy, economic policy and burden to business, but it is our view as a committee that the list of items is so vague and expansive that almost anything could be included in it. Nor does it recognise the fact that the Government should respect the separation of powers between Executive and regulator in the first place, as I have already described. These amendments would therefore remove the Secretary of State’s power to direct Ofcom for reasons of public policy. Instead, the Secretary of State may write to Ofcom with non-binding observations on issues of security and child safety to which it must have regard. It is worth noting that under Clause 156 the Secretary of State still has powers to direct Ofcom in special circumstances to address threats to public health, safety and security, so the Government will not be left toothless, although I note that the noble Lord, Lord Stevenson, is proposing to remove Clause 156. Just to be clear, the committee is not proposing removing Clause 156; that is a place where the noble Lord and I propose different remedies.
Amendments 117 and 118 are about limiting the risk of infinite ping-pong. As part of its implementation work, Ofcom will have to develop codes of practice, but the Government can reject those proposals infinitely if they disagree with them. At the moment that would all happen behind closed doors. In theory, this process could go on for ever, with no parliamentary oversight. The Select Committee and I struggle to understand why the Government see this power as necessary, so our amendments would remove the Secretary of State’s power to issue unlimited directions to Ofcom on a draft code of practice, replacing it with a maximum of two exchanges of letters.
Amendment 120, also supported by the noble Lords I referred to earlier, is closely related to previous amendments. It is designed to improve parliamentary oversight of Ofcom’s draft codes of practice. Given the novel nature of the proposals to regulate the online world, we need to ensure that the Government and Ofcom have the space and flexibility to develop and adapt their proposals accordingly, but there needs to be a role for Parliament in scrutinising that work and being able to hold the Executive and regulator to account where needed. The amendment would ensure that the affirmative procedure, and not the negative procedure currently proposed in the Bill, was used to approve Ofcom’s codes of practice if they had been subject to attempts by the Secretary of State to introduce changes. This amendment is also supported by the Delegated Powers and Regulatory Reform Committee in its report.
Finally, Amendment 257 would remove paragraph (a) from Clause 157(1). This is closely related to previous amendments regarding the Secretary of State’s powers. The clause currently gives the Secretary of State power to issue wide-ranging guidance to Ofcom about how it carries out its work. This power is expansive and poorly defined, and the committee again struggled to see the necessity for it. The Secretary of State already has extensive powers to set strategic priorities for Ofcom, establish expert advisory committees, direct action in special circumstances, direct Ofcom about its codes or just write to it if my amendments are accepted, give guidance to Ofcom about its media literacy work, change definitions, and require Ofcom to review its codes and undertake a comprehensive review of the entire online safety regime. Including yet another power to give unlimited guidance to Ofcom about how it should carry out its work seems unnecessary and intrusive, so this amendment would remove it.
I hope noble Lords can see that, even after taking account of the amendments that the committee is proposing, the Secretary of State would be left with substantial and suitable powers to discharge their responsibilities properly.
Perhaps I may comment on some of the amendments to which I have not added my name. Amendment 110 from the noble Lords, Lord Stevenson and Lord Clement-Jones, and Amendment 290 from the noble Lord, Lord Stevenson, are about parliamentary oversight by Select Committees. I do not support the detail of these amendments nor the procedures proposed, because I believe they are potentially too cumbersome and could cause too much delay to various processes. As I have already said, and as the noble Lord, Lord Stevenson, said in opening, the Select Committee and I are concerned to ensure that there is adequate parliamentary oversight of Ofcom as it implements this legislation over the next few years. My committee clearly has a role in this, alongside the new DSIT Select Committee in the House of Commons and perhaps others, but we need to guard against duplication and fragmentation.
When we publish the wording, we will rightly have an opportunity to discuss it before the debate on Report. I will be happy to discuss it with noble Lords then. On the broader points about economic policy, that is a competency of His Majesty’s Government, not an area of focus for Ofcom. If the Government had access to additional information that led them to believe that a code of practice as drafted could have a significant, disproportionate and adverse effect on the livelihoods of the British people or on the broader economy, and if it met the test for exceptional circumstances, taking action via a direction from the Secretary of State could be warranted. I will happily discuss that when my noble friend and others see the wording of the changes we will bring on Report. I am sure we will scrutinise that properly, as we should.
I was about to say that, in addition to the commitment we have already made, in the light of the debate today we will also consider whether transparency about the use of this power could be increased further, while retaining the important need for government oversight of issues that are genuinely beyond Ofcom’s remit. I am conscious that, as my noble friend Lady Stowell politely said, I did not convince her or your Lordships’ committee when I appeared before it with my honourable friend Paul Scully. I am happy to continue our discussions and I hope that we may reach some understanding on this important area.
I am sorry to interrupt, but may I clarify what my noble friend just said? I think he said that, although he is open to increasing the transparency of the procedure, he is not conceding a change from a power of direction to a letter offering guidance to which Ofcom must have regard. Is he willing to consider that as well?
My Lords, I also failed to stand up before the noble Lord, Lord Allan, did. I too am always slightly nervous to speak before or after him for fear of not having the detailed knowledge that he does. There have been so many powerful speeches in this group. I will try to speak swiftly.
My role in this amendment was predefined for me by the noble Baroness, Lady Kidron, as the midwife. I have spent many hours debating these amendments with my noble friend Lord Bethell, the noble Baroness, Lady Kidron, and with many noble Lords who have already spoken in this debate. I think it is very clear from the debate why it is so important to put a definition of age assurance and age verification on the face of the Bill. People feel so passionately about this subject. We are creating the digital legal scaffolding, so being really clear what we mean by the words matters. It really matters and we have seen it mattering even in the course of this debate.
My two friends—they are my friends—the noble Baroness, Lady Kidron, and my noble friend Lord Bethell both used the word “proportionate”, with one not wanting us to be proportionate and the other wanting us to be proportionate. Yet, both have their names to the same amendment. I thought it might be helpful to explain what I think they both mean—I am sure they will interrupt me if I get this wrong—and explain why the words of the amendment matter so much.
Age assurance should not be proportionate for pornography. It should be the highest possible bar. We should do everything in our power to stop children seeing it, whether it is on a specific porn site or on any other site. We do not want our children to see pornography; we are all agreed on that. There should not be anything proportionate about that. It should be the highest bar. Whether “beyond reasonable doubt” is the right wording or it should instead be “the highest possible bar practically achievable”, I do not know. I would be very keen to hear my noble friend the Minister’s thoughts on what the right wording is because, surely, we are all clear it should be disproportionate; it should absolutely be the highest bar we can set.
Equally, age assurance is not just about pornography, as the noble Lord, Lord Allan, has said. We need to have a proportionate approach. We need a ladder where age assurance for pornography sits at the top, and where we are making sure that nine year-olds cannot access social media sites if they are age-rated for 13. We all know that we can go into any primary school classroom in the land and find that the majority of nine year-olds are on social media. We do not have good age assurance further down.
As both the noble Lord, Lord Allan, and the noble Baroness, Lady Kidron, have said, we need age assurance to enable providers to adapt the experience to make it age-appropriate for children on services we want children to use. It needs to be both proportionate and disproportionate, and that needs to be defined on the face of the Bill. If we do not, I fear that we will fall into the trap that the noble Lord, Lord Allan, mentioned: the cookie trap. We will have very well-intentioned work that will not protect children and will go against the very thing that we are all looking for.
In my role as the pragmatic midwife, I implore my noble friend the Minister to hear what we are all saying and to help us between Committee and Report, so that we can come back together with a clear definition of age assurance and age verification on the face of the Bill that we can all support.
My Lords, about half an hour ago I decided I would not speak, but as we have now got to this point, I thought I might as well say what I was going to say after all. I reassure noble Lords that in Committee it is perfectly permissible to speak after the winder, so no one is breaking any procedural convention. That said, I will be very brief.
My first purpose in rising is to honour a commitment I made last week when I spoke against the violence against women and girls code. I said that I would none the less be more sympathetic to and supportive of stronger restrictions preventing child access to pornography, so I want to get my support on the record and honour that commitment in this context.
My noble friend Lady Harding spoke on the last group about bringing our previous experiences to bear when contributing to some of these issues. As I may have said in the context of other amendments earlier in Committee, as a former regulator, I know that one of the important guiding principles is to ensure that you regulate for a reason. It is very easy for regulators to have a set of rules. The noble Baroness, Lady Kidron, referred to rules of the road for the tech companies to follow. It is very easy for regulators to examine whether those rules are being followed and, having decided that they are, to say that they have discharged their responsibility. That is not good enough. There must be a result, an outcome from that. As the noble Lord, Lord Allan, emphasised, this must be about outcomes and intended benefits.
I support making it clear in the Bill that, as my noble friend Lady Harding said, we are seeking, disproportionately, to prevent children from accessing pornography. We will do all we can to ensure that that happens, and it should happen because the rules are in place. Ofcom should be clear on that. However, I also support a proportionate approach to age assurance in all other contexts, as has been described. Therefore, I support the amendments tabled by the noble Baroness, Lady Kidron, and my noble friend Lord Bethell, and the role my noble friend Lady Harding has played in arriving at a pragmatic solution.
My Lords, it is a privilege to be in your Lordships’ House, and on some occasions it all comes together and we experience a series of debates and discussions that we perhaps would never have otherwise reached, and at a level which I doubt could be echoed anywhere else in the world. This is one of those days. We take for granted that every now and again, we get one of these rapturous occasions when everything comes together, but we forget the cost of that. I pay tribute, as others have, to the noble Baroness, Lady Kidron. She has worked so hard on this issue and lots of other issues relating to this Bill and has exhausted herself more times than is right for someone of her still youthful age. I am very pleased that she is going off on holiday and will not be with us for a few days; I wish her well. I am joking slightly, but I mean it sincerely when I say that we have had a very high-quality debate. That it has gone on rather later than the Whips would have wanted is tough, because it has been great to hear and be part of. However, I will be brief.
It was such a good debate that I felt a tension, in that everybody wanted to get in and say what they wanted to say to be sure they were on the record. That can sometimes be a disaster, because everyone repeats everything, but as the noble Baroness, Lady Harding, said, we know our roles, we know what to say and when to say it, and it has come together very nicely. Again, we should congratulate ourselves on that. However, we must be careful about something which we keep saying to each other but sometimes do not do. This is a Bill about systems, not content. The more that we get into the content issues, the more difficult it is to remember what the Bill can do and what the regulator will be able to do if we get the Bill to the right place. We must be sure about that.
I want to say just a few things about where we need to go with this. As most noble Lords have said, we need certainty: if we want to protect our children, we have to be able to identify them. We should not be in any doubt about that; there is no doubt that we must do it, whatever it takes. The noble Lord, Lord Allan, is right to say that we are in the midst of an emerging set of technologies, and there will be other things coming down the line. The Bill must keep open to that; it must not be technology-specific, but we must be certain of what this part is about, and it must drill down to that. I come back to the idea of proportionality: we want everybody who is 18 or under to be identifiable as such, and we want to be absolutely clear about that. I like the idea that this should be focused on the phones and other equipment we use; if we can get to that level, it will be a step forward, although I doubt whether we are there yet.
Online Safety Bill Debate, Lords Chamber (1 year, 6 months ago)
My Lords, I support Amendment 227 in particular. I am pleased to contribute, as someone who gave evidence to the Leveson inquiry, explaining why social media should not be in scope for any new press regulation scheme. It is entertaining for me now to come through the looking glass and listen to the noble Lords, Lord Black of Brentwood and Lord Faulks, in particular making the kinds of argument I made then, as we discuss whether the press should be in scope for a new social media regulatory scheme.
These amendments are a helpful way to test how the Government expect their decision to afford certain privileges for online activity by journalists and news publishers to work. That is what the regime does, in effect: the rationale, as it was explained to us, is that certain bodies should be privileged when using user-to-user services and search engines in a way that they would not be if they were not afforded that status. Again, it is noteworthy that there has often been criticism of social media precisely for giving special treatment to some users, including in stories in some of the press that we are talking about, and here we are creating not just state sanction for such special treatment but a state-ordered two-tier system that all the social media companies will now have to adopt. That creates some interesting questions in itself.
I want to press the Minister primarily on definitions. It is certainly my experience that definitions of who is a journalist or a news media publisher are challenging and can be highly political. There have been several pressure points, pushing social media companies to try to define journalists and news publishers for themselves, outside of any regulatory scheme—notably following the disputes about misinformation and disinformation in the United States. The European Union also has a code of practice on misinformation and disinformation. Every time someone approaches this subject, they ask social media companies to try to distinguish journalists and news media from other publishers. So these efforts have been going on for some time, and many of them have run into disputes because there is no consistent agreement about who should be in or outside those regimes. This is one of those problems that seems clear and obvious when you stand back from it, but the more that you zoom in, the more complex and messy it becomes. We all say, “Oh yes, journalists and news publishers—that is fine”, and we write that in the legislation, but, in practice, it will be really hard when people have to make decisions about individuals.
Some news organisations are certainly highly problematic. Most terrorist organisations have news outlets and news agencies. They do not advertise themselves as such but, if you work in a social media platform, you have to learn to distinguish them. They are often presented entirely legitimately, and some of the information that you use to understand why they are problematic may be private, which creates all sorts of problems. Arguably, this is the Russia Today situation: it presented itself as legitimate and was registered with Ofcom for a period of time; we accepted that it was a legitimate news publisher, but we changed our view because we regard the Russian Government as a terrorist regime, in some senses. That is happening all of the time, with all sorts of bodies across the world that have created these news organisations. In the Middle East in particular, you have to be extraordinarily careful—you think that something is a news organisation but you then find that it has a Hezbollah connection and, there you go, you have to try to get rid of it. News organisations tied to extremist organisations are one problematic area, and my noble friend referred to it already.
There is also an issue with our domestic media environment. Certainly, most people would regard Gary Lineker as a journalist who works for a recognised news publisher—the BBC—but not everyone will agree with that definition. Equally, most people regard the gentleman who calls himself Tommy Robinson as not being a journalist; however much he protests that he is in front of judges and others, and however much support he has from recognised news publishers in the United States, most people would say that he is not a journalist. The community of people who agree that Gary Lineker is not a journalist and that of people who think that Tommy Robinson is not a journalist do not overlap much, but I make the point that there is continually this contention about individuals, and people have views about who should be in or out of any category that we create.
This is extraordinarily difficult, as in the Bill we are tasking online services with a very hard job. In a few lines of it, we say: “Create these special privileges for these people we call journalists and news publishers”. That is going to be really difficult for them to do in practice and they are going to make mistakes, either exclusionary or inclusionary. We are giving Ofcom an incredibly difficult role, which is why this debate is important: it is going to have to adjudicate when a journalist or news publisher says to Ofcom, “I think this online platform is breaching the Online Safety Act because of the way it treated me”. Ofcom is going to have to take a view about whether that organisation or individual is legitimate. Given the individuals I named, you can bet your bottom dollar that someone is going to go to Ofcom and say, “I don’t think that Gary Lineker or the BBC are legitimate”. That one should be quite easy; others across the spectrum will be much more difficult for it to deal with.
That is the primary logic underlying Amendment 227: we have known unknowns. There will be unanticipated effects of this legislation and, until it is in place and those decisions are being made, we do not know how it will work. Frankly, we do not know whether, as a result of legal trickery and regulatory decisions, we have inadvertently created a loophole where some people will be able to go and win court cases by claiming protections that we did not intend them to have. I disagree with the noble Lord, Lord Black: I do not think Amendment 227 undermines press freedom in any sense at all. All it does is to say: “We have created an Online Safety Bill. We expect it to enhance people’s safety and within it we have some known unknowns. We do not know how this exemption is going to work. Why not ask Ofcom to see if any of those unintended consequences happen?”
I know that we are labouring our way through the Online Safety Bill version 1, so we do not want to think about an online safety Bill version 2, but there will at some point have to be a revision. It is entirely rational and sensible that, having put this meaningful exemption in there—it has been defended, so I am sure that the Government will not want to give it up—the least we can do is to take a long, hard look, without interfering with press freedom, and get Ofcom to ask, “Did we see those unintended consequences? Do we need to look at the definitions again?”
My Lords, the noble Lord, Lord Allan, has clearly and comprehensively painted a picture of the complex world in which we now live, and I do not think that anybody can disagree with that or deny it. We are in a world which is going to keep evolving; we have talked in lots of other contexts about the pace of change, and so on. However, in recognising all that, what the noble Lord has just described—the need for constant evaluation of whether this regime is working effectively—is a job for Parliament, not for Ofcom. That is where I come back to in starting my response to this group of amendments.
Briefly—in order that we can get to the wind-ups and conclude business for the day—ensuring that recognised news publishers and organisations are not subject to Ofcom or any form of state regulation is a vital principle. I am pleased that the Government have included the safeguards which they have in the legislation, while also making it much harder for the tech platforms to restrict the freedom of recognised news publishers and users’ access to them.
I reiterate that I understand that this is becoming increasingly complicated, but these are important principles. We have to start in the world that we currently understand and know, ensure that we protect those publications which we recognise as trusted news providers now, and do not give way on those principles. As my noble friend Lord Black said, regarding debates about Section 40 of the Crime and Courts Act, there will be an opportunity to re-evaluate that in due course when we come to the media Bill. For what it is worth, my personal view is that I support the Government’s intention to remove it.
Online Safety Bill Debate
(1 year, 4 months ago)
Lords Chamber
My Lords, I broadly support all the amendments in this group but I will focus on the three amendments in the names of the noble Lord, Lord Russell, and others; I am grateful for their clear exposition of why these amendments are important. I draw particular attention to Amendment 281A and its helpful list of functions that are considered to be harmful and to encourage addiction.
There is a very important dimension to this Bill, whose object, as we have now established, is to encourage safety by design. An important aspect of it is cleaning up, and setting right, 20 years or more of tech development that has not been safe by design and has in fact been found to be harmful by way of design. As the noble Baroness, Lady Harding, just said, in many conversations and in talking to people about the Bill, one of the hardest things to communicate and get across is that this is about not only content but functionality. Amendment 281A provides a useful summary of the things that we know about in terms of the functions that cause harm. I add my voice to those encouraging the Minister and the Government to take careful note of it and to capture this list in the text of the Bill in some way so that this clean-up operation can be about not only content for the future but functionality and can underline the objectives that we have set for the Bill this afternoon.
My Lords, I start by saying amen—not to the right reverend Prelate but to my noble friend Lady Harding. She said that we should not assume that, just because charities exist, they are all doing good; as a former chair of the Charity Commission, I can say that that is very true.
The sponsors of Amendments 281 to 281B have made some powerful arguments in support of them. They are not why I decided to speak briefly on this group but, none the less, they made some strong points.
I come back to Amendments 28 to 30. Like others, I do not have a particular preference for which of the solutions is proposed to address this problem but I have been very much persuaded by the various correspondence that I have received—I am sure that other noble Lords have received such correspondence—which often uses Wikipedia as the example to illustrate the problem.
However, I take on board what my noble friend said: there is a danger of identifying one organisation and getting so constrained by it that we do not address the fundamental problem that the Bill is about, which is making sure that there is a way of appropriately excluding organisations that should not be subject to these various regulations because the regulations are not designed for them. I am open to the best way of doing that.
On the noble Baroness’s point, that is why I intervened in the debate: so that we are all clear that, for priority content, this is an amber light and not a red light. We are not saying, “Just remove all this stuff”; it would be a wrong response to the Bill to say, “It’s a fictional character being slaughtered so remove it”, because now we have removed “Twilight”, “Watership Down” and whatever else. We are saying, “Think very carefully”. If it is one of those circumstances where this is causing harm—they exist; we cannot pretend that they do not—it should be removed. However, the default should not be to remove everything on this list; that is the point I am really trying to make.
My Lords, our debate on this group is on the topic of priority harms to children. It is not one that I have engaged in, so I tread carefully. One reason why I have not engaged in this debate is that I have left it to people who know far more about it than I do; I have concentrated on other parts of the Bill.
In the context of this debate, one thing has come up on which I feel moved to make a short contribution: misinformation and disinformation content. There was an exchange between my noble friend Lady Harding and the noble Baroness, Lady Fox, on this issue. Because I have not engaged on the topic of priority harms, I genuinely do not have a position on what should and should not be featured. I would not want anybody to take what I say as support for or opposition to any of these amendments. However, it is important for us to acknowledge that, as much as misinformation and disinformation are critical issues—particularly for children and young people because, as the right reverend Prelate said, the truth matters—we cannot, in my view, ignore the fact that misinformation and disinformation have become quite political concepts. They get used in such a way that people often define things they do not agree with as misinformation—that is, opinions are becoming categorised as misinformation.
We are now putting this in legislation and it is having an impact on content, so it is important, too, that we do not just dismiss that kind of concern as not relevant because it is real. That is all I wanted to say.
My Lords, I will speak briefly as I know that we are waiting for a Statement.
If you talk to colleagues who know a great deal about the harm that is happening and the way in which platforms operate, as well as to colleagues who talk directly to the platforms, one thing that you commonly hear from them is a phrase that often recurs when they talk to senior people about some of the problems here: “I never thought of that before”. That is whether it is about favourites on Snapchat, which cause grief in friendship groups, about the fact that, when somebody leaves a WhatsApp group, it flags up who that person is—who wants to be seen as the person who took the decision to leave?—or about the fact that a child is recommended to other children even if the company does not know whether they are remotely similar.
If you are 13, you are introduced as a boy to Andrew Tate; if you are a girl, you might be introduced to a set of girls who may or may not share anorexia content, but they dog-whistle and blog. The companies are not deliberately orchestrating these outcomes—it is the way they are designed that is causing those consequences—but, at the moment, they take no responsibility for what is happening. We need to reflect on that.
I turn briefly to a meeting that the noble Lord, Lord Stevenson, and I were at yesterday afternoon, which leads neatly on to some of the comments the noble Baroness, Lady Fox, made, a few moments ago about the far right. The meeting was convened by Luke Pollard MP and was on the strange world known as the manosphere, which is the world of incels—involuntary celibates. As your Lordships may be aware, on various occasions, certain individuals who identify as that have committed murder and other crimes. It is a very strange world.
Online Safety Bill Debate
(1 year, 4 months ago)
Lords Chamber
That point is well made. In support of that, if the public space treated me in a discriminatory way, I would expect to have redress, but I do not think I have a right in every public space to say everything I like in the classic Article 8 sense. My right vis-à-vis the state is much broader than my right vis-à-vis any public space that I am operating in where norms apply as well as my basic legal rights. Again, to take the pub example, if I went in and made a racist speech, I may well be thrown out of the pub even though it is sub-criminal and the police are never called; they do not need to be as the space itself organises it.
I am making the point that terms of service are about managing these privately managed public services, and it would be a mistake to equate them entirely with our right to speak or the point at which the state can step in and censor us. I understand the point about state interference but it cuts both ways: both the state interfering in excessively censoring what we can say but also the state potentially interfering in the management of what is, after all, a private space. To refer back to the US first amendment tradition, a lot of that was about freedom of religion and precisely about enabling heterodoxy. The US did not want an orthodoxy in which one set of rules applied everywhere to everybody. Rather, it wanted people to have the right to dissent, including in ways that were exclusive. You could create your own religious sect and you could not be told not to have those beliefs.
Rolling that power over to the online world, online services, as long as they are non-discriminatory, can have quite different characters. Some will be very restrictive of speech like a restrictive religious sect; some will be very open and catholic, with a small “c”, in the sense of permitting a broad range of speech. I worry about some of the amendments in case there is a suggestion that Ofcom would start to tell a heterodox community of online services that there is an orthodox way to run their terms of service; I would rather allow this to be a more diverse environment.
Having expressed some concerns, I am, though, very sympathetic to Amendment 162 on Section 5 of the Public Order Act. I have tried in our debates to bring some real experience to this. There are two major concerns about the inclusion of the Public Order Act in the Bill. One is a lack of understanding of what that means. If you look at the face of the language that has been quoted at us, and go back to that small service that does not have a bunch of lawyers on tap, it reads as though it is stopping any kind of abusive content. Maybe you will google it, as I did earlier, and get a little thing back from the West Yorkshire Police. I googled: “Is it illegal to swear in the street?”. West Yorkshire Police said, “Yes, it is”. So if you are sitting somewhere googling to find out what this Public Order Act thing means, you might end up thinking, “Crikey, for UK users, I have to stop them swearing”. There is a real risk of misinterpretation.
The second risk is that of people deliberately gaming the system; again, I have a real-life example from working in one of the platforms. I had people from United Kingdom law enforcement asking us to remove content that was about demonstrations by far-right groups. They were groups I fundamentally disagree with, but their demonstrations did not appear to be illegal. The grounds cited were that, if you allow this content to go ahead and the demonstration happens, there will be a Public Order Act offence. Once you get that on official notepaper, you have to be quite robust to say, “No, I disagree”, which we did on occasion.
I think there will be other services that receive Public Order Act letters from people who seem official and they will be tempted to take down content that is entirely legal. The critical thing here is that that content will often be political. In other parts of the Bill, we are saying that we should protect political speech, yet we have a loophole here that risks that.
I am sure the Minister will not concede these amendments, but I hope he will concede that it is important that platforms are given guidance so that they do not think that somebody getting upset about a political demonstration is sufficient grounds to remove the content as a Public Order Act offence. If you are a local police officer it is much better to get rid of that EDL demonstration, so you write to the platform and it makes your life easier, but I do not think that would be great from a speech point of view.
Finally, I turn to the point made by the noble Lord, Lord Moylan, on Amendment 188 about the ECHR Article 8 exemption. As I read it, if your terms of service are not consistent with ECHR Article 8—and I do not think they will be for most platforms—you then get an exemption from all the other duties around appeals and enforcing them correctly. It is probably a probing amendment but it is a curious way of framing it; it essentially says that, if you are more restrictive, you get more freedom in terms of the Ofcom relationship. I am just curious about the detail of that amendment.
It is important that we have this debate and understand this relationship between the state, platforms and terms of service. I for one am persuaded that the general framework of the Bill makes sense; there are necessary and proportionate restrictions. I am strongly of the view that platforms should be allowed to be heterodox in their terms of service. Ofcom’s job is very much to make sure that they are done correctly but not to interfere with the content of those terms of service beyond that which is illegal. I am persuaded that we need to be extraordinarily careful about including Public Order Act offences; that particular amendment needs a good hearing.
My Lords, I have said several times when we have been debating this Bill—and I will probably say it again when we get to the group about powers—that, for me, the point of the Online Safety Bill is to address the absence of accountability for the extraordinary power that the platforms and search engines have over what we see online and, indeed, how we live and engage with each other online. Through this Bill, much greater responsibility for child safety will be placed on the platforms. That is a good thing; I have been very supportive of the measures to ensure that there are strong protections for children online.
The platforms will also have responsibility, though, for some measures to help adults protect themselves. We must not forget that the more responsibility platforms have to protect us, the more power we could inadvertently give them to influence what is an acceptable opinion to hold, or to shape society to such an extent that they can even start to influence what we believe to be right or wrong—that is the scale of the power we are talking about.
I was in the camp that was pleased when the Government removed the legal but harmful aspects of the Bill, because for me they represented a serious risk to freedom of expression. As I just described, I felt that they risked too much inadvertent power, as it were, going to the platforms. But, with the Government having done that, we have seen through the passage of the Bill some push-back, which is perfectly legitimate and understandable—I am not criticising anyone—from those who were concerned about that move. In response to that, the Government amended the Bill to provide assurances and clarifications on things like the user-empowerment tools. As I said, I do not have any problem with that: although I might not necessarily support some of the specific measures that were brought forward, I am okay with it as a matter of principle.
However, as was explained by my noble friend Lord Moylan and the noble Baroness, Lady Fox, there has not been a similar willingness from the Government to reassure those who remain concerned about the platforms’ power over freedom of expression. We have to bear in mind that some people’s concerns in this quarter remained even when the legal but harmful change was made—that is, the removal of legal but harmful was a positive step, but it did not go far enough for some people with concerns about freedom of expression.
I am sympathetic to the feeling behind this group, which was expressed by my noble friend and the noble Baroness, Lady Fox. I am sympathetic to many of the amendments. As the noble Lord, Lord Allan of Hallam, pointed out, specifically Amendment 162 in relation to the Public Order Act seems worthy of further consideration by the Government. But the amendments in the group that caught my attention place a specific duty on Ofcom in regard to freedom of expression when drawing up or amending codes of practice or other guidance—these amendments are in my noble friend Lord Moylan’s name. When I looked at them, I did not think that they undermined anything else that the Government brought forward through the amendments to the Bill, as he said, but I thought that they would go a long way towards reinforcing the importance of freedom of expression as part of this regulatory framework—one that we expect Ofcom to attach serious importance to.
I take on board what the noble Lord, Lord Allan, said about the framework of this legislation being primarily about safeguarding and protection. The purpose of the Bill is not to enhance freedom of expression, but, throughout its passage, that has none the less always been a concern. It is right that the Government seek to balance these two competing fundamental principles. I ask whether more can be done—my noble friend pointed to the recommendations of the Equality and Human Rights Commission and how they reinforce some of what he proposed. I would like to think that my noble friend the Minister could give some greater thought to this.
As was said, it is to the Government’s credit how much they have moved on the Bill during its passage, particularly between Committee and Report. That was quite contrary to the sense that I think a lot of us felt during the early stages of our debates. It would be a shame if, once the Bill leaves the House, it is felt that the balance is not as fine—let me put it like that—as some people feel it needs to be. I just wanted to express some support and ask my noble friend the Minister to give this proper and serious consideration.
Online Safety Bill Debate
(1 year, 4 months ago)
Lords Chamber
My Lords, the amendments in this group consider regulatory accountability and the roles of Ofcom, the Government and Parliament in overseeing the new framework. The proposals include altering the powers of the Secretary of State to direct Ofcom, issue guidance to Ofcom and set strategic priorities. Ofcom’s operational independence is key to the success of this framework, but the regime must ensure that there is an appropriate level of accountability to government. Parliament will also have important functions, in particular scrutinising and approving the codes of practice which set out how platforms can comply with their duties and providing oversight of the Government’s powers.
I heard the strength of feeling expressed in Committee that the Bill’s existing provisions did not get this balance quite right and have tabled amendments to address this. Amendments 129, 134 to 138, 142, 143, 146 and 147 make three important changes to the power for the Secretary of State to direct Ofcom to modify a draft code of practice. First, these amendments replace the public policy wording in Clause 39(1)(a) with a more defined list of reasons for which the Secretary of State can make a direction. This list comprises: national security, public safety, public health and the UK’s international obligations. This is similar to the list set out in a Written Ministerial Statement made last July but omits “economic policy” and “burden to business”.
This closely aligns the reasons in the Bill with the existing power in Section 5 of the Communications Act 2003. The power is limited to those areas genuinely beyond Ofcom’s remit as a regulator and where the Secretary of State might have access to information or expertise that the regulator does not. Secondly, the amendments clarify that the power will be used only for exceptional reasons. As noble Lords know, this has always been our intent and the changes we are tabling today put this beyond doubt. Thirdly, the amendments increase the transparency regarding the use of the power by requiring the Secretary of State to publish details of a direction at the time the power is used. This will ensure that Parliament has advance sight of modifications to a code and I hope will address concerns that several directions could be made on a single code before Parliament became aware.
This group also considers Amendments 131 to 133, which create an 18-month statutory deadline for Ofcom to submit draft codes of practice to the Secretary of State to be laid in Parliament relating to illegal content, safety duties protecting children and other cross-cutting duties. These amendments sit alongside Amendment 230, which we debated on Monday and which introduced the same deadline for Ofcom’s guidance on Part 5 of the regime.
I am particularly grateful to my noble friend Lady Stowell of Beeston, with whom I have had the opportunity to discuss these amendments in some detail as they follow up points that she and the members of her committee gave particular attention to. I beg to move.
My Lords, I will speak to the amendments in this group in my name: Amendments 139, 140, 144 and 145. I thank the noble Lords, Lord Stevenson and Lord Clement-Jones, and the noble Viscount, Lord Colville, for signing those amendments and for their continued support on this group. I am also grateful to my noble friend the Minister and his team for engaging with me on the issue of Secretary of State powers. He has devoted a lot of time and energy to this, which is reflected in the wide-ranging group of amendments tabled by him.
Before I go any further, it is worth emphasising that the underlying concern here is making sure that we have confidence, through this new regulatory regime, that the Bill strikes the right balance of power between government, Parliament, the regulator and big tech firms. The committee that I chair—the Communications and Digital Select Committee of your Lordships’ House—has focused most on that in our consideration of the Bill. I should also say that the amendments in my name very much have the support of the committee.
These amendments relate to Clause 39, which is where the main issue lies in the context of Secretary of State powers, and we have three broad concerns. First, as it stood, the Bill handed the Secretary of State unprecedented powers to direct the regulator on pretty much anything. Secondly, these powers allowed the Government to conduct an infinite form of ping-pong with the regulator, enabling the Government to prevail in a dispute. Thirdly, this ping-pong could take place in private, with no possibility of Parliament exercising oversight or intervening, as would be appropriate in the event of a breakdown in the relationship between executive and regulator.
This matters because the Online Safety Bill creates a novel form of regulation for the internet and what we can or cannot see online, in particular political speech, and it will apply well into the future. It is one thing for the current Government, who I support, to say that they would never use the powers in this way. That is great but, as we know, current Governments cannot speak for whoever is in power in the generations to come, so it is important that we get this right.
As my noble friend said, he has brought forward amendments to Clause 39 that help to address this. I support him in and commend him for that. The original laundry list of powers to direct Ofcom has been shortened and now follows the precedent set out in the Communications Act 2003. The government amendments also say that the Secretary of State must now publish their directions to Ofcom, which will improve transparency, and once the code is agreed Ofcom will publish changes so that Parliament can see what changes have been made and why. These are all very welcome and, as I say, they go a long way to addressing some of our concerns, but two critical issues remain.
First, the Government retain an opt-out, which means that they do not have to publish their directions if the Secretary of State believes that doing so would risk
“national security or public safety”,
or international relations. However, those points are now the precise grounds on which the Secretary of State may issue a direction and, if history is any guide, there is a real risk that we will never hear about the directions because the Government have decided that they are a security issue.
My Amendments 139 and 140 would require the Secretary of State to at least notify Parliament of the fact that a direction has been issued and what broad topic it relates to. That would not require any details to be published, so it does not compromise security, but it does give assurance that infinite, secretive ping-pong is not happening behind the scenes. My noble friend spoke so quickly at the beginning that I was not quite sure whether he signalled anything, but I hope that he may be able to respond enthusiastically to Amendments 139 and 140.
Secondly, the Government still have powers for infinite ping-pong. I appreciate that the Government have reservations about capping the number of exchanges between the Secretary of State and Ofcom, but they must also recognise the concern that they appear to be preparing the ground for any future Government to reject infinitely the regulator’s proposals and therefore prevail in a dispute about a politically contentious topic. My Amendments 144 and 145 would clarify that the Government will have a legally binding expectation that they will use no more than the bare minimum number of directions to achieve the intent set out in their first direction.
The Government might think that adding this to the Bill is superfluous, but it is necessary in order to give Parliament and the public confidence about the balance of power in this regime. If Parliament felt that the Secretary of State was acting inappropriately, we would have sufficient grounds to intervene. As I said, the Government acknowledged in our discussions the policy substance of these concerns, and as we heard from my noble friend the Minister in introducing this group, there is an understanding on this. For his part, there is perhaps a belief that what they have done goes far enough. I urge him to reconsider Amendments 144 and 145, and I hope that, when he responds to the debate on this group, he can say something about not only Amendments 139 and 140 but the other two amendments that will give me some grounds for comfort.
My Lords, I realise that I am something of a fish out of water in this House, as I was in Committee, on the Bill, which is fundamentally flawed in a number of respects, including its approach to governance, which we are discussing today. Having said that, I am generally sympathetic to the amendments proposed by my noble friend Lady Stowell of Beeston. If we are to have a flawed approach, her amendments would improve it somewhat.
However, my approach is rather different and is based on the fairly simple but important principle that we live in a free democracy. If we are to introduce a new legislative measure such as this Bill, which has far-reaching powers of censorship taking us back 70 or 80 years in terms of the freedom of expression we have been able to develop since the 1950s and 1960s—to the days of Lady Chatterley’s Lover and the Lord Chamberlain, in equivalent terms, as far as the internet and the online world are concerned—then decisions of such a far-reaching character affecting our lives should be taken by somebody who is democratically accountable.
My approach is utterly different from that which my noble friend on the Front Bench has proposed. He has proposed amendments which limit yet further the Secretary of State’s power to give directions to Ofcom, but the Secretary of State is the only party in that relationship who has a democratic accountability. We are transferring huge powers to a completely unaccountable regulator, and today my noble friend proposes transferring, in effect, even more powers to that unaccountable regulator.
To go back to a point that was discussed in Committee and earlier on Report, if Ofcom takes certain decisions which make it impossible for Wikipedia to operate its current model, such that it has to close down at least its minority language websites—my noble friend said that the Government have no say over that and no idea what Ofcom will do—to whom do members of the public protest? To whom do they offer their objections? There is no point writing to the Secretary of State because, as my noble friend told us, they will not have had any say in the matter and we in this House will have forsworn the opportunity, which I modestly proposed, to take those powers here. There is no point writing to their MP, because all their MP can do is badger the Secretary of State. It is a completely unaccountable structure that is completely indefensible in a modern democratic society. So I object to the amendments proposed by my noble friend, particularly Amendments 136 and 137.
I am grateful to my noble friend for his constructive response to my Amendments 139 and 140. I am sure he will do me the honour of allowing me to see the Government’s reversioning of my amendments before they are laid so that we can be confident at Third Reading that they are absolutely in line with expectations.
Could I press my noble friend a little further on Amendments 144 and 145? As I understood what he said, the objection from within government is to the language in the amendments I have tabled—although as my noble friend Lady Harding said, they are incredibly modest in their nature.
I was not sure whether my noble friend was saying in his defence against accepting them that issuing a direction would have to be exceptional, and that that led to a need to clarify that this would be ongoing. Would each time there is a ping or a pong be exceptional? Forgive me, because it starts to sound a bit ridiculous when we get into this amount of detail, but it seems to me that the “exceptional” issue kicks in at the point where you issue the direction. Once you engage in a dialogue, “exceptional” is no longer really the issue. It is an odd defence against trying to limit the number of times you allow that dialogue to continue. Bearing in mind that he is willing to look again at Amendments 139 and 140, I wonder whether, between now and Third Reading, he would at least ask parliamentary counsel to look again at the language in my original amendment.
I am certainly happy to commit to showing my noble friend the tidying up that we think is necessary to the two amendments I said we are happy to accept, ahead of Third Reading. On the others, as I said, the code could be delayed repeatedly only if the Secretary of State showed that there remained exceptional reasons once it had been modified, and that high bar would need to be met each time. So we do not agree with her Amendments 144 and 145 because of concerns about the drafting of my noble friend’s current amendment and because the government amendments we have brought forward cater for the scenario about which she is concerned. Her amendments would place a constraint on the Secretary of State not to give more directions than are necessary to achieve the objectives set out in the original direction, but they would not achieve the intent I think my noble friend has. The Bill does not require the direction to have a particular objective. Directions are made because the Secretary of State believes that modifications are necessary for exceptional reasons, and the direction must set out the reasons why the Secretary of State believes that a draft should be modified.
Through the amendments the Government have laid today, the direction would have to be for exceptional reasons relating to a narrower list and Parliament would be made aware each time a direction was made. Parliament would also have increased scrutiny in cases where a direction had been made under Clause 39(1)(a), because of the affirmative procedure. However, I am very happy to keep talking to my noble friend, as we will be on the other amendments, so we can carry on our conversation then if she wishes.
Let me say a bit about the amendments tabled by my noble friend Lord Moylan. His Amendment 218 would require the draft statement of strategic priorities laid before Parliament to be approved by resolution of each House. As we discussed in Committee, the statement of strategic priorities is necessary because future technological changes are likely to shape harms online, and the Government must have an avenue through which to state their strategic priorities in relation to these emerging technologies.
The Bill already requires the Secretary of State to consult Ofcom and other appropriate persons when preparing a statement. This provides an opportunity for consideration and scrutiny of a draft statement, including, for example, by committees of Parliament. This process, combined with the negative procedure, provides an appropriate level of scrutiny and is in line with comparable existing arrangements in the Communications Act in relation to telecommunications, the management of radio spectrum and postal services.
My noble friend’s other amendments would place additional requirements on the Secretary of State’s power to issue non-binding guidance to Ofcom about the exercise of its online safety functions. The guidance document itself does not create any statutory requirements—Ofcom is required only to have regard to the guidance—and on that basis, we do not agree that it is necessary to subject it to parliamentary approval as a piece of secondary legislation. As my noble friend Lady Harding of Winscombe pointed out, we do not require that in numerous other areas of the economy, and we do not think it necessary here.
Let me reassure my noble friend Lord Moylan on the many ways in which Parliament will be able to scrutinise the work of Ofcom. Like most other regulators, it is accountable to Parliament in how it exercises its functions. The Secretary of State is required to present its annual report and accounts before both Houses. Ministers from the devolved Administrations must also lay a copy of the report before their respective Parliament or Assembly. Ofcom’s officers can be required to appear before Select Committees to answer questions about its work; indeed, its chairman and chief executive appeared before your Lordships’ Communications and Digital Committee just yesterday. Parliament will also have a role in approving a number of aspects of the regulatory framework through its scrutiny of both primary and secondary legislation.
My Lords, I am completely opposed to Amendments 159 and 160, but the noble Lords, Lord Faulks and Lord Black, and the noble Viscount, Lord Colville, have explained the issues perfectly. I am fully in agreement with what they said. I spoke at length in Committee on that very topic. This is a debate we will undoubtedly come back to in the media Bill. I, for one, am extremely disappointed that the Labour Party has said that it will not repeal Section 40. I am sure that these issues will get an airing elsewhere. As this is a speech-limiting piece of legislation, as was admitted earlier this week, I do not want any more speech limiting. I certainly do not want it to be a media freedom-limiting piece of legislation on top of that.
I want to talk mainly about the other amendments, Amendments 158 and 161, but approach them from a completely different angle from the noble Lord, Lord Allan of Hallam. What is the thinking behind saying that the only people who can clip content from recognised news publishers are the news publishers? The Minister mentioned in passing that there might be a problem of editing them, but it has become common practice these days for members of the public to clip from recognised news publishers and make comments. Is that not going to be allowed? That was the bit that completely confused me. It is too prescriptive; I can see all sorts of people getting caught by that.
The point that the noble Lord, Lord Allan of Hallam, made about what constitutes a recognised news publisher is where the issue gets quite difficult. The point was made about the “wrong” organisations, but I want to know who decides what is right and wrong. We might all nod along when it comes to Infowars and RT, but there are lots of organisations that would potentially fail that test. My concern is that they would not be able to appeal when they are legitimate news organisations, even if not to everybody’s taste. Because I think that we already have too much speech limiting in the Bill, I do not want any more. This is important.
When it comes to talking about the “wrong” organisations, I noticed that the noble Lord, Lord McNally, referred to people who went to Rupert Murdoch’s parties. I declare my interests here: I have never been invited or been to a Rupert Murdoch party—although do feel free, I say, if he is watching—but I have read about them in newspapers. For some people in this Chamber, the “wrong” kind of news organisation is, for example, the Times or one with the wrong kind of owner. The idea that we will all agree or know which news publishers are the “wrong” kind is not clear, and I do not think that the test is going to sort it out.
Will the Minister explain what organisations can do if they fail the recognised news publisher test to appeal and say, “We are legitimate and should be allowed”? Why is there this idea that a member of the public cannot clip a recognised news publisher’s content without falling foul? Why would they not be given some exemption? I genuinely do not understand that.
My Lords, I shall speak very briefly. I feel a responsibility to speak, having spoken in Committee on a similar group of amendments when the noble Lords, Lord Lipsey and Lord McNally, were not available. I spoke against their amendments then and would do so again. I align myself with the comments of my noble friend Lord Black, the noble Lord, Lord Faulks, and the noble Viscount, Lord Colville. As the noble Baroness, Lady Fox, just said, they gave a comprehensive justification for that position. I have no intention of repeating it, or indeed repeating my arguments in Committee, but I think it is worth stating my position.
My Lords, we have heard some very well-rehearsed lines during the debate today, with the usual protagonists. Nevertheless, the truth of the matter is that the Press Recognition Panel is as frustrated as many of us on these Benches and other Benches at the failure to implement a post-Leveson scheme of press regulation. Despite many efforts, it has never been fully put into effect.
I do not think I need to repeat a great deal of what has been said today. For instance, the record of IPSO, which the noble Lord, Lord Faulks, talked about, has been very well tracked by Hacked Off. This is not a proposal for state regulation—which is so often, if you like, the canard placed on it.
If not this Bill, which Bill? The media Bill is not going to tackle issues such as this, as my noble friend Lord McNally said. As the noble Lord, Lord Stevenson, has pointed out, this Bill has been a series of conversations —extremely fruitful conversations—but in this particular direction it has borne no fruit at all.
I must admit that, throughout my looking at the draft Bill and continuing to look through its various versions, this opt-out for news publishers has remained a puzzle. The below-the-line opt-out for the mainstream news media always strikes me as strange, because there is no qualification that there should be any curation of that below-the-line, user-generated content. That is peculiar, and it is rather like somebody in the last chance saloon being rewarded with a bouquet. It seems a rather extraordinary provision.
My noble friend Lord Allan rightly pointed to some of the dangers in the new provisions, and indeed in the provisions generally, for these services. I hope the Minister has at least some answers to give to the questions he raised. Progress on this and the scheme that the PRP was set up to oversee, which is still not in place, remain a source of great division across the parties and within them. There is still hope; it may be that under a different Government we would see a different result.
Online Safety Bill Debate
(1 year, 4 months ago)
Lords Chamber
My Lords, I offer my support to the amendment. I spent some time arguing in the retained EU law Bill for increased parliamentary scrutiny. My various amendments did not succeed but at the end of the day—on the final day of ping-pong—the Minister, the noble Lord, Lord Callanan, gave certain assurances based on what is in Schedule 5 to that Act, as it now is, involving scrutiny through committees. So the basic scheme which my noble kinsman has proposed is one which has a certain amount of precedent—although it is not an exact precedent; what might have been the “Callanan rule” is still open to reconstruction as the “Parkinson rule”. I support the amendment in principle.
My Lords, as the noble Lords, Lord Stevenson and Lord Clement-Jones, have already said, the Communications and Digital Select Committee did indeed recommend a new Joint Committee of both Houses to look specifically at the various aspects of Ofcom’s implementation of what will be the Online Safety Act and ongoing regulation of digital matters. It is something I still have a lot of sympathy for. However, there has not been much appetite for such a Joint Committee at the other end of the Corridor. I do not necessarily think we should give up on that, and I will come back to it in a moment, but in place of that, I am not keen on what is proposed in Amendment 239, because my fear about how it is laid out is that it introduces something a bit too burdensome and would probably cause too much delay in implementation.
To return to the bigger question, I think that we as parliamentarians need to reflect on our oversight of regulators, to which we are delegating significant new powers and requiring them to adopt a much more principles-based approach to regulation to cope with the fast pace of change in the technological world. We have to reflect on whether our current set-up is adequate for the way in which that is changing. What I have in mind is very much a strategic level of oversight, rather than scrutinising operational decisions, although, notwithstanding what the noble Lord has said, something specific in terms of implementation of the Bill and other new legislation is an area I would certainly wish to explore further.
The other aspect of this is making sure that our regulators keep pace too—not just with technology, applying the new powers we give them in a way which meets our original intentions, but with the new political dynamics. Earlier today in your Lordships’ Chamber, there was a Question about how banks are dealing with political issues, and that raises questions about how the FCA is regulating the banking community. We must not forget that the Bill is about regulating content, and that makes it ever more sensitive. We need to keep reminding ourselves about this; it is very new and very different.
As has been acknowledged, there will continue to be a role for the Communications and Digital Select Committee, which I have the great privilege of chairing, in overseeing Ofcom. My noble friend Lord Grade and Dame Melanie Dawes appeared before us only a week ago. There is a role for the SIT Committee in the Commons; there is probably also some kind of ongoing role for the DCMS Select Committee there too, though I am not sure. In a way, the fractured nature of that oversight makes it all the more critical that we join up a bit more. So I will take it upon myself to give this more thought and speak to the respective chairs of those committees in the other place, but I think that at some point we will need to consider, in some other fora, the way in which we are overseeing the work of regulators.
At some point, I think we will need to address the specific recommendations in the pre-legislative committee’s report, which were very much in line with what my own committee thought was right for the future of digital regulatory oversight, but on this occasion, I will not be supporting the specifics of Amendment 239.
My Lords, very briefly, I was pleased to see this, in whatever form it takes, because as we finish off the Bill, one thing that has come up consistently is that some of us have raised problems of potential unintended consequences, such as whether age gating will lead to a huge invasion of the privacy of adults rather than just narrowly protecting children, or whether the powers given to Ofcom will turn it into the most important and powerful regulator in the country, if not in Europe. In a highly complex Bill, is it possible for us to keep our eye on it a bit more than just by whingeing on the sidelines?
The noble Baroness, Lady Stowell, makes a very important point about the issue in relation to the FCA and banking. Nobody intended that to be the outcome of PEPs, for example, and nobody intended when they suggested encouraging banks to have values such as ESG or EDI—equality, diversity and inclusion—that that would lead to ordinary citizens of this country being threatened with having their banking turned off. It is too late to then retrospectively say, “That wasn’t what we ever intended”.
My Lords, I promise to speak very briefly. I welcome the Government’s amendments. I particularly welcome that they appear to mirror partly some of the safeguards that are embedded in the Investigatory Powers Act 2016.
I have one question for my noble friend the Minister about the wording, “a skilled person”. I am worried that “a skilled person” is a very vague term. I have been taken all through the course of this Bill by the comparison with the Investigatory Powers Act and the need to think carefully about how we balance the importance of privacy with the imperative of protecting our children and being able to track down the most evil and wicked perpetrators online. That is very similar to the debates that we had here several years ago on the Investigatory Powers Act.
The IPA created the Technical Advisory Board. It is not a decision-making body. Its purpose is to advise the Investigatory Powers Commissioner and judicial commissioners on the impact of changing technology and the development of techniques to use investigatory powers while maintaining privacy. It is an expert panel constituted to advise the regulator—in this case, the judicial commissioner—specifically on technology interventions that must balance this really difficult trade-off between privacy and child protection. Why have we not followed the same recipe? Rather than having a skilled person, why would we not have a technology advisory panel of similar standing, where it is clear to all who the members are? Those members would be required to produce a regular report. It might not need to be as regular as the IPA one, but it would just take what the Government have already laid one step further towards institutionalising the independent check that is really important if these Ofcom powers were ever to be used.
My Lords, I added my name to some amendments on this issue in Committee. I have not done so on Report, not least because I have been so occupied with other things and have not had the time to focus on this. However, I remain concerned about this part of the Bill. I am sympathetic to my noble friend Lord Moylan’s Amendment 255, but listening to this debate and studying all the amendments in this group, I am a little confused and so have some simple questions.
First, I heard my noble friend the Minister say that the Government have no intention to require the platforms to carry out general monitoring, but is that now specific in any of the amendments that he has tabled? Regarding the amendments which would bring further safeguards around the oversight of Ofcom’s use of this power, like my noble friend Lady Harding, I have always been concerned that the oversight approach should be in line with that for the Investigatory Powers Act and could never understand why it was not in the original version of the Bill. Like her, I am pleased that the Government have tabled some amendments, but I am not yet convinced that they go far enough.
That leads me to the amendments that have been tabled by the noble Lords, Lord Stevenson and Lord Clement-Jones, and particularly that in the name of the noble Lord, Lord Allan of Hallam. As his noble friend Lord Clement-Jones has added his name to it, perhaps he could answer my question when he gets up. Would the safeguards that are outlined there—the introduction of the Information Commissioner—meet the concerns of the big tech companies? Do we know whether it would meet their needs and therefore lead them not to feel it necessary to withdraw their services from the UK? I am keen to understand that.
There is another thing that might be of benefit for anyone listening to this debate who is not steeped in the detail of this Bill, and I look to any of those winding up to answer it—including my noble friend the Minister. Is this an end to end-to-end encryption? Is that what is happening in this Bill? Or is this about ensuring that what is already permissible in terms of the authorities being able to use their powers to go after suspected criminals is somehow codified in this Bill to make sure it has proper safeguards around it? That is still not clear. It would be very helpful to get that clarity from my noble friend, or others.
My Lords, it is a pleasure to follow the noble Baroness, Lady Stowell. My noble friend has spoken very cogently to Amendment 258ZA, and I say in answer to the question posed by the noble Baroness that I do not think this is designed to make big tech companies content. What it is designed to do is bring this out into the open and make it contestable; to see whether or not privacy is being invaded in these circumstances. To that extent it airs the issues and goes quite a long way towards allaying the concerns of those 80 organisations that we have heard from.
I am not going to repeat all the arguments of my noble friend, but many noble Lords, not least on the opposite Benches, have taken us through some of the potential security and privacy concerns which were also raised by my noble friends, and other reasons for us on these Benches putting forward these amendments. We recognise those concerns and indeed we recognise concerns on both sides. We have all received briefs from the NSPCC and the IWF, but I do not believe that what is essentially being proposed here in our amendments, or indeed by the amendments put forward by the noble Lord, Lord Stevenson, is designed in any way to prevent Ofcom doing its duty in relation to child sexual abuse and exploitation material in private messaging. We believe that review by the ICO to ensure that there is no invasion of privacy is a very useful mechanism.
We have all tried to find solutions, and the Minister has put forward his stab at this with the skilled person’s report. The trouble is that it does not go far enough, as the noble Baroness, Lady Stowell, said. Effectively, Ofcom can choose the skilled person and what the skilled person is asked to advise on. It is not necessarily comprehensive, and that is essentially the major flaw.
As regards the amendments put forward by the noble Lord, Lord Stevenson, it is interesting that the Equality and Human Rights Commission itself said:
“We are concerned by the extent and seriousness of CSEA content being shared online. But these proposed measures may be a disproportionate infringement on millions of individuals’ right to privacy where those individuals are not suspected of any wrongdoing”.
It goes on to say:
“We recommend that Ofcom should be required to apply to an independent judicial commissioner—as is the case for mass surveillance under the Investigatory Powers Act”.
I am sure that is the reason why the noble Lord, Lord Stevenson, put forward his amendments; if he put them to a vote, we would follow and support. Otherwise, we will put our own amendments to the House.
I am grateful to noble Lords for their further scrutiny of this important but complex area, and for the engagement that we have had in the days running up to it as well. We know how child sexual exploitation and abuse offenders sadly exploit private channels, and the great danger that this poses, and we know how crucial these channels are for secure communication. That is why, where necessary and proportionate, and where all the safeguards are met, it is right that Ofcom can require companies to take all technically feasible measures to remove this vile and illegal content.
The government amendments in this group will go further to ensure that a notice is well informed and targeted and does not unduly restrict users’ rights. Privacy and safety are not mutually exclusive—we can and must have both. The safety of our children depends on it.
I make it clear again that the Bill does not require companies to break or weaken end-to-end encryption on their services. Ofcom can require the use of technology on an end-to-end encrypted service only when it is technically feasible and has been assessed as meeting minimum standards of accuracy. When deciding whether to issue a notice, Ofcom will engage in continual dialogue with the company and identify reasonable, technically feasible solutions to the issues identified. As I said in opening, it is right that we require technology companies to use their considerable resources and expertise to develop the best possible protections to keep children safe in encrypted environments. They are well placed to innovate to find solutions that protect both the privacy of users and the safety of children.
Just to be clear, am I right in understanding my noble friend to be saying that there is currently no technology that would make it technically feasible for the tech companies to do what is being asked of them? Did he say that tech companies should be looking to develop the technology to do what may be required of them, but that it is not currently available to them?
For clarification, if the answer to that is that the technology does not exist—which I believe is correct, although there are various snake oil salespeople out there claiming that it does, as the noble Baroness, Lady Fox of Buckley, said—my noble friend seems to be saying that the providers and services should develop it. This seems rather circular, as the Bill says that they must adopt an approved technology, which suggests a technology that has been imposed on them. What if they cannot develop it but still receive such a notice? Is it possible that these powers will never be capable of being used, especially if the companies do not co-operate?
Online Safety Bill Debate
Baroness Stowell of Beeston (Conservative - Life peer), Department for Digital, Culture, Media & Sport
(1 year, 2 months ago)
Lords Chamber
I am very surprised that the Minister’s speech did not accede to the recommendations from the Delegated Powers and Regulatory Reform Committee, published last week in the report we made after we were forced to meet during the Recess because of the Government’s failure with this Bill. We want answers from his private office to what is set out in paragraphs 6 and 7:
“We urge the Minister to take the opportunity during the remaining stages of the Bill”—
which is today—
“to explain to the House”—
I will not read out the rest because it is quite clear. There are two issues—Henry VIII powers and skeleton legislation—and we require the Minister to accede to this report from a committee of the House.
I think that every member of the committee was present at the meeting on 29 August, the day after the bank holiday. We were forced to do that because the Government published amendments to Clauses 216 and 217 on 5 July but did not provide a delegated powers memorandum until 17 July, the date they were debated in this House. That prevented a committee of the House from being able to report to the House on the issue of delegated powers. We are not interested in policy; all we are looking at is the delegated powers. We agreed that one of us would be here—as it is not a policy issue—to ask the Minister to respond to the recommendations of this committee of the House. I am very surprised that he has not done that.
My Lords, I am very concerned to hear the contribution from the noble Lord, Lord Rooker. I certainly look forward to hearing what the Minister says in reply. I confess that I was not aware of the Delegated Powers and Regulatory Reform Committee’s report to which he referred, and I wish to make myself familiar with it. I hope that he gets a suitable response from the Minister when he comes to wind up.
I am very grateful to the Minister for the amendments he tabled to Clause 44—Amendments 1 and 2. As he said, they ensure that there is transparency in the way that the Secretary of State exercises her power to issue a direction to Ofcom over its codes of practice. I remind the House—I will not detain your Lordships for very long—that the Communications and Digital Select Committee, which I have the privilege to chair, was concerned about the original Clause 39 for three main reasons: first, as it stood, the Bill handed the Secretary of State unprecedented powers to direct the regulator on pretty much anything; secondly, those directions could be made without Parliament knowing; and, thirdly, the process of direction could involve a form of ping-pong between government and regulator that could go on indefinitely.
However, over the course of the Bill’s passage, and as a result of our debates, I am pleased to say that, taken as a package, the various amendments tabled by the Government—not just today but at earlier stages, including on Report—mean that our concerns have been met. The areas where the Secretary of State can issue a direction now follow the precedent set by the Communications Act 2003, and the test for issuing them is much higher. As of today, via these amendments, the directions must be published and laid before Parliament. That is critical and is what we asked for on Report. Also, via these amendments, if the Secretary of State has good reason not to publish—namely, if it could present a risk to national security—she will still be required to inform Parliament that the direction has been made and of the reasons for not publishing. Once the code is finalised and laid before Parliament for approval, Ofcom must publish what has changed as a result of the directions. I would have liked to see a further amendment limiting the number of exchanges, so that there is no danger of infinite ping-pong between government and regulator, but I am satisfied that, taken together, these amendments make the likelihood of that much lower, and the transparency we have achieved means that Parliament can intervene.
Finally, at the moment, the platforms and social media companies have a huge amount of unaccountable power. As I have said many times, for me, the Bill is about ensuring greater accountability to the public, but that cannot be achieved by simply shifting power from the platforms to a regulator. Proper accountability to the public means ensuring a proper balance of power between the corporations, the regulator, government and Parliament. The changes we have made to the Bill ensure the balance is now much better between government and the regulator. Where I still think we have work to do is on parliamentary oversight of the regulator, in which so much power is being invested. Parliamentary oversight is not a matter for legislation, but it is something we will need to return to. In the meantime, I once again thank the Minister and his officials for their engagement and for the amendments that have been made.
My Lords, I, too, thank the Minister for his engagement and for the amendments he has tabled at various stages throughout the passage of the Bill.
Amendment 15 provides a definition:
““age assurance” means age verification or age estimation”.
When the Minister winds up, could he provide details of the framework or timetable for its implementation? While we all accept that implementation must be delivered quickly, age verification provisions will be worthless unless there is swift enforcement action against those who transgress the Bill’s provisions. Will the Minister comment on enforcement and on an implementation framework, with direct reference to Amendment 15?
My Lords, I shall ask my noble friend the Minister a question about encryption but, before I do, I will briefly make a couple of other points. First, I echo all the tributes paid around the House to those involved in this legislation. It is no secret that I would have preferred the Bill to be about only child safety, so I particularly congratulate the Government, and the various Members who focused their efforts in that area, on what has been achieved via the Bill.
That said, the Government should still consider other non-legislative measures, such as banning smartphones in schools and government guidance for parents on matters such as the best age at which to allow their children to have their own smartphones. These may not be points for DCMS, but they are worth highlighting now, as the Bill leaves us, soon to become law.
As I said on Report, I remain concerned about the reintroduction of some protections for adults, in lieu of “legal but harmful”, without any corresponding amendments to reinforce to Ofcom that freedom of expression must be the top priority for adults. We now have to leave it to Ofcom and see what happens. I know that the current leadership is deeply conscious of its responsibilities.
On encryption, I was pleased to hear what my noble friend said when he responded to the debate at Third Reading. If he is saying that, because the technology does not exist, Clause 122 cannot, as it were, be deployed by Ofcom, does that mean that the oversight measures that currently exist would not be deployed either? As my noble friend will recall, one of the areas that we were still concerned about in the context of encryption was that what was in the Bill did not mirror what exists for RIPA. I am not sure whether that means that, because Clause 122 has been parked, our oversight concerns have been parked too. It would be helpful if the Minister could clarify that.
In the meantime, in the absence of Clause 122, it is worth us all reinforcing again that we want the tech firms to co-operate fully with law enforcement, either because a user has alerted them to illegal activity or when law enforcement suspects criminal behaviour and seeks their help. In that latter context, it would be helpful to understand what the Minister has said and to know what oversight that might involve. I congratulate my noble friend on this marathon Bill, and I am sorry to have delayed its passing.
My Lords, I will make a short contribution so that I do not disappoint the noble Lord, Lord Moylan; I will make a few direct and crunchy comments. First, I thank colleagues who participated in the debate for giving me a hearing, especially when I raised concerns about their proposals. It has been a constructive process, where we have been, as the Minister said, kicking the tyres, which is healthy in a legislature. It is better to do it now than to find faults when something has already become law.
I am in the unusual position of having worked on problems comparable to those we are now placing on Ofcom’s desk. I have enormous empathy for it and the hard work we are giving it. I do not think we should underestimate just how difficult this job is.
I want to thank the Minister for the additional clarification of how Ofcom will give orders to services that provide private communications. Following on from what the noble Baroness, Lady Stowell, said, I think this is a challenging area. We want Ofcom to give orders where this is easy—for example, to an unencrypted service hosting child sexual abuse material. The technology can be deployed today and is uncontroversial, so it is important that we do not forget that.
I heard the Minister say that we do not want Ofcom to move so fast that it breaks encryption. It should be moving, but it should be careful. Those are the fears that have been expressed outside: that, on the day this becomes law, Ofcom will issue orders to services providing encrypted communications that they will not be able to accept, and that they will therefore leave the UK. I think I heard from the Minister today that this is not what we want Ofcom to do. At the same time, as the noble Baroness, Lady Stowell, said, we are not expecting Ofcom to ease off; any online service should be doing everything technically possible and feasible to deal with abhorrent material.
I humbly offer three pieces of advice to Ofcom as we pass the baton to it. This is based on having made a lot of mistakes in the past. If I had been given this advice, I might have done a better job in my previous incarnation. First, you cannot overconsult; Ofcom should engage with all interested parties, including those who have talked to us throughout the process of the Bill. It should engage with them until it is sick of engaging with them and then it should engage some more. In particular, Ofcom should try to bring together diverse groups, so I hope it gets into a room the kind of organisations that would be cheering on the noble Lord, Lord Moylan, as well as those that would be cheering on the noble Baroness, Lady Kidron. If Ofcom can bring them into the room, it has a chance of making some progress with its regulations.
Secondly, be transparent. The more information that Ofcom provides about what it is doing, the less space it will leave for people to make up things about what it is doing. I said this in the previous debate about the access request but it applies across the piece. We are starting to see some of this in the press. We are here saying that it is great that we now have a government regulator—independent but part of the UK state—overseeing online services. As soon as that happens, we will start to see the counterreaction of people being incredibly suspicious that part of the UK state is now overseeing their activity online. The best way to combat that is for Ofcom to be as transparent as possible.
Thirdly, explain the trade-offs you are making. This legislation necessarily involves trade-offs. I heard it again in the Minister’s opening remarks: we have indulged in a certain amount of cakeism. We love freedom of expression but we want the platforms to get rid of all the bad stuff. The rubber is going to hit the road once Ofcom has the powers and, in many cases, it will have to decide between one person’s freedom of expression and another’s harm. My advice is not to pretend that you can make both sides happy; you are going to disappoint someone. Be honest and frank about the trade-offs you have made. The legislation has lots of unresolved trade-offs in it because we are giving lots of conflicting instructions. As politicians, we can ride that out, but when Ofcom gets this and has to make real decisions, my advice would be to explain the trade-offs and be comfortable with the fact that some people will be unhappy. That is the only way it will manage to maintain confidence in the system. With that, I am pleased that the Bill has got to this stage and I have a huge amount of confidence in Ofcom to take this and make a success of it.