Online Safety Bill Debate
Lord Stevenson of Balmacara (Labour, Life peer)
Debate with the Department for Digital, Culture, Media & Sport
Lords Chamber

My Lords, Amendment 56 proposes a pathway towards setting up an independent ombudsman for the social media space. It is in my name, and I am grateful to the noble Lord, Lord Clement-Jones, for his support. For reasons I will go into, my amendment is a rather transparent and blatant attempt to bridge a gap with the Government, who have a sceptical position on this issue, and I hope that the amendment in its present form will prove more attractive to them than our original proposal.
At the same time, the noble Baroness, Lady Newlove, has tabled an amendment on this issue, proposing an independent appeals mechanism
“to provide impartial out of court resolutions for individual users of regulated services”.
Given that this is almost exactly what I want to see in place—as was set out in my original amendment, which was subsequently rubbished by the Government—I have also signed the noble Baroness’s amendment, and I very much look forward to her speech. The Government have a choice.
The noble Baroness, Lady Fox, also has amendments in this group, although they are pointing in a slightly different direction. I will not speak to them at this point in the proceedings, although I make it absolutely clear that, while I look forward to hearing her arguments —she is always very persuasive—I support the Bill’s current proposals on super-complaints.
Returning to the question of why we think the Bill should make provision for an independent complaints system or ombudsman, I suppose that, logically, we ought first to hear the noble Baroness, Lady Newlove, then listen to the Government’s response, which presumably will be negative. My compromise amendment could then be considered and, I hope, win the day with support from all around the Committee—in my dreams.
We have heard the Government’s arguments already. As the Minister said in his introduction to the Second Reading debate all those months ago on 1 February 2023, he was unsympathetic. At that time, he said:
“Ombudsman services in other sectors are expensive, often underused and primarily relate to complaints which result in financial compensation. We find it difficult to envisage how an ombudsman service could function in this area, where user complaints are likely to be complex and, in many cases, do not have the impetus of financial compensation behind them”.—[Official Report, 1/2/23; col. 690.]
Talk about getting your retaliation in first.
My proposal is based on the Joint Committee’s unanimous recommendation:
“The role of the Online Safety Ombudsman should be created to consider complaints about actions by higher risk service providers where either moderation or failure to address risks leads to … demonstrable harm (including to freedom of expression) and recourse to other routes of redress have not resulted in a resolution”.
The report goes on to say that there could
“be an option in the Bill to extend the remit of the Ombudsman to lower risk providers. In addition … the Ombudsman would as part of its role i) identify issues in individual companies and make recommendations to improve their complaint handling and ii) identify systemic industry wide issues and make recommendations on regulatory action needed to remedy them. The Ombudsman should have a duty to gather data and information and report it to Ofcom. It should be an ‘eligible entity’ to make super-complaints”
possible. It is a very complicated proposal. Noble Lords will understand from the way the proposal is framed that it would provide a back-up to the primary purpose of complaints, which must be to the individual company and the service it is providing. But it would be based on a way of learning from experience, which it would build up as time went on.
I am sure that the noble Lord, Lord Clement-Jones, will flesh out the Joint Committee’s thinking on this issue when he comes to speak, but I make the point that other countries preparing legislation on online safety are in fact building in independent complaints systems; we are an outlier on this. Australia, Canada and others have already legislated. Another very good example nearer to hand is in Ireland. We are very lucky to have with us today the noble Baroness, Lady Kidron, a member of the expert panel whose advice to the Irish Government to set up such a system in her excellent report in May 2022 has now been implemented. I hope that she will share her thoughts about these amendments later in the debate.
Returning to the Government’s reservations about including an ombudsman service in the Bill, I make the following points based on my proposals in Amendment 56. There need not be any immediate action. The amendment as currently specified requires Ofcom to review complaints systems set up by the companies under Clause 17 as to their effectiveness and efficiency. It asks Ofcom to take other evidence into account and then, and only then, to take the decision of whether to set up an ombudsman system. If there were no evidence of a need for such a service, it would not happen.
As for the other reservations raised by the Minister when he spoke at Second Reading, he said:
“Ombudsman services in other sectors are expensive”.
We agree, but we assume that this would be on a cost recovery model, as other Ofcom services are funded in that way. The primary focus will always be resolving complaints about actions or inactions of particular companies in the companies’ own redress systems, and Ofcom can always keep that under review.
He said that they are “often underused”. Since we do not know at the start what the overall burden will be, we think that the right solution is to build up slowly and let Ofcom decide. There are other reasons why it makes sense to prepare for such a service, and I will come to these in a minute.
He said that other ombudsman services
“primarily relate to complaints which result in financial compensation”.
That is true, but the evidence from other reports, and that we received in the Joint Committee, was that most complainants want non-financial solutions: they want egregious material taken down or to ensure that certain materials are not seen. They are not after the money. Where a company is failing to deliver on those issues in their own complaints system, to deny genuine complainants an appeal to an independent body seems perverse and not in accordance with natural justice.
He said that
“user complaints are likely to be complex”.—[Official Report, 1/2/23; col. 690.]
Yes, they probably are, but that seems to be an argument for an independent appeals body, not against it.
To conclude, we agree that Ofcom should not be the ombudsman and that the right approach is for Ofcom to set up the system as and when it judges that it would be appropriate. We do not want Ofcom to be swamped with complaints from users of regulated services who, for whatever reason, have not been satisfied by the response of the individual companies, or with complex cases, or who seek system-wide solutions. But Ofcom needs to know what is happening on the ground, across the sector, as well as in each of the regulated companies, and it needs to be kept aware of how the system as a whole is performing. The relationship between the FCA and the Financial Ombudsman Service is a good model here. Indeed, the fact that some of the responsibilities to be given to Ofcom in the Bill will give rise to complaints to the FOS suggests that there would be good sense in aligning these services right from the start.
We understand that the experience from Australia is that the existence of an independent complaints function can strengthen the regulatory functions. There is also evidence that the very existence of an independent complaints mechanism can provide reassurances to users that their online safety is being properly supported. I beg to move.
My Lords, this is the first time that I have spoken in Committee. I know we have 10 days, but it seems that we will go even further because this is so important. I will speak to Amendments 250A and 250B.
I thank the noble Lords, Lord Russell of Liverpool and Lord Stevenson of Balmacara, and, of course— if I may be permitted to say so—the amazing noble Baroness, Lady Kidron, who is an absolute whizz on this, for placing their names on these amendments, as well as the 5Rights Foundation, the Internet Watch Foundation and the UK Safer Internet Centre for their excellent briefings. I have spoken to these charities, and the work they do is truly amazing. I do not think that the Bill will recognise just how much time and energy they give to support families and individuals. Put quite simply, we can agree that services’ internal complaint mechanisms are failing.
Let me tell your Lordships about Harry. Harry is an autistic teenager who was filmed by a member of the public in a local fast-food establishment when he was dysregulated and engaging in aggressive behaviour. This footage was shared out of context across social media, with much of the response online labelling Harry as a disruptive teenager who was engaging in unacceptable aggression and vandalising public property. This was shared thousands of times over the course of a few weeks. When Harry and his mum reported it to the social media platforms, they were informed that it did not violate community guidelines and that there was a public interest in the footage remaining online. The family, quite rightly, felt powerless. Harry became overwhelmed at the negative response to the footage and the comments made about his behaviour. He became withdrawn and stopped engaging. He then tried to take his own life.
I stress again that the period in question is two years, not three.
The answer to the noble Lord’s question is that the super-complaint is not a mechanism for individuals to complain on an individual basis and seek redress.
This is getting worse and worse. I am tempted to suggest that we stop talking about this and try to, in a smaller group, bottom out what we are doing. I really think that the Committee deserves a better response on super-complaints than it has just heard.
As I understood it—I am sure that the noble Baroness, Lady Kidron, is about to make the same point—super-complaints are specifically designed to take away the pressure on vulnerable and younger persons to have responsibility only for themselves in bringing forward the complaint that needs to be resolved. They are a way of sharing that responsibility and taking away the pressure. Is the Minister now saying that that is a misunderstanding?
I have offered a meeting; I am very happy to host the meeting to bottom out these complaints.
As I said, I am very happy to hold the meeting. We are giving users greater protection through the Bill, and, as agreed, we can discuss individual routes to recourse.
I hope that, on the basis of what I have said and the future meeting, noble Lords have some reassurance that the Bill’s complaint mechanisms will, eventually, be effective and proportionate, and feel able not to press their amendments.
I am very sorry that I did not realise that the Minister was responding to this group of amendments; I should have welcomed him to his first appearance in Committee. I hope he will come back—although he may have to spend a bit of time in hospital, having received a pass to speak on this issue from his noble friend.
This is a very complicated Bill. The Minister and I have actually talked about that over tea, and he is now learning the hard lessons of what he took as a light badinage before coming to the Chamber today. However, we are in a bit of a mess here. I was genuinely trying to get an amendment that would encourage the department to move forward on this issue, because it is quite clear from the mood around the Committee that something needs to be resolved here. The way the Government are approaching this is by heading towards a brick wall, and I do not think it is the right way forward.
My Lords, this is unfamiliar territory for me, but the comprehensive introduction of the noble Baroness, Lady Fraser, has clarified the issue. I am only disappointed that we had such a short speech from the noble Lord, Lord Foulkes—uncharacteristic, perhaps I could say—but it was good to hear from the noble and learned Lord, Lord Hope, on this subject as well. The noble Baroness’s phrase “devolution deficit” is very useful shorthand for some of these issues. She has raised a number of questions about the Secretary of State’s powers under Clause 53(5)(c): the process, the method of consultation and whether there is a role for Ofcom’s national advisory committees. Greater transparency in order to understand which offences overlap in all this would be very useful. She deliberately did not go for one solution or another, but issues clearly arise where the thresholds are different. It would be good to hear how the Government are going to resolve this issue.
My Lords, it is a pity that we have not had the benefit of hearing from the Minister, because a lot of his amendments in this group seem to bear on some of the more generic points made in the very good speech by the noble Baroness, Lady Fraser. I assume he will cover them, but I wonder whether he would at least be prepared to answer any questions people might come back with—not in any aggressive sense; we are not trying to scare the pants off him before he starts. For example, the points made by the noble Lord, Lord Clement-Jones, intrigue me.
I used to have responsibility for devolved issues when I worked at No. 10 for a short period. It was a bit of a joke, really. Whenever anything Welsh happened, I was immediately summoned down to Cardiff and hauled over the coals. You knew when you were in trouble when they all stopped speaking English and started speaking Welsh; then, you knew there really was an issue, whereas before I just had to listen, go back and report. In Scotland, nobody came to me anyway, because they knew that the then Prime Minister was a much more interesting person to talk to about these things. They just went to him instead, so I did not really learn very much.
I noticed some issues in the Marshalled List that I had not picked up on when I worked on this before. I do not know whether the Minister wishes to address this—I do not want to delay the Committee too much—but are we saying that to apply a provision in the Bill to the Bailiwick of Guernsey or the Isle of Man, an Order in Council is required to bypass Parliament? Is that a common way of proceeding in these places? I suspect that the noble and learned Lord, Lord Hope, knows much more about this than I do—he shakes his head—but this is a new one on me. Does it mean that this Parliament has no responsibility for how its laws are applied in those territories, or are there other procedures of which we are unaware?
My second point again picks up what the noble Lord, Lord Clement-Jones, was saying. Could the Minister go through in some detail the process by which a devolved authority would apply to the Secretary of State—presumably for DSIT—to seek consent for a devolved offence to be included in the Online Safety Bill regime? If this is correct, who grants to what? Does this come to the House as a statutory instrument? Is just the Secretary of State involved, or does it go to the Privy Council? Are there other ways that we are yet to know about? It would be interesting to know.
To echo the noble Lord, Lord Clement-Jones, we probably do need a letter from the Minister, if he ever gets this cleared, setting out exactly how the variation in powers would operate across the four territories. If there are variations, we would like to know about them.
My Lords, I am very grateful to my noble friend Lady Fraser of Craigmaddie for her vigilance in this area and for the discussion she had with the Bill team, which they and I found useful. Given the tenor of this short but important debate, I think it may be helpful if we have a meeting for other noble Lords who also want to benefit from discussing some of these things in detail, and particularly to talk about some of the issues the noble Lord, Lord Stevenson of Balmacara, just raised. It would be useful for us to talk in detail about general questions on the operation of the law before we look at this again on Report.
In a moment, I will say a bit about the government amendments which stand in my name. I am sure that noble Lords will not be shy in taking the opportunity to interject if questions arise, as they have not been shy on previous groups.
I will start with the amendments tabled by my noble friend Lady Fraser. Her Amendment 58 seeks to add reference to the Human Rights Act 1998 to Clause 18. That Act places obligations on public authorities to act compatibly with the European Convention on Human Rights. It does not place obligations on private individuals and companies, so it would not make sense for such a duty on internet services to refer to the Human Rights Act.
Under that Act, Ofcom has obligations to act in accordance with the right to freedom of expression under Article 10 of the European Convention on Human Rights. As a result, the codes that Ofcom draws up will need to comply with the Article 10 right to freedom of expression. Schedule 4 to the Bill requires Ofcom to ensure that measures which it describes in a code of practice are designed in light of the importance of protecting the right of users’
“freedom of expression within the law”.
Clauses 44(2) and (3) provide that platforms will be treated as complying with their freedom of expression duty if they take the recommended measures that Ofcom sets out in the codes. Platforms will therefore be guided by Ofcom in taking measures to comply with its duties, including safeguards for freedom of expression through codes of practice.
My noble friend’s Amendment 136 seeks to add offences under the Hate Crime and Public Order (Scotland) Act 2021 to Schedule 7. Public order offences are already listed in Schedule 7 to the Bill, which will apply across the whole United Kingdom. This means that all services in scope will need proactively to tackle content that amounts to an offence under the Public Order Act 1986, regardless of where the content originates or where in the UK it can be accessed.
The priority offences list has been developed with the devolved Administrations, and Clause 194 outlines the parliamentary procedures for updating it. The requirements for consent will be set out in the specific subordinate legislation that may apply to the particular offence being made by the devolved authorities—that is to say, they will be laid down by the enabling statutes that Parliament will have approved.
Amendment 228 seeks to require the inclusion of separate analyses of users’ online experiences in England, Wales, Scotland and Northern Ireland in Ofcom’s transparency reports. These transparency reports are based on the information requested from category 1, 2A and 2B service providers through transparency reporting. I assure my noble friend that Ofcom is already able to request country-specific information from providers in its transparency reports. The legislation sets out high-level categories of information that category 1, 2A and 2B services may be required to include in their transparency reports. The regulator will set out in a notice the information to be requested from the provider, the format of that information and the manner in which it should be published. If appropriate, Ofcom may request specific information in relation to each country in the UK, such as the number of users encountering illegal content and the incidence of such content.
Ofcom is also required to undertake consultation before producing guidance about transparency reporting. In order to ensure that the framework is proportionate and future-proofed, however, it is vital to allow the regulator sufficient flexibility to request the types of information that it sees as relevant, and for that information to be presented by providers in a manner that Ofcom has deemed to be appropriate.
Similarly, Amendment 225A would require separate analyses of users’ online experiences in England, Wales, Scotland and Northern Ireland in Ofcom’s research about users’ experiences of regulated services. Clause 141 requires that Ofcom make arrangements to undertake consumer research to ascertain public opinion and the experiences of UK users of regulated services. Ofcom will already be able to undertake this research on a country-specific basis. Indeed, in undertaking its research and reporting duties, as my noble friend alluded to, Ofcom has previously adopted such an approach. For instance, it is required by the Communications Act 2003 to undertake consumer research. While the legislation does not mandate that Ofcom conduct and publish nation-specific research, Ofcom has done so, for instance through its publications Media Nations and Connected Nations. I hope that gives noble Lords some reassurance of its approach in this regard. Ensuring that Ofcom has flexibility in carrying out its research functions will enable us to future-proof the regulatory framework, and will mean that its research activity is efficient, relevant and appropriate.
I will now say a bit about the government amendments standing in my name. I should, in doing so, highlight that I have withdrawn Amendments 304C and 304D, previously in the Marshalled List, which will be replaced with new amendments to ensure that all the communications offences, including the new self-harm offence, have the appropriate territorial extent when they are brought forward. They will be brought forward as soon as possible once the self-harm offence has been tabled.
Amendments 267A, 267B, 267C, 268A, 268B to 268G, 271A to 271D, 304A, 304B and 304E are amendments to Clauses 160, 162, 164 to 166, 168 and 210 and Schedule 14, relating to the extension of the false and threatening communications offences and the associated liability of corporate officers in Clause 166 to Northern Ireland.
This group also includes some technical and consequential amendments to the false and threatening communications offences and technical changes to the Malicious Communications (Northern Ireland) Order 1988 and Section 127 of the Communications Act 2003. This will minimise overlap between these existing laws and the new false and threatening communications offences in this Bill. Importantly, they mirror the approach taken for England and Wales, providing consistency in the criminal law.
This group also contains technical amendments to update the extent of the epilepsy trolling offence to reflect that it applies to England, Wales and Northern Ireland.
I suggested that we might see a table, independent of the meetings, although I am sure they could coincide. Would it be possible to have a table of all the criminal offences that the Minister listed and how they apply in each of the territories? Without that, we are a bit at sea as to exactly how they apply.
This has been a very good debate indeed. I have good days and bad days in Committee. Good days are when I feel that the Bill is going to make a difference and things are going to improve and the sun will shine. Bad days are a bit like today, where we have had a couple of groups, and this is one of them, where I am a bit worried about where we are and whether we have enough—I was going to use that terrible word “ammunition” but I do not mean that—of the powers that are necessary in the right place and with the right focus to get us through some of the very difficult questions that come in. I know that bad cases make bad law, but they can also illustrate why the law is not good enough. As the noble Baroness, Lady Kidron, was saying, this is possibly one of the areas we are in.
The speeches in the debate have made the case well and I do not need to go back over it. We have got ourselves into a situation where we want to reduce harm that we see around but do not want to impact freedom of expression. Both of those are so important and we have to hold on to them, but we find ourselves struggling. What do we do about that? We think through what we will end up with this Bill on the statute book and the codes of practice through it. This looks as though it is heading towards the question of whether the terms of service that will be in place will be sufficient and able to restrict the harms we will see affecting people who should not be affected by them. But I recognise that the freedom of expression arguments have won the day and we have to live with that.
The noble Baroness, Lady Kidron, mentioned the riskiness of the smaller sites—categories 2A and 2B and the ones that are not even going to be categorised as high as that. Why are we leaving those to cause the damage that they are? There is something not working here in the structure of the Bill and I hope the Minister will be able to provide some information on that when he comes to speak.
Obviously, if we could find a way of expressing the issues that are raised by the measures in these amendments as being illegal in the real world, they would be illegal online as well. That would at least be a solution that we could rely on. Whether it could be policed and serviced is another matter, but it certainly would be there. But we are probably not going to get there, are we? I am not looking at the Minister in any hope but he has a slight downward turn to his lips. I am not sure about this.
How can we approach a legal but harmful issue with the sort of sensitivity that does not make us feel that we have reduced people’s ability to cope with these issues and to engage with them in an adult way? I do not have an answer to that.
Is this another amplification issue or is it deeper and worse than that? Is this just the internet because of its ability to focus on things to keep people engaged, to make people stay online when they should not, to make them reach out and receive material that they ought not to get in a properly regulated world? Is it something that we can deal with because we have a sense of what is moral and appropriate and want to act because society wants us to do it? I do not have a solution to that, and I am interested to hear what the Minister will say, but I think it is something we will need to come back to.
My Lords, like everyone who spoke, I and the Government recognise the tragic consequences of suicide and self-harm, and how so many lives and families have been devastated by it. I am grateful to the noble Baroness and all noble Lords, as well as the bereaved families who campaigned so bravely and for so long to spare others that heartache and to create a safer online environment for everyone. I am grateful to the noble Baroness, Lady Finlay of Llandaff, who raised these issues in her Private Member’s Bill, on which we had exchanges. My noble friend Lady Morgan is right to raise the case of Frankie Thomas and her parents, and to call that to mind as we debate these issues.
Amendments 96 and 296, tabled by the noble Baroness, Lady Finlay, would, in effect, reintroduce the former adult safety duties whereby category 1 companies were required to assess the risk of harm associated with legal content accessed by adults, and to set and enforce terms of service in relation to it. As noble Lords will know, those duties were removed in another place after extensive consideration. Those provisions risked creating incentives for the excessive removal of legal content, which would unduly interfere with adults’ free expression.
However, the new transparency, accountability and freedom of expression duties in Part 4, combined with the illegal and child safety duties in Part 3, will provide a robust approach that will hold companies to account for the way they deal with this content. Under the Part 4 duties, category 1 services will need to have appropriate systems and processes in place to deal with content or activity that is banned or restricted by their terms of service.
Many platforms—such as Twitter, Facebook and TikTok, which the noble Baroness raised—say in their terms of service that they restrict suicide and self-harm content, but they do not always enforce these policies effectively. The Bill will require category 1 companies—the largest platforms—fully to enforce their terms of service for this content, which will be a significant improvement for users’ safety. Where companies allow this content, the user-empowerment duties will give adults tools to limit their exposure to it, if they wish to do so.
The noble Baroness is right to raise the issue of algorithms. As the noble Lord, Lord Stevenson, said, amplification lies at the heart of many cases. The Bill will require providers specifically to consider as part of their risk assessments how algorithms could affect children’s and adults’ exposure to illegal content, and content that is harmful to children, on their services. Providers will need to take steps to mitigate and effectively manage any risks, and to consider the design of functionalities, algorithms and other features to meet the illegal content and child safety duties in the Bill.
Yes, they are—with the addition of what I am coming to. In addition to the duty for companies to consider the role of algorithms, which I talked about, Ofcom will have a range of powers at its disposal to help it assess whether providers are fulfilling their duties, including the power to require information from providers about the operation of their algorithms. The regulator will be able to hold senior executives criminally liable if they fail to ensure that their company is providing Ofcom with the information it requests.
However, we must not restrict users’ right to see legal content and speech. These amendments would prescribe specific approaches for companies’ treatment of legal content accessed by adults, which would give the Government undue influence in choosing, on adult users’ behalf, what content they see—
I wanted to give the Minister time to get on to this. Can we now drill down a little on the terms of service issue? If the noble Baroness, Lady Kidron, is right, are we talking about terms of service having the sort of power the Government suggest in cases where they are category 1 and category 2A but not search? There will be a limit, but an awful lot of other bodies about which we are concerned will not fall into that situation.
Also, I thought we had established, much to our regret, that the terms of service were what they were, and that Ofcom's powers—I paraphrase to make the point—were those of exposure and transparency, not setting minimum standards. But even if we are talking only about the very large and far-reaching companies, should there not be a power somewhere to engage with that, with a view to getting that redress, if the terms of service do not specify it?
The Bill will ensure that companies adhere to their terms of service. If they choose to allow content that is legal but harmful on their services and they tell people that beforehand—and adults are able and empowered to decide what they see online, with the protections of the triple shield—we think that that strikes the right balance. This is at the heart of the whole “legal but harmful” debate in another place, and it is clearly reflected throughout the approach in the Bill and in my responses to all of these groups of amendments. But there are duties to tackle illegal content and to make sure that people know the terms of service for the sites they choose to interact with. If they feel that they are not being adhered to—as they currently are not in relation to suicide and self-harm content on many of the services—users will have the recourse of the regulator to turn to.