Online Safety Legislation: Abuse on Social Media Debate
Baroness Jones of Whitchurch (Labour - Life peer)
Lords Chamber
To ask His Majesty’s Government what assessment they have made of the ability of current online safety legislation to regulate abuse, including racism, Islamophobia, homophobia, and sectarianism, on social media platforms.
My Lords, as my noble friend will know, we take these issues very seriously. The Online Safety Act will tackle illegal abuse, protect children and empower users. Regulated providers, including social media companies, must implement systems to reduce the risk that their services are used for illegal activity, including illegal abuse. Under the Act, stirring up hatred is a priority offence, requiring providers to proactively combat illegal racism, Islamophobia, homophobia and sectarianism.
My Lords, I thank my noble friend the Minister for her detailed Answer. What consideration have the Government given to the flourishing of hate content on smaller online platforms, which they have the power to regulate under the highest tier of regulation: category 1 under Schedule 11 to the Online Safety Act? Are the Government minded to reject Ofcom’s advice not to use the powers available to them under the Act to do so?
My Lords, we share my noble friend’s concern about the flourishing of hate crime on these sites and particularly on smaller online platforms. The Secretary of State for DSIT is carefully considering Ofcom’s categorisation recommendations and will make regulations as soon as reasonably practicable. He can decide to proceed with Ofcom’s advice or to depart from it; if he takes the latter approach, a statement must be published explaining why.
My Lords, it was reported today that the United States, the EU and the UK are all expected to sign the Council of Europe’s convention on AI, which emphasises human rights and democratic values in its approach to the regulation of public and private sector systems. The convention, which is legally enforceable, requires signatories to be accountable for any harmful or discriminatory outcomes of AI systems and for victims of AI-related rights violations to have legal recourse. In addition to the offence of sharing, is now not the time to consider criminalising the creation of sexualised deepfake images without consent? The noble Baroness, Lady Owen, called for this on 13 February in your Lordships’ House, and described deepfake abuse, which is almost wholly misogynistic and now epidemic. It is the new frontier of violence against women.
My Lords, my noble friend will know that, in addition to the implementation of the Online Safety Act, we already have plans to bring forward a new data Bill where some of these issues can be debated. We also have ambitions to bring forward a further piece of AI legislation, which will give us the opportunity to discuss those issues in more detail. He is absolutely right: these are serious issues. They were debated at length during the passage of the previous data protection Bill, and we hope to return to them again.
My Lords, is it not the case that Ofcom is letting down the public? What we need is to review the role of Ofcom and other regulators and, if they are failing to do their duties for the public, they should be removed from office.
My Lords, Ofcom has a very wide-ranging and serious set of responsibilities. There is no suggestion that it is not carrying out its responsibilities in the run-up to the implementation of the Online Safety Act. We are working very closely with Ofcom and believe that it will carry out those additional functions that we have given it with proper scrutiny and to high standards. Yes, there is a case for looking at all regulators; we have a debate on this on Monday in the House, and I am looking forward to that, but that is a wider issue. For the moment, we have to give Ofcom all the support that we can in implementing a very difficult set of regulations.
My Lords, the crafting of the Online Safety Act was fraught with exceptions, exclusions and loopholes, the most egregious of which is that regulated companies get safe harbour if they comply with Ofcom’s codes, but Ofcom has provided us with codes that have huge gaps in known harms. What plans do the Government have to fulfil their election promise to strengthen the OSA by ensuring that it protects all children effectively, even the very young, and that it has adequate mechanisms to act swiftly in a crisis, or with an evolving new risk, to stop abuse being whipped up algorithmically and directed at minority groups?
My Lords, I think that we are in danger of downplaying the significance of the Online Safety Act. It is a trail-blazing Act; the noble Baroness was very much involved in it. Our priority has to be to get that Act implemented. Under it, all user-to-user services and search providers, regardless of size, will have to take swift and effective action against illegal content, including criminal online abuse and posts of a sexual nature. We should get behind Ofcom and the Online Safety Act, and we will then obviously have to keep that Act under review, but we have the tools to do that.
Does the Minister agree that digital literacy is crucial, so that people are better able to identify often damaging misinformation and fake news? What is the Government’s strategy in that respect?
The noble Baroness makes an important point. Part of Ofcom’s responsibility is to promote media literacy. We are talking to the Department for Education, and obviously there is a role for schools to be involved in all this—but parents also have to take responsibility for their children, and for their access to these sites. The media literacy role that we have to play goes right throughout society; it is the responsibility of all of us to make sure that people understand, when they access these sites, what they are able to see and how all that can be moderated. Again, the social media companies have a particular role to play in all of that. We expect them to uphold their terms of service to make sure that children cannot access sites that are inappropriate, and we will work with them to make sure that this happens.
I hope that the Government will look with sympathy at the Private Member’s Bill being brought forward by my noble friend Lady Owen of Alderley Edge, which the noble Lord, Lord Browne of Ladyton, mentioned. It deals with very important issues.
The Minister will be aware of the arrest of Pavel Durov in France—the founder and chief executive of the messaging application Telegram. I do not expect her to be able to comment on an ongoing investigation, but can she tell your Lordships’ House whether His Majesty’s Government have had any contact with the Government of France in relation to this matter and whether British law enforcement agencies have been involved in the investigation? I appreciate that she may need to write after checking with them.
I pay tribute to the noble Lord for all the work that he did in getting the Online Safety Act on to the statute book. With regard to Telegram, obviously we cannot comment on issues in another country’s jurisdiction. We have regular contact with all friendly nations dealing with those issues. I cannot comment on whether there has been specific dialogue on the issue of Telegram, but we would normally expect that to be something for the French Government to deal with.
My Lords, I recognise absolutely the urgency and importance of legislation in this area, but does the Minister agree that equally important is the work of tackling the prejudice that lies behind online abuse, and the important role therefore of intermediate institutions such as community groups and faith groups in tackling prejudice? What are the Government doing to support those groups in that work?
The right reverend Prelate makes a very important point. I think that we were all pleased with the community reaction to the riots. It was very heartening to see that people were not prepared to have those abhorrent views coming to the fore in their communities. We need to do more to encourage that community response, and we need to work with all of civil society, including the Church, to make sure that happens. We also need to make sure that the police, as they play a community role, make clear what is illegal and take action when actions in a locality are illegal. This is a much broader issue about civil society, and I agree with him.