Online Safety Bill Debate
Matt Rodda (Labour - Reading Central), debates with the Department for Digital, Culture, Media & Sport
(2 years, 8 months ago)
Commons Chamber

I will come on to some of those issues. My hon. Friend makes a valid point.
I fear the Government’s current solution to the balance between free speech and regulation will please no one and takes us down an unhelpful rabbit hole. Some believe the Bill will stifle free speech, with platforms over-zealously taking down legitimate political and other views. In response, the Government have put in what they consider to be protections for freedom of speech and have committed to setting out an exhaustive list of “legal but harmful” content, thus relying almost entirely on a “take down content” approach, which many will still see as Government overreach.
On the other hand, those who want harmful outcomes addressed through stronger regulation are left arguing over a yet-to-be-published list of Government-determined harmful content. This content-driven approach moves us in the wrong direction, away from the "duty of care" principles the Bill is supposed to enshrine. The real solution is a systems approach based on outcomes, which would not only solve the free speech question, but make the Bill much stronger overall.
What does that mean in practice? Essentially, rather than going after individual content, go after the business models, systems and policies that drive the impact of such harms—[Interruption.] The Minister for Security and Borders, the right hon. Member for East Hampshire (Damian Hinds), says from a sedentary position that that is what the Bill does, but none of the leading experts in the field think the same. He should talk to some of them before shouting at me.
The business models of most social media companies are currently based on engagement, as my hon. Friend the Member for Liverpool, Walton (Dan Carden) outlined. The more engagement, the more money they make, which rewards controversy, sensationalism and fake news. A post containing a racist slur or anti-vax comment that nobody notices, shares or reads is significantly less harmful than a post that is quickly able to go viral. A collective pile-on can have a profoundly harmful effect on the young person on the receiving end, even though most of the individual posts would not meet the threshold of harm.
I will not, sorry. Facebook whistleblower Frances Haugen, who I had the privilege of meeting, cited many examples to the Joint Committee on the draft Online Safety Bill of Facebook’s models and algorithms making things much worse. Had the Government chosen to follow the Joint Committee recommendations for a systems-based approach rather than a content-driven one, the Bill would be stronger and concerns about free speech would be reduced.
Online Safety Bill Debate
(2 years ago)
Commons Chamber

I am grateful to have the opportunity to speak in this debate. I commend the right hon. Member for Basingstoke (Dame Maria Miller) on her work in this important area. I would like to focus my remarks on legal but harmful content and its relationship to knife crime, and to mention a very harrowing and difficult constituency case of mine. As we have heard, legal but harmful content can have a truly dreadful effect. I pay tribute to the families of the children who have been lost, who have attended the debate, a number of whom are still in the Public Gallery.
Just to be clear, the hon. Gentleman’s speech must relate to the amendments before us today.
Thank you, Madam Deputy Speaker. A boy called Olly Stephens in my constituency was just 13 years old when he was stabbed and brutally murdered in an attack linked to online bullying. He died, sadly, very near his home. His parents had little idea of the social media activity in his life. It is impossible to imagine what they have been through. Our hearts go out to them.
Legal but harmful content played a terrible part in the attack on Olly. The two boys who attacked and stabbed him had been sharing enormous numbers of pictures and videos of knives, repeatedly, over a long period of time. They were often videos of teenagers playing with knives, waving them or holding them, and they were circulated on 11 different social media platforms. None of those platforms took any action to take the content down. We all need to learn more about such cases to fully understand the impact of legal but harmful content. Even at this late stage, I hope that the Government will think again about the changes they have made to the Bill and restore this area to it.
There is a second aspect of this very difficult case that I want to mention: the fact that Olly’s murder was discussed on social media and was planned to some extent beforehand. The wider issues here underline the need for far greater regulation and moderation of social media, in particular teenagers’ use of these powerful sites. I am finding it difficult to talk about some of these matters, but I hope that the Government will take my points on board and address the issue of legal but harmful content, and that the Minister will think again about these important matters. Perhaps we will have an opportunity to discuss it in the Bill’s later stages.
I am pleased to follow my fairly close neighbour from Berkshire, the hon. Member for Reading East (Matt Rodda). He raised the issue of legal but harmful content, which I will come to, as I address some of the amendments before us.
I very much welcome the new shape and focus of the Bill. Our primary duty in this place has to be to protect children, above almost all else. The refocusing of the Bill certainly does that, and it is now in a position where hon. Members from all political parties recognise that it is so close to fulfilling its function that we want it to get through this place as quickly as possible with today’s amendments and those that are forthcoming in the Lords and elsewhere in future weeks.
The emerging piece of legislation is better and more streamlined. I will come on to further points about legal but harmful content, but I am pleased to see it removed from the Bill for adults, and I will explain why, given the sensitive case that the hon. Member for Reading East mentioned. The information that he talked about being published online should be illegal, so it would be covered by the Bill: illegal information should not be published and, within the framework of the Bill, would be taken down quickly. We in this place should not shirk our responsibilities; we should make illegal the things that we and our constituents believe to be deeply harmful. If we are not prepared to do that, we cannot say that some third party has a responsibility to do it on our behalf, and leave them to begin making the rules, whether they are a commercial company or a regulator without those specific powers.
I welcome the shape of the Bill, but some great new clauses have been tabled. New clause 16 suggests that we should make it an offence to encourage self-harm, which is fantastic. My right hon. Friend the Member for Haltemprice and Howden (Mr Davis) has indicated that he will not press it to a vote, because the Government and all of us acknowledge that that needs to be dealt with at some point, so hopefully an amendment will be forthcoming in the near future.
On new clause 23, it is clear that if a commercial company is perpetrating an illegal act or is causing harm, it should pay for it, and a proportion of that payment must certainly go towards compensating the victims of that crime or breach of the regulations. New clauses 45 to 50 have been articulately discussed by my right hon. Friend the Member for Basingstoke (Dame Maria Miller). The technology around revenge pornography and deepfakes is moving forward every day. With some of the fakes online today, it is not possible to tell that they are fakes, even if they are looked at under a microscope. Those areas need to be dealt with, but it is welcome that she will not necessarily press the new clauses to a vote, because those matters must be picked up and defined in primary legislation as criminal acts. There will then be no lack of clarity, and the legal but harmful concept will not need to exist. Something will either be illegal, because it is harmful, or not.
The Bill is great because it provides a framework that enables everything else that hon. Members in the House and people across the country may want to be enacted at a future date. It also enables the power to make those judgments to remain with this House—the democratically elected representatives of the people—rather than some grey bureaucratic body or commercial company whose primary interest is rightly to make vast sums of money for its shareholders. It is not for them to decide; it is for us to decide what is legal and what should be allowed to be viewed in public.
On amendment 152, which interacts with new clause 11, I was in the IT industry for about 15 to 20 years before coming to this place, albeit with a previous generation of technology. When it comes to end-to-end encryption, I am reminded of King Canute, who said, “I’m going to pass a law so that the tide doesn’t come in.” Frankly, we cannot pass a law that bans mathematics, which is effectively what we would be trying to do if we tried to ban encryption. The nefarious types or evildoers who want to hide their criminal activity will simply use mathematics to do that, whether in mainstream social media companies or through a nefarious route. We have to be careful about getting rid of all the benefits of secure end-to-end encryption for democracy, safety and protection from domestic abuse—all the good things that we want in society—on the basis of a tiny minority of very bad people who need to be caught. We should not be seeking to ban encryption; we should be seeking to catch those criminals, and there are ways of doing so.
I welcome the Bill; I am pleased with the new approach and I think it can pass through this House swiftly if we stick together and make the amendments that we need. I have had conversations with the Minister about what I am asking for today: I am looking for an assurance that the Government will enable further debate and table the amendments that they have suggested. I also hope that they will be humble, as my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) said, and open to some minor adjustments, even to the current thinking, to make the Bill pass smoothly through the Commons and the Lords.
I would like the Government to confirm that it is part of their vision that it will be this place, not a Minister of State, that decides every year—or perhaps every few months, because technology moves quickly—what new offences need to be identified in law. That will mean that Ofcom and the criminal justice system can get on to that quickly to ensure that the online world is a safer place for our children and a more pleasant place for all of us.
Online Safety Bill Debate
(1 year, 11 months ago)
Commons Chamber

I agree with my right hon. Friend; that is exactly right. It is also right that we look at adding further offences to schedule 7, on the face of the Bill, so that they are covered by the legislation.
Where this touches on advertising, the Government have already accepted, following the recommendation of the Joint Committee, that the promotion of fraud should be regulated in the Bill, even if it appears in advertising. There are other aspects of this, too, including modern slavery and immigration, where we need to move at pace to close the loophole left by the plan to deal with advertising outside the Bill, through the online advertising review. The principle has already been accepted that illegal activity promoted through an advert on an online platform should be regulated just as if it were an organic posting. That general provision does not yet exist, however. Given that the Government have accepted these additional amendments, which was the right thing to do, they also need to accept the general presumption that any illegal activity that breaches the safety duties should be included and regulated, and that placing it in an advert should not exempt it when it would be regulated as an organic posting.
I would like to focus on new clause 1, dealing with redress, new clause 43, dealing with the toggle default, and new clause 4 on minimum standards. This Bill is a very important piece of legislation, but I am afraid that it has been seriously watered down by the Government. In particular, it has been seriously weakened by the removal of measures to tackle legal but harmful content. I acknowledge that some progress has been made recently, now that the Government have accepted the need for criminal sanctions for senior managers of tech companies. However, there are still many gaps in the Bill and I want to deal with some of them in the time available to me tonight.
First, I pay tribute to the families who have lost children due to issues related to social media. Some of those families are in the Public Gallery tonight. In particular, I want to mention the Stephens family from my Reading East constituency. Thirteen-year-old Olly Stephens was murdered in an horrific attack following a plot hatched on social media. The two boys who attacked Olly had both shared dozens of images of knives online, and they used 11 different social media platforms to do so. Sadly, none of the platforms took down the content, which is why these matters are so important to all of us and our communities.
Following this awful case, I support a number of new clauses that I believe would lead to a significant change in the law to prevent a similar tragedy. I stress the importance of new clause 1, which would help parents to make complaints. As Olly’s dad, Stuart, often says, “You simply cannot contact the tech companies. You send an email and get no reply.” It is important to tackle this matter, and I believe that new clause 1 would go some way towards doing that.
As others have said, surely it makes sense for parents to know their children have some protection from harmful content. New clause 43 would provide reassurance by introducing a default position of protecting children. I urge Members on both sides of the House to support this new clause. Both children and vulnerable adults should be better protected from legal but harmful content, and further action should be taken. New clause 43 would take clear steps in that direction.
I am aware of time, and I support many other important new clauses. I reiterate my support and backing for my Front-Bench colleague, my hon. Friend the Member for Pontypridd (Alex Davies-Jones). Thank you, Madam Deputy Speaker, for the opportunity to contribute to this debate.
It is a pleasure to follow the hon. Member for Reading East (Matt Rodda). I congratulate him on his moving tribute to his constituent’s son. It is a terrible story.
This Bill will be life changing for many, but I am sorry to say that it has taken far too long to get to this point. The Government promised in 2015 to end children’s exposure to harmful online material, and in 2017 they committed to making the UK the safest place for children to be online. This morning, as I waited in the freezing cold on the station platform for a train that was late, a fellow passenger spoke to me about the Bill. He told me how happy he is that action is, at last, under way to protect children from the dangers of the internet. As a father of three young children, he told me that the internet is one of his greatest concerns.
I am afraid that, at the moment, the internet is as lawless as the wild west, and children are viewing images of abuse, addiction and self-harm on a daily basis. As others have said, the statistics are shocking. Around 3,500 online child sex offences are recorded by police each month, and each month more than a million UK children access online pornography. It has been said that, in the time it takes to make a cup of tea, a person who joins certain popular social media platforms will have been introduced to suicide content such as, "Go on, just kill yourself. You know you want to."
I am incredibly proud that our Government have introduced a Bill that will change lives for the better, and I hope and expect it will be a “best in class” for other Governments to do likewise. I pay tribute to my right hon. Friend the Secretary of State for Digital, Culture, Media and Sport and her predecessors for their ruthless focus on making the online world a safer place. Ultimately, improving lives is what every MP is here to do, and on both sides of the House we should take great delight that, at last, this Bill will have its remaining Commons stages today.
I pay tribute to my hon. Friends the Members for Penistone and Stocksbridge (Miriam Cates) and for Stone (Sir William Cash) for their determination to give the Bill even more teeth, and I sincerely thank the Secretary of State for her willingness not only to listen but to take action.
New clause 2, tabled by my hon. Friends, will not be pressed because the Secretary of State has agreed to table a Government amendment when the Bill goes to the other place. New clause 2 sought to create a backstop so that, if a senior manager in a tech firm knowingly allows harm to be caused to a child that results in, for example, their abuse or suicide, the manager should be held accountable and a criminal prosecution, with up to two years in prison, should follow. I fully appreciate that many in the tech world say, first, that that will discourage people from taking on new senior roles and, secondly, that it will discourage inward investment in the UK tech sector. Those serious concerns deserve to be properly addressed.
First, with regard to the potential for senior tech staff to be unwilling to take on new roles where there is this accountability, my experience as City Minister in 2015 provides a good example of why that concern is unnecessary. We were seeking to address the aftermath of the 2008 financial crisis, and we established the possibility of criminal liability for senior financial services staff. It was argued at the time that that would be highly damaging to UK financial services and that people would be unwilling to take on directorships and risk roles. I think we can all see clearly that those concerns were unfounded. Some might even say, "Well, tech firms would say that, wouldn't they?" The likelihood of a criminal prosecution will always be low, but the key difference is that in future, tech managers, instead of waking up each day thinking only about business targets, will wake up thinking, "Have I done enough to protect children as I meet my business targets?" I am sure we can agree that that would be a very good thing.
Secondly, there are those who argue that inward investment to the UK’s tech sector would be killed off by this move, and that would indeed be a concern. The UK tech sector leads in Europe, and at the end of 2022 it retained its position as the main challenger to the US and China. Fast-growing UK tech companies have continued to raise near-record levels of investment—more than France and Germany combined. The sector employs 3 million people across the UK and continues to thrive. So it is absolutely right that Ministers take seriously the concerns of these major employers.
However, I think we can look to Ireland as a good example of a successful tech hub where investment has not stopped as a result of strong accountability laws. The Irish Online Safety and Media Regulation Act 2022 carries a similar criminal responsibility to the one proposed in new clause 2, yet Ireland remains a successful tech hub in the European Union.