Commons Chamber
I completely support my hon. Friend’s comments and I was pleased to see her champion that cause in the Bill Committee. Of course I support amendment 43, tabled in the names of SNP colleagues, to ensure that the toggle is on by default. Abhorrent material is being shared and amplified—that is the key point, amplified—online by algorithms and by the processes and systems in place. It is obvious that the Government just do not get that. That said, there is a majority in Parliament and in the country for strengthening the Online Safety Bill, and Labour has been on the front foot in arguing for a stronger Bill since First Reading last year.
It is also important to recognise the sheer number of amendments and changes we have seen to the Bill so far. Even today, there are many more amendments tabled by the Government. If that does not give an indication of the mess they have made of getting this legislation over the line in a fit and proper state, I do not know what does.
I have said it before, and I am certain I will say it again, but we need to move forward with this Bill, not backward. That is why, despite significant Government delay, we will support the Bill’s Third Reading, as each day of inaction allows more harm to spread online. With that in mind, I too will make some progress.
I will first address new clause 1, tabled in my name and that of my hon. Friend the Member for Manchester Central (Lucy Powell). This important addition to the Bill will go some way to address the gaps around support for individual complaints. We in the Opposition have repeatedly queried Ministers and the Secretary of State on the mechanisms available to individuals who wish to appeal or make complaints. That is why new clause 1 is so important. It is vital that platforms’ complaints procedures are fit for purpose, and this new clause will finally see the Secretary of State publishing a report on the options available to individuals.
We already know that the Bill in its current form fails to consider an appropriate avenue for individual complaints. This is a classic case of David and Goliath, and it is about time those platforms went further in giving their users a transparent, effective complaints process. That substantial lack of transparency underpins so many of the issues Labour has with the way the Government have handled—or should I say mishandled—the Bill so far, and it makes the process by which the Government proceeded to remove the all-important clauses on legal but harmful content, in a quiet room on Committee Corridor just before Christmas, even more frustrating.
That move put the entire Bill at risk. Important sections that would have put protections in place to prevent content such as health and foreign-state disinformation, the promotion of self-harm, and online abuse and harassment from being actively pushed and promoted were rapidly removed by the Government. That is not good enough, and it is why Labour has tabled a series of amendments, including new clauses 4, 5, 6 and 7, that we think would go some way towards correcting the Government’s extremely damaging approach.
Under the terms of the Bill as currently drafted, platforms could set whatever terms and conditions they want and change them at will. We saw that in Elon Musk’s takeover at Twitter, when he lifted the ban on covid disinformation overnight because of his own personal views. Our intention in tabling new clause 4 is to ensure that platforms are not able to simply avoid safety duties by changing their terms and conditions whenever they see fit. This group of amendments would give Ofcom the power to set minimum standards for platforms’ terms and conditions, and to direct platforms to change them if they do not meet those standards.
My hon. Friend is making an important point. She might not be aware of it, but I recently raised in the House the case of my constituents, whose 11-year-old daughter was groomed on the music streaming platform Spotify and was able to upload explicit photographs of herself on that platform. Thankfully, her parents found out and made several complaints to Spotify, which did not immediately remove that content. Is that not why we need the ombudsman?
I am aware of that case, which is truly appalling and shocking. That is exactly why we need such protections in the Bill: to stop those cases proliferating online, to stop the platforms from choosing their own terms of service, and to give Ofcom real teeth, as a regulator, to take on those challenges.