All 2 Marcus Fysh contributions to the Online Safety Act 2023


Tue 17th Jan 2023 — Online Safety Bill, Commons Chamber
Tue 12th Sep 2023 — Online Safety Bill, Commons Chamber: Consideration of Lords amendments

Online Safety Bill

Marcus Fysh Excerpts
Vicky Ford (Chelmsford) (Con)

Legislating in an online world is incredibly complex and full of pitfalls, because the digital world moves so fast that it is difficult to make effective and future-proof legislation. I do not want to wind up my hon. Friend the Member for Stone (Sir William Cash) by mentioning Europe, but I am proud to have worked alongside other British MEPs to introduce the GDPR, which the tech companies hated—especially the penalties.

The GDPR is not perfect legislation, but it fundamentally transformed how online actors think about the need to protect personal data, confidentiality and privacy. The Bill can do exactly the same and totally transform how online safety is treated, especially for children. I have been a proud champion of the Internet Watch Foundation for more than a decade and I have worked with it to tackle the hideous sexual abuse of children online. As a children’s Minister during the Bill’s passage, I am aware of the serious harms that the online world can and does pose, and I am proud that Ministers have put protecting children at the front of the Bill.

Along with other hon. Members, I have signed new clause 2. If, God forbid, hospital staff were constantly and repeatedly causing harm to children and the hospital boss was aware of it but turned a blind eye and condoned it, we would all expect that hospital boss to end up in the courts and, if necessary, in prison. Tech bosses should face the same accountability. I thank the Government for saying that they will adopt the Irish-style legislation here, and I look forward to their doing so.

My amendments—amendment 83 and new clause 8, which was not in scope—relate to eating disorders. Amendment 83 is intended to make it very clear that eating disorders should be treated as seriously as other forms of self-harm. I would like to thank everybody in the Chamber who spoke to me so kindly after I spoke in the last debate about my own experience as a former anorexic and all those outside the Chamber who have since contacted me.

Anorexia is the biggest killer of all mental illnesses. It is a sickness with a slow and long-burning fuse, but all too often that fuse is deadly. There has been a terrifying rise in the number of cases, and it is very clear that social media posts that glamorise eating disorders are helping to fuel this epidemic. I am talking not about content that advertises a diet, but about egregious content that encourages viewers to starve themselves, in some cases—too many cases—to death. Content promoting eating disorders is no less dangerous than content promoting other forms of self-harm; in fact, given the huge number of people suffering from eating disorders—about 1.25 million in this country—it may be considered the most dangerous. It is dangerous not only for children, but for vulnerable adults.

My amendment, as I have said, endeavours to make it clear that content promoting eating disorders should be treated in the same way, and as seriously, as content promoting other forms of self-harm. I thank all those who signed it, including former Health Ministers and Digital Ministers; the current Chair of the Health and Social Care Committee, my hon. Friend the Member for Winchester (Steve Brine); and the current and former Chairs of the Women and Equalities Committee, my right hon. Friends the Members for Romsey and Southampton North (Caroline Nokes) and for Basingstoke (Dame Maria Miller). I hope the fact that MPs of such experience have signed this amendment sends a clear message to those in the other place that we treat this issue very seriously.

My amendment 83 is not the clearest legal way in which to manage the issue, so I do not intend to press it today. I thank the Secretary of State, the Minister responsible for the Bill and the Minister of State, Ministry of Justice, my right hon. Friend the Member for Charnwood (Edward Argar), who I know want to move on this, for meeting me earlier today and agreeing that we will find a way to help protect vulnerable adults as well as children from being constantly subjected to this type of killing content. I look forward to continuing to work with Ministers and Members of the other place to find the best legally watertight way forward.

Mr Marcus Fysh (Yeovil) (Con)

It is a pleasure to follow my right hon. Friend the Member for Chelmsford (Vicky Ford), who made a very powerful speech, and I completely agree with her about the importance of treating eating disorders as seriously as the other harms addressed in the Bill.

I was the media analyst for Merrill Lynch about 22 years ago, and I made a speech about the future of media in which I mentioned the landscape changing towards one of self-generated media. However, I never thought we would get to where we are now, or what the effect would be. I was in the Pizza Express on Gloucester Road the other day at birthday party time, and an 11-year-old boy standing in the queue was doomscrolling TikTok videos rather than talking to his friends, which I just thought was a really tragic indication of where we have got to.

Digital platforms are also critical sources of information and forums for our public discourse. Across the country, people gather up to 80% of their information from such sources, but we should not have trust in them. Their algorithms, which promote and depromote, and their interfaces, which engage, are designed, as we have heard, to make people addicted to the peer validation and augmentation of particular points of view. They are driving people down tribal rabbit holes to the point where they cannot talk to each other or even listen to another point of view. It is no wonder that 50% of young people are unhappy or anxious when they use social media, and these algorithmic models are the problem. Trust in these platforms is wrong: their promotion or depromotion of messages and ideas is opaque, often subjective and subject to inappropriate influence.

It is right that we tackle illegal activity and that harms to children and the vulnerable are addressed, and I support the attempt to do that in the Bill. Those responsible for the big platforms must be held to account for how they operate them, but trusting in those platforms is wrong, and I worry that compliance with their terms of service might become a tick-box absolution of their responsibility for unhappiness, anxiety and harm.

What about harm to our public sphere, our discourse, and our processes of debate, policymaking and science? To trust the platforms in all that would be wrong. We know they have enabled censorship. Elon Musk’s release of the Twitter files has shown incontrovertibly that the big digital platforms actively censor people and ideas, and not always according to reasonable moderation. They censor people according to their company biases, by political request, or with and on behalf of the three-letter Government agencies. They censor them at the behest of private companies, or to control information on their products and the public policy debate around them. Censorship itself creates mistrust in our discourse. To trust the big platforms always to do the right thing is wrong. It is not right that they should be able to hide behind their terms of service, bury issues in the Ofcom processes in the Bill, or potentially pay lip service to a tick-box exercise of merely “having regard” to the importance of freedom of expression. They might think they can just write a report, hire a few overseers, and then get away scot-free with their cynical accumulation, and the sale of the data of their addicted users and the manipulation of their views.

The Government have rightly acknowledged that addressing such issues of online safety is a work in progress, but we must not think that the big platforms are that interested in helping. They and their misery models are the problem. I hope that the Government, and those in the other place, will include in the Bill stronger duties to stop things that are harmful, to promote freedom of expression properly, to ensure that people have ready and full access to the full range of ideas and opinions, and to be fully transparent, in public and in real time, about the way that content is promoted or depromoted on their platforms. Just to trust in them is insufficient. I am afraid the precedent has been set that digital platforms can be used to censor ideas. That is not the future; that is happening right now, and when artificial intelligence comes, it will get even worse. I trust that my colleagues on the Front Bench and in the other place will work hard to improve the Bill as I know it can be improved.

Rachel Maclean (Redditch) (Con)

I strongly support the Bill. This landmark piece of legislation promises to put the UK at the front of the pack, and I am proud to see it there. We must tackle online abuse while protecting free speech, and I believe the Bill gets that balance right. I was pleased to serve on the Bill Committee in the last Session, and I am delighted to see it returning to the Chamber. The quicker it can get on to the statute book, the more children we can protect from devastating harm.

I particularly welcome the strengthened protections for children, which require platforms to clearly articulate in their terms of service what they are doing to enforce age requirements on their site. That will go some way to reassuring parents that their children’s developing brains will not be harmed by early exposure to toxic, degrading, and demeaning extreme forms of pornography. Evidence is clear that early exposure over time warps young girls’ views of what is normal in a relationship, with the result that they struggle to form healthy equal relationships. For boys, that type of sexual activity is how they learn about sex, and it normalises abusive, non-consensual and violent acts. Boys grow up into men whose neural circuits become habituated to that type of imagery. They actually require it, regardless of the boundaries of consent that they learn about in their sex education classes—I know this is a difficult and troubling subject, but we must not be afraid to tackle it, which is what we are doing with the Bill. It is well established that the rise of that type of pornography on the internet over time has driven the troubling and pernicious rise in violence against women and girls, perpetrated by men, as well as peer-on-peer child sexual abuse and exploitation.

During Committee we had a good debate about the need for greater criminal sanctions to hold directors individually to account and drive a more effective safety culture in the boardroom. I am proud to serve in the Chamber with my hon. Friends the Members for Stone (Sir William Cash) and for Penistone and Stocksbridge (Miriam Cates). I have heard about all their work on new clause 2 and commend them heartily for it. I listened carefully to the Minister’s remarks in Committee and thank him and the Secretary of State for their detailed engagement.

Online Safety Bill

Marcus Fysh Excerpts
Finally, as I said at the start, the Bill is not perfect and there is still much work to be done, but if we can agree the final changes we are discussing and, indeed, if their Lordships are prepared to endorse that next week, the very real prize to be won is that Ofcom can begin the work that it needs to do sooner rather than later and we can bring nearer the benefits that this legislation can deliver for the vulnerable online. More than that, we can enhance the reputation of Parliament as we show that we can do difficult legislation in otherwise fractious times with sincerity, seriousness and a willingness to compromise. I think that is a valuable prize and one within our grasp, and it is why I shall support the Government amendments.
Mr Marcus Fysh (Yeovil) (Con)

It is a pleasure to follow my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), who made a characteristically thoughtful speech. At the outset, I want to put on record my entry in the Register of Members’ Financial Interests, and also my chairmanships of the all-party parliamentary groups on digital identity and on central bank and digital currency, which includes stablecoins. I also put on record the fact that I am the father of an eight-year-old girl and a nine-year-old girl, who have both just got iPads, and I am very aware of the need to protect them, as well as all other children in the UK.

I just want to say that I have had good engagement with Ministers during the progress of this Bill through all of its stages, and I want to thank them and their teams for that. I also want to say that I really welcome what is now in the Bill to progress what I talked about in this place at the last stage it was discussed: the effect of algorithms and all of those design features that contribute to the addiction that several Members have described as a potential harm. I think it is really good that that is in the Bill, and it is really good that the tech companies are being forced to think hard about these things.

My amendments—they are proposals for amendments rather than ones I realistically thought we would adopt through votes today—were designed to address a couple of potential shortcomings I saw in the Bill. One was the potential chilling effect on innovation and on the use of really important services that are user-to-user services from a technical point of view, but are not involved in the transmission of the content we are trying to deal with as the main objectives of this Bill. So it was very welcome to hear my hon. Friend the Minister speak at the Dispatch Box about the Government’s intention not to cover the sorts of services to do with data exchange and multi-party computation—some of the more modern methods by which the internet of things, artificial intelligence and various other types of platform run—which are not about making content available that could be a risk in the context we are talking about.

The other shortcoming I was trying to address was this idea, coming back to my right hon. and learned Friend the Member for Kenilworth and Southam, of the potential for the introduction of systemic weaknesses and vulnerabilities into the core systems that all our communications, many of our services, Government services and others rely on day by day for their secure operation. I think he made a very interesting point about the need to think through the precise legal impact that the potential uncertainty about some of those issues might have on the operation of those systems.

I am trying to introduce amendments—for example, amendment (a) in lieu of Lords amendment 189—essentially to provide clarification. This is particularly important when we are thinking about the remote access powers or the remote viewing of information powers in Lords amendment 189, which is why I have proposed an amendment in lieu. It is incredibly important that what we do in this Bill does not create the really fundamental weaknesses that could undermine the security that we and all of our systems rely on for their core operations.

I was also trying to address people’s understandable desire for their data not to be potentially accessible by an unauthorised third party. That type of systemic weakness, which could be introduced by doing the access process in the wrong way, is something we need to think carefully about, and I hope the Minister will say something about intent in respect of that at the Dispatch Box.

I do not want to take too much more time because I know that lots of other Members wish to speak, but the place where I got these ideas, particularly around systemic weakness, was the powers in Australian law that exist to provide protection from exactly that type of application of the regulations. I know officials think that Lords amendment 189 does not present such a systemic risk, because it is about viewing information remotely rather than having access to the system directly, but I think that needs more clarity. It actually states:

“view remotely—information…in real time”,

which could potentially be interpreted as requiring that type of access.

On proportionality—this is my last point—we must think about the concept of necessity within that. We must try to strike the right balance—I hope we will all try to do this—between encouraging tech firms to divulge how their systems work, and giving people, including the Government, tools to say when something is not working well and to opt out of it, while ensuring that the fundamental operations used in cryptography and computer systems to communicate with each other securely are not inadvertently undermined.