All 4 Rachel Maclean contributions to the Online Safety Act 2023

Mon 5th Dec 2022 - Online Safety Bill
Tue 13th Dec 2022 - ONLINE SAFETY BILL (First sitting), Public Bill Committees, Committee stage (re-committed clauses and schedules): 1st sitting
Thu 15th Dec 2022 - ONLINE SAFETY BILL (Third sitting), Public Bill Committees, Committee stage (re-committed clauses and schedules): 3rd sitting
Tue 17th Jan 2023 - Online Safety Bill

Online Safety Bill

Rachel Maclean Excerpts
Monday 5th December 2022
Priti Patel

Exactly that. My hon. Friend is absolutely right. I come back to the point about drafting this legislation, which is neither straightforward nor easy because of the definitions. It is not just about what is in scope of the Bill but about the implications of the definitions and how they could be applied in law.

The Minister touched on the criminal side of things; interpretation in the criminal courts and how that would be applied in case law are the points that need to be fleshed out. This is where our work on counter-terrorism is so important, because we have been consistent across the world with our Five Eyes partners. Again, there are good models out there that can be built upon. We will not fix all this through one Bill—we know that. This Bill is foundational, which is why we must move forward.

On new clause 11, I seek clarity—in this respect, I need reassurance not from the Minister but from other parts of government—on how victims and survivors, whether of terrorist activity, domestic abuse or violence against women and girls, will be supported and protected by the new safeguards in the Bill, and by the work of the Victims’ Commissioner.

Rachel Maclean (Redditch) (Con)

I thank my right hon. Friend for sharing her remarks with the House. She is making an excellent speech based on her considerable experience. On the specific issue of child sexual abuse and exploitation, many organisations, such as the Internet Watch Foundation, are instrumental in acting on reports and removing web pages containing that vile and disgusting material. In the April 2020 White Paper, the Government committed to look at how the Internet Watch Foundation could use its technical expertise in that field. Does she agree that it would be good to hear from the Minister about how the Internet Watch Foundation could work with Ofcom to assist victims?

ONLINE SAFETY BILL (First sitting)

Rachel Maclean Excerpts
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 13 December 2022
Kirsty Blackman

Thank you, Sir Roger. It was helpful to hear the Minister’s clarification of age assurance and age verification, and it was useful for him to put on the record the difference between the two.

I have a couple of points. In respect of Ofcom keeping up to date with the types of age verification and the processes, new ones will come through and excellent new methods will appear in coming years. I welcome the Minister’s suggestion that Ofcom will keep up to date with that, because it is incredibly important that we do not rely on, say, the one provider that there is currently, when really good methods could come out. We need the legislation to ensure that we get the best possible service and the best possible verification to keep children away from content that is inappropriate for them.

This is one of the most important parts of the Bill for ensuring that we can continue to have adult sections of the internet—places where there is content that would be disturbing for children, as well as for some adults—and that an age-verification system is in place to ensure that that content can continue to be there. Websites that require a subscription, such as OnlyFans, need to continue to have in place the age-verification systems that they currently have. By writing into legislation the requirement for them to continue to have such systems in place, we can ensure that children cannot access such services but adults can continue to do so. This is not about what is banned online or about trying to make sure that this content does not exist anywhere; it is specifically about gatekeeping to ensure that no child, as far as we can possibly manage, can access content that is inappropriate for kids.

There was a briefing recently on children’s access to pornography, and we heard horrendous stories. It is horrendous that a significant number of children have seen inappropriate content online, and the damage that that has caused to so many young people cannot be overstated. Blocking access to adult parts of the internet is so important for the next generation, not just so that children are not disturbed by the content they see, but so that they learn that it is not okay and normal and understand that the depictions of relationships in pornography are not the way reality works, not the way reality should work and not how women should be treated. Having a situation in which Ofcom or anybody else is better able to take action to ensure that adult content is specifically accessed only by adults is really important for the protection of children and for protecting the next generation and their attitudes, particularly towards sex and relationships.

Rachel Maclean (Redditch) (Con)

I wish to add some brief words in support of the Government’s proposals and to build on the comments from Members of all parties.

We know that access to extreme and abusive pornography is a direct factor in violence against women and girls, and we see that play out in the court system every day. People on trial claim to have watched and become addicted to this type of pornography and to have sought to act it out in their relationships; that has resulted in the deaths of women. The platforms already have technology that allows them to establish the age of the people using them. The Bill seeks to ensure that they use it for a good end, so I thoroughly support it. I thank the Minister.

Damian Collins (Folkestone and Hythe) (Con)

There are two very important and distinct issues here. One is age verification. The platforms ask adults who have identification to verify their age; if they cannot verify their age, they cannot access the service. Platforms have a choice within that. They can design their service so that it does not have adult content, in which case they may not need to build in verification systems—the platform polices itself. However, a platform such as Twitter, which allows adult content on an app that is open to children, has to build in those systems. As the hon. Member for Aberdeen North mentioned, people will also have to verify their identity to access a service such as OnlyFans, which is an adult-only service.

ONLINE SAFETY BILL (Third sitting)

Rachel Maclean Excerpts
Committee stage (re-committed clauses and schedules)
Thursday 15th December 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 15 December 2022
Caroline Ansell (Eastbourne) (Con)

I rise to recognise the spirit and principle behind new clause 9, while, of course, listening carefully to the comments made by my hon. Friend the Member for Folkestone and Hythe. He is right to raise those concerns, but my question is: is there an industry-specific way in which the same responsibility and liability could be delivered?

I recognise too that the Bill is hugely important. It is a good Bill that has child protection at its heart. It also contains far more significant financial penalties than we have previously seen—as I understand it, up to 10% of qualifying worldwide revenue or £18 million, whichever is greater. This will drive some change, but it comes against the backdrop of multi-billion-pound technology companies.

I would be interested to understand whether a double lock of board-level responsibility might further protect children from some of the harrowing and harmful content we see online. What we need is nothing short of transformation and significant culture change. Even today, The Guardian published an article about TikTok and a study by the Centre for Countering Digital Hate, which found that teenagers who demonstrated an interest in self-harm and eating disorders had that content pushed to them by the algorithms within minutes. That is most troubling.

We need significant, serious and sustained culture change. There is precedent in other sectors, as has been mentioned, and there was a previous recommendation, so clearly there is merit in this. My understanding is that there is strong public support, because the public recognise that this new responsibility cannot be strengthened by anything other than liability. If there is board-level liability, that will drive priorities and resources, which will broker the kind of change we are looking for. I look forward to what the Minister might share today, as this has been a good opportunity to bring these issues into further consideration, and they might then be carried over into subsequent stages of this excellent Bill.

Rachel Maclean (Redditch) (Con)

I would like to build on the excellent comments from my colleagues and to speak about child sexual abuse material. I thank my hon. Friends the Members for Penistone and Stocksbridge (Miriam Cates) and for Stone for tabling the amendment. I am very interested in how we can use the excellent provisions in the Bill to keep children safe from child sexual abuse material online. I am sure the Committee is aware of the devastating impact of such material.

Sexual abuse imagery—of girls in particular—is increasingly prevalent. We know that 97% of this material in 2021 showed female children. The Internet Watch Foundation took down a record-breaking 252,000 URLs that had images of children being raped, and seven in 10 of those images were of children aged 11 to 13. Unfortunately, the National Crime Agency estimates that between 550,000 and 850,000 people in the UK are searching for such material on the internet. They are actively looking for it, and at the moment they are able to find it.

My concern is with how we use what is in the Bill already to instil a top-down culture in companies, because this is about culture change in the boardroom, so that safety is considered with every decision. I have read the proceedings from previous sittings, and I recognise that the Government and Ministers have said that we have sufficient provisions to protect children, but I think there is a little bit of a grey area with tech companies.

I want to mention Apple and the update it had been planning for several years: an update that would have automatically scanned for child sexual abuse material. Apple withdrew it following a backlash from encryption and privacy experts, who claimed that it would undermine the privacy and security of iCloud users and make people less safe on the internet. Having previously said that it would pause the update to improve it, Apple now says that it has stopped it altogether and is vastly expanding its end-to-end encryption, even though law enforcement agencies around the world, including our own UK agencies, have expressed serious concerns that this makes investigation and prosecution more challenging. None of us are technical experts, and I do not believe we are in a position to judge how legitimate Apple's pause is. What we do know is that while the pause continues, the risks to children are still there, proliferating online.

We understand completely that countering this material involves a complicated balance, and that the tech giants need to walk a fine line between keeping users safe and keeping their data safe. But the question is this: if Apple and others continue to delay or backtrack, will sanctions for merely failing to comply with an information request—which is what is in the Bill now—be enough to protect children from harm? Could they delay indefinitely and still be compliant with the Bill? That is what I am keen to hear from the Minister. I would be grateful if he could set out why he thinks that individuals who have the power to prevent the harmful content that has torn apart the lives of so many young people and their families should not face criminal consequences if they fail to do so. Can he also reassure us about how the Bill will protect children—far too many children—from this material online?

Alex Davies-Jones

Labour supports new clause 9, as liability is an issue that we have raised repeatedly throughout the passage of the Bill—most recently, on Report. As colleagues will be aware, the new clause would introduce criminal liability for directors who fail to comply with their duties. That would be an appropriate first step in establishing a direct relationship between the senior management of platforms and companies and their responsibility to protect children from significant harm. As we have heard, the measure would drive a more effective culture of awareness and accountability on online safety at the top of, and throughout, the regulated firm, and it would go some way towards ensuring that online safety sits at the heart of internal governance structures. The Bill must go further to actively promote cultural change and put online safety at the forefront of business models; it must ensure that senior managers understand that keeping people safe comes before any profit. A robust corporate and senior management liability scheme is needed—one that imposes personal liability on directors when they put children at risk.

The Minister knows as well as I do that the benefits of doing so would be strong. We have only to turn to the coroner’s comments in the tragic case of Molly Russell’s death—which I know we are all mindful of as we debate this Bill—to fully understand the damaging impact of viewing harmful content online. I therefore urge the Minister to accept new clause 9, which we wholeheartedly support.

Online Safety Bill

Rachel Maclean Excerpts
Tuesday 17th January 2023
Mr Marcus Fysh (Yeovil) (Con)

It is a pleasure to follow my right hon. Friend the Member for Chelmsford (Vicky Ford), who made a very powerful speech, and I completely agree with her about the importance of treating eating disorders as being of the same scale of harm as other things in the Bill.

I was the media analyst for Merrill Lynch about 22 years ago, and I made a speech about the future of media in which I mentioned the landscape changing towards one of self-generated media. However, I never thought we would get to where it is now and what the effect is. I was in the Pizza Express on Gloucester Road the other day at birthday party time, and an 11-year-old boy standing in the queue was doomscrolling TikTok videos rather than talking to his friends, which I just thought was a really tragic indication of where we have got to.

Digital platforms are also critical sources of information and our public discourse. Across the country, people gather up to 80% of information from such sources, but we should not have trust in them. Their algorithms, which promote and depromote, and their interfaces, which engage, are designed, as we have heard, to make people addicted to the peer validation and augmentation of particular points of view. They are driving people down tribal rabbit holes to the point where they cannot talk to each other or even listen to another point of view. It is no wonder that 50% of young people are unhappy or anxious when they use social media, and these algorithmic models are the problem. Trust in these platforms is wrong: their promotion or depromotion of messages and ideas is opaque, often subjective and subject to inappropriate influence.

It is right that we tackle illegal activity and that harms to children and the vulnerable are addressed, and I support the attempt to do that in the Bill. Those responsible for the big platforms must be held to account for how they operate them, but trusting in those platforms is wrong, and I worry that compliance with their terms of service might become a tick-box absolution of their responsibility for unhappiness, anxiety and harm.

What about harm to our public sphere, our discourse, and our processes of debate, policymaking and science? To trust the platforms in all that would be wrong. We know they have enabled censorship. Elon Musk’s release of the Twitter files has shown incontrovertibly that the big digital platforms actively censor people and ideas, and not always according to reasonable moderation. They censor people according to their company biases, by political request, or with and on behalf of the three-letter Government agencies. They censor them at the behest of private companies, or to control information on their products and the public policy debate around them. Censorship itself creates mistrust in our discourse. To trust the big platforms always to do the right thing is wrong. It is not right that they should be able to hide behind their terms of service, bury issues in the Ofcom processes in the Bill, or potentially pay lip service to a tick-box exercise of merely “having regard” to the importance of freedom of expression. They might think they can just write a report, hire a few overseers, and then get away scot-free with their cynical accumulation, and the sale of the data of their addicted users and the manipulation of their views.

The Government have rightly acknowledged that addressing such issues of online safety is a work in progress, but we must not think that the big platforms are that interested in helping. They and their misery models are the problem. I hope that the Government, and those in the other place, will include in the Bill stronger duties to stop things that are harmful, to promote freedom of expression properly, to ensure that people have ready and full access to the full range of ideas and opinions, and to be fully transparent in public and real time about the way that content is promoted or depromoted on their platforms. Just to trust in them is insufficient. I am afraid the precedent has been set that digital platforms can be used to censor ideas. That is not the future; that is happening right now, and when artificial intelligence comes, it will get even worse. I trust that my colleagues on the Front Bench and in the other place will work hard to improve the Bill as I know it can be improved.

Rachel Maclean (Redditch) (Con)

I strongly support the Bill. This landmark piece of legislation promises to put the UK at the front of the pack, and I am proud to see it there. We must tackle online abuse while protecting free speech, and I believe the Bill gets that balance right. I was pleased to serve on the Bill Committee in the last Session, and I am delighted to see it returning to the Chamber. The quicker it can get on to the statute book, the more children we can protect from devastating harm.

I particularly welcome the strengthened protections for children, which require platforms to clearly articulate in their terms of service what they are doing to enforce age requirements on their site. That will go some way to reassuring parents that their children’s developing brains will not be harmed by early exposure to toxic, degrading, and demeaning extreme forms of pornography. Evidence is clear that early exposure over time warps young girls’ views of what is normal in a relationship, with the result that they struggle to form healthy equal relationships. For boys, that type of sexual activity is how they learn about sex, and it normalises abusive, non-consensual and violent acts. Boys grow up into men whose neural circuits become habituated to that type of imagery. They actually require it, regardless of the boundaries of consent that they learn about in their sex education classes—I know this is a difficult and troubling subject, but we must not be afraid to tackle it, which is what we are doing with the Bill. It is well established that the rise of that type of pornography on the internet over time has driven the troubling and pernicious rise in violence against women and girls, perpetrated by men, as well as peer-on-peer child sexual abuse and exploitation.

During Committee we had a good debate about the need for greater criminal sanctions to hold directors individually to account and drive a more effective safety culture in the boardroom. I am proud to serve in the Chamber with my hon. Friends the Members for Stone (Sir William Cash) and for Penistone and Stocksbridge (Miriam Cates). I have heard about all their work on new clause 2 and commend them heartily for it. I listened carefully to the Minister’s remarks in Committee and thank him and the Secretary of State for their detailed engagement.