Caroline Ansell: debates involving the Department for Digital, Culture, Media & Sport


Online Safety Bill (Third sitting)

Caroline Ansell Excerpts
Committee stage (re-committed clauses and schedules)
Thursday 15th December 2022

Public Bill Committees
Damian Collins

Forgive me, Dame Angela.

Caroline Ansell (Eastbourne) (Con)

I rise to recognise the spirit and principle behind new clause 9, while, of course, listening carefully to the comments made by my hon. Friend the Member for Folkestone and Hythe. He is right to raise those concerns, but my question is: is there an industry-specific way in which the same responsibility and liability could be delivered?

I recognise too that the Bill is hugely important. It is a good Bill that has child protection at its heart. It also contains far more significant financial penalties than we have previously seen—as I understand it, up to £18 million or 10% of qualifying revenue, whichever is greater. This will drive some change, but it comes against the backdrop of multi-billion-pound technology companies.

I would be interested to understand whether a double lock of board-level responsibility might further protect children from some of the harrowing and harmful content we see online. What we need is nothing short of transformation and significant culture change. Only today, The Guardian published an article about TikTok and a study by the Centre for Countering Digital Hate, which found that teenagers who showed an interest in self-harm and eating disorders had that content pushed to them by the algorithms within minutes. That is most troubling.

We need significant, serious and sustained culture change. There is precedent in other sectors, as has been mentioned, and there was a previous recommendation, so clearly there is merit in this. My understanding is that there is strong public support, because the public recognise that this new responsibility cannot be strengthened by anything other than liability. If there is board-level liability, that will drive priorities and resources, which will bring about the kind of change we are looking for. I look forward to what the Minister might share today; this has been a good opportunity to bring these issues into further consideration, and they might then be carried over into subsequent stages of this excellent Bill.

Rachel Maclean (Redditch) (Con)

I would like to build on the excellent comments from my colleagues and to speak about child sexual abuse material. I thank my hon. Friends the Members for Penistone and Stocksbridge (Miriam Cates) and for Stone for tabling the amendment. I am very interested in how we can use the excellent provisions in the Bill to keep children safe from child sexual abuse material online. I am sure the Committee is aware of the devastating impact of such material.

Sexual abuse imagery—of girls in particular—is increasingly prevalent. We know that 97% of this material in 2021 showed female children. The Internet Watch Foundation took down a record-breaking 252,000 URLs that had images of children being raped, and seven in 10 of those images were of children aged 11 to 13. Unfortunately, the National Crime Agency estimates that between 550,000 and 850,000 people in the UK are searching for such material on the internet. They are actively looking for it, and at the moment they are able to find it.

My concern is with how we use what is in the Bill already to instil a top-down culture in companies, because this is about culture change in the boardroom, so that safety is considered with every decision. I have read the proceedings from previous sittings, and I recognise that the Government and Ministers have said that we have sufficient provisions to protect children, but I think there is a little bit of a grey area with tech companies.

I want to mention Apple and the update it had been planning for quite a few years, which would have automatically scanned for child sexual abuse material. Apple withdrew it following a backlash from encryption and privacy experts, who claimed it would undermine the privacy and security of iCloud users and make people less safe on the internet. Having previously said that it would pause the update to improve it, Apple now says that it has stopped it altogether and is vastly expanding its end-to-end encryption, even though law enforcement agencies around the world, including our own UK law enforcement agencies, have expressed serious concerns because it makes investigations and prosecution more challenging. None of us is a technical expert, and I do not believe that we are in a position to judge how legitimate it is for Apple to have this pause. What we do know is that while the pause lasts, the risks for children are still there, proliferating online.

We understand completely that countering this material involves a complicated balance and that the tech giants need to walk a fine line between keeping users safe and keeping their data safe. But the question is this: if Apple and others continue to delay or backtrack, will the sanction for merely failing to comply with an information request—which is all that is in the Bill now—be enough to protect children from harm? Could they delay indefinitely and still be compliant with the Bill? That is what I am keen to hear from the Minister. I would be grateful if he could set out why he thinks that individuals who have the power to prevent the harmful content that has torn apart the lives of so many young people and their families should not face criminal consequences if they fail to do so. Can he reassure us as to how the Bill can protect so many children—and it is far too many children—from this material online?

Online Safety Bill (Third sitting)

Caroline Ansell Excerpts
Committee stage & Committee Debate - 3rd sitting
Thursday 26th May 2022

Public Bill Committees
Caroline Ansell (Eastbourne) (Con)

Q May I ask about anonymity? It is mentioned in the Bill, but only once. Do you think there is a need for more expansive coverage of this issue? Do you think people should be able to use the internet while remaining anonymous, and if not, to whom would users disclose their identity? Would it be to the platform, or would it be more publicly than that?

Stephen Kinsella: There are a few questions there, obviously. I should say that we are happy with the approach in the Bill. We always felt that focusing on anonymity was the wrong place to start. Instead, we thought that a positive right to be verified, and then a right to screen out replies and posts from unverified accounts, was the way to go.

In terms of who one should make the disclosure to, or who would provide the verification, our concern was always that we did not want to provide another trove of data that the platforms could use to target us with adverts and otherwise monetise. While we have tried to be agnostic on the solution—again, we welcome the approach in the Bill, which is more about principles and systems than trying to pick outcomes—there are third-party providers out there that could provide one-stop verification. Some of them, for instance, rely on the open banking principles. The good thing about the banks is that under law, under the payment services directive and others, we are the owners of our own data. It is a much greyer area whether we are the owners of the data that the social media platforms hold on us, so using that data that the banks have—there is a solution called One ID, for instance—they will provide verification, and you could then use that to open your social media accounts without having to give that data to the platforms.

I saw in the evidence given to you on Tuesday that it was claimed that 80% of users are reluctant to give their data to platforms. We were surprised by that, and so we looked at it. They chose their words carefully. They said users were reluctant to give their data to “certain websites”. What they meant was porn sites. In the polling they were referring to, the question was specifically about willingness to share data with porn sites, and people are, understandably, reluctant to do that. When using open banking or other systems, there are good third-party providers, I would suggest, for verification.

Caroline Ansell

Q May I ask a quick supplementary about positive verification, before others contribute? A contributor to a previous session said there was a genuinely held reluctance by some to be verified, and that verification therefore suppressed democratic engagement. Do you recognise that as an issue, or a fault line in the verification argument?

Stephen Kinsella: Very much not. We have conducted polling using YouGov. Compassion in Politics did polling using Opinium. The figures vary slightly, but at a minimum, two in three citizens—often four out of five citizens—are very willing to be verified and would like the opportunity to be verified if it meant that they could then screen out replies from unverified accounts. I would say there is a weight of evidence on this from the polling. By the way, we would be very happy to conduct further polling, and we would be very happy to consult with the Committee on the wording of the questions that should be put, if that would be helpful, but I think we are quite confident what the response would be.

Liron Velleman: We set two clear tests for the situation on anonymity on platforms. First, will it harm the ability of some groups in society to have freedom of speech online? We are concerned that verification could harm the ability of LGBT people and domestic abuse survivors to use the platforms in the full ways they wish to. For example, if a constituent who is, say, a domestic abuse survivor or LGBT wished to get in touch with you but was not verified on the platform, that would be one restriction you would not be able to get around if you chose to change your settings.

Caroline Ansell

Q Would that be an argument for their identity verification being at platform level, rather than any wider public identity?

Liron Velleman: That could be very possible. One of our key questions is whether verification would mean that you had to use your real name on the platform or whether you had to verify that you were a person who was using a platform, but could then use a pseudonym on the front face of the website. I could sign up and say, “Here is my ID for the platform verification”, but if I did not wish to use my name, in order to protect my actual identity publicly on the platform, I could choose not to but still be verified as a real person. It would be different to having to have my name, Liron Velleman, as the user for Facebook or Twitter or any other platform.

The second test for us is whether it is going to make a real difference to reducing online harm. With a lot of the harm we see, people are very happy to put their names to the racism, misogyny, sexism and homophobia that they put online. We would not want to see a huge focus on anonymity whereby we “ended” anonymity online and yet online harm continued to propagate. We believe it would still continue, and we would not want people to be disappointed that ending anonymity had not completely solved the issue. Of course, there are a huge number of anonymous accounts online that carry out abuse. Anything we can do to reduce that is welcome, but we do not see it as the silver bullet that could end racism online.

Stephen Kinsella: Obviously, we have not suggested that there is a silver bullet. We are talking about responding to what users want. A lot of users want the ability to say that they do not want to interact with people who are not using their real name. That does not mean that one could not envisage other levels of filter. You could have a different filter that said, “I am happy to interact with people who are verified to be real, but I don’t require that they have given their name”. The technology exists there, certainly to provide a menu of solutions. If you could only have one, we happen to think ours is the best, and that the evidence shows it would reduce a significant amount of disinformation spread and, certainly, abuse.

Danny Stone: I think one issue will be Ofcom’s ability to ensure consistency in policing. It is very difficult, actually, to find out where crimes have happened and who an individual is. Sometimes, the police have the power to compel the revelation of identity. The way the platforms respond is, I think, patchy, so Ofcom’s position in its guidance here will be pretty important.

The Chair

Thank you. We have time for a question from Navendu Mishra before we bring the Minister in.

Online Safety Bill (Fourth sitting)

Caroline Ansell Excerpts
Committee stage & Committee Debate - 4th sitting
Thursday 26th May 2022

Public Bill Committees
Kirsty Blackman

Q I have a quick question for Poppy, although I am afraid it might not have a quick answer. How much of an impact does the algorithmic categorisation of things—the way we are fed things on social media—have on our lives? Do you think it is steering people towards more and more extreme content? Or is it a totally capitalist thing that is not harmful, and just something that sells us things every so often?

Poppy Wood: I think it goes without saying that the algorithmic promotion of harmful content is one of the biggest issues with the model we have in big tech today. It is not the individual pieces of content in themselves that are harmful. It is the scale over which they spread out—the amplification of them; the targeting; the bombardment.

If I see one piece of flat-earth content, that does not necessarily harm me; I probably have other counter-narratives that I can explore. What we see online, though, is that if you engage with that one piece of flat-earth content, you are quickly recommended something else—“You like this, so you’ll probably like that”—and then, before you know it, you are in a QAnon conspiracy theory group. I would absolutely say that the algorithmic promotion of harmful content is a real problem. Does that mean we ban algorithms? No. That would be like turning off the internet. You have to go back and ask how it is that that kind of harm is promoted, and how it is that human behaviour is being exploited—it is human nature to be drawn to things that we cannot resist. That is something that the Bill really needs to look at.

The risk assessments, particularly for illegal content and content that is harmful to children, explicitly reference algorithmic promotion and the business model. Those are two really big things that you touched on in the question. The business model is to make money from our time spent online, and the algorithms serve us up the content that keeps us online. That is accounted for very well in the risk assessments. Some of the safety duties do not necessarily account for it, though, because risk assessing for something is not the same as mitigating it. Say you identify that your business model does promote harmful content; under the Bill, you do not have to mitigate that all the time. So I think there are questions around whether the Bill could go further on algorithmic promotion.

If you do not mind, I will quickly come back to the question you asked Eva about reporting. We just do not know whether reporting is really working because we cannot see—we cannot shine a light into these platforms. We just have to rely on them to tell us, “Hey, reporting is working. This many pieces of content were reported and this many pieces of content were taken down.” We just do not know if that is true. A big part of this regime has to be about transparency. It already is, but I think it could go much further in enabling Ofcom, Government, civil society and researchers to say, “Hey, you said that many pieces of content were reported and that many pieces of content were taken down, but actually, it turns out that none of that is true. We are still seeing that stuff online.” Transparency is a big part of the solution around understanding whether reporting is really working and whether the platforms are true to their word.

Caroline Ansell (Eastbourne) (Con)

Q May I ask a follow-up question on that? Poppy, you referenced risk assessments. Would you value and welcome more specifics around quality standards and minimum requirements on risk assessments? My main question is about privacy and anonymity, but I would appreciate a word on risk assessments.

Poppy Wood: Absolutely. I know that children’s groups are asking for minimum standards for children’s risk assessments, but I agree that they should be across the board. We should be looking for the best standards that we can get. I really do not trust the platforms to do these things properly, so I think we have to be really tough with them about what we expect from them. We should absolutely see minimum standards.

Caroline Ansell

Q Do you think Ofcom has the resources that it would require to push for an independent audit of risk assessments?

Poppy Wood: Obviously Ofcom is growing. The team at Ofcom are fantastic, and they are hiring really top talent. They have their work cut out in dealing with some of the biggest and wealthiest companies in the world. They need to be able to rely on civil society and researchers to help them to do their job, but I do not think we should rule out Ofcom being able to do these things. We should give it the powers to do them, because that makes this regime have proper teeth. If we find down the line that, actually, it is too much, that is for the Government to sort out with resourcing, or for civil society and researchers to support, but I would not want to rule things out of the Bill just because we think Ofcom cannot do them.

Caroline Ansell

Q What are your thoughts on the balance between privacy and anonymity?

Poppy Wood: Of course, the Bill has quite a unique provision for looking at anonymity online. We have done a big comparison of online safety regulations across the world, and nobody is looking at anonymity in the same way as the UK. It is novel, and with that comes risk. Let us remember that anonymity is a harm reduction mechanism. For lots of people in authoritarian regimes, and even for those in the UK who are survivors of domestic abuse or who want to explore their sexuality, anonymity is a really powerful tool for reducing harm, so we need to remember that when we are talking about anonymity online.

One of my worries about the anonymity agenda in the Bill is that it sounds really good and will resonate really well with the public, but it is very easy to get around, and it would be easy to oversell it as a silver bullet for online harm. VPNs exist so that you can be anonymous. They will continue to exist, and people will get around the rules, so we need to be really careful with the messaging on what the clauses on anonymity really do. I would say that the whole regime should be a privacy-first regime. There is much more that the regime can do on privacy. With age verification, it should be privacy first, and anonymity should be privacy first.

I also have some concerns about the watering down of privacy protections from the draft version of the Bill. I think the language was “duty to account for the right to privacy”, or something, and that right-to-privacy language has been taken out. The Bill could do more on privacy, remembering that anonymity is a harm-reducing tool.

Caroline Ansell

Q Eva, there is just one reference to anonymity in the Bill currently. Do you think there is an opportunity to express a fuller, more settled opinion and potentially expand on that juxtaposition?

Eva Hartshorn-Sanders: I heard the advice that the representative of the Information Commissioner’s Office gave earlier—he feels that the balance is right at the moment. It is important to incorporate freedom of speech and privacy within this framework in a democratic country. I do not think we need to add anything more than that.

Kim Leadbeater

Q Thank you to the witnesses for joining us this afternoon. May I ask for your views on the clauses on journalistic content exemption and democratic content exemption? Do you think that these measures are likely to be effective?

Poppy Wood: I know you have spoken a lot about this over the past few days, but the content of democratic importance clause is a layer of the Bill that makes the Bill very complicated and hard to implement. My concern about these layers of free speech—whether it is the journalistic exemption, the news media exemption or the content of democratic importance clause—is that, as you heard from the tech companies, they just do not really know what to do with it. What we need is a Bill that can be implemented, so I would definitely err on the side of paring back the Bill so that it is easy to understand and clear. We should revisit anything that causes confusion or is obscure.

The clause on content of democratic importance is highly problematic—not just because it makes the Bill hard to implement and we are asking the platforms to decide what democratic speech is, but because I think it will become a gateway for the sorts of co-ordinated disinformation that we spoke about earlier. Covid disinformation for the past two years would easily have been a matter of public policy, and I think the platforms, because of this clause, would have said, “Well, if someone’s telling you to drink hydroxychloroquine as a cure for covid, we can’t touch that now, because it’s content of democratic importance.”

I have another example. In 2018, Facebook said that it had identified and taken down a page called “Free Scotland 2014”—four years after it was set up. It was a Russian/Iranian-backed page that promoted falsehoods in support of Scottish independence using fake news websites, with articles about the Queen and Prince Philip wanting to give themselves a pay rise by stealing from the poor. It was total nonsense, but that is easily content of democratic importance. Even though it was backed by fake actors—and, as we have said, I do not think there is anything in the Bill to preclude that at the moment, or at least to get the companies to focus on it—in 2014, that content would have been content of democratic importance, and the platforms took four years to take it down.

I think this clause would mean that that stuff became legitimate. It would be a major loophole for hate and disinformation. The best thing to do is to take that clause out completely. Clause 15(3) talks about content of democratic importance applying to speech across a diverse range of political opinion. Take that line in that subsection and put it in the freedom of expression clause—clause 19. What you then have is a really beefed-up freedom of expression clause that talks about political diversity, but you do not have layers on top of it that mean bad actors can promote hate and disinformation. I would say that is a solution, and that will make the Bill much easier to implement.

--- Later in debate ---
Alex Davies-Jones

Q I would just like to query your thoughts on a right to redress for victims. Do you think that having an ombudsman in the Bill would be appropriate, and what would you like to see to support victims of fraud?

Martin Lewis: As you will know, I had to sue Facebook for defamation, which is a ridiculous thing to do in order to stop scam adverts. I was unable to report the scam adverts to the police, because I had not been scammed—even though it was my face that was in them—and many victims were not willing to come forward. That is a rather bizarre situation, and we got Facebook to put forward £3 million to set up Citizens Advice Scam Action—that is what I settled for, as well as a scam ad reporting tool.

There are two levels here. The problem is who is at fault. Of course, those mainly at fault for scams are the scammers. They are criminals and should be prosecuted, but not enough of them are. You have times when it is the bank’s fault. If a company has not put proper precautions in place, and people have got scammed because it has put up adverts or posts that it should have prevented, they absolutely need to have some responsibility for that. I think you will struggle to have a direct redress system put in place. I would like to see it, but it would be difficult.

What worries me is that the £3 million for Citizens Advice Scam Action, which was at least meant to provide help and support for victims of scams, is going to run out. I have not seen any more money coming from Facebook, Google or any of the other big players out there. If we are not going to fund direct redress, we could at least make sure that they fund a collective form of redress and help for the victims of scams, as a bare minimum. It is very strange that these firms go so quiet on this; what they say is, “We are doing everything we can.”

From my meetings with these firms—these are meetings with lawyers in the room, so I have to be slightly careful—one of the things that I would warn the Committee about is that they tend to get you in and give you a presentation on all the technological reasons why they cannot stop scam adverts. My answer to them after about 30 seconds, having stopped what was meant to be an hour-long presentation, is, “I have not framed the fact that you need a technological solution. I have said you need a solution. If the answer to stopping scam adverts, and to stopping scams, is that you have to pre-vet every single advert, as old-fashioned media did, and that every advert that you put up has to have been vetted by a human being, so be it. You’re making it a function of technology, but let’s be honest: this is a function of profitability.” We have to look at the profitability of these companies when it comes to redress. What your job is—if you forgive me saying this—is to make sure that it costs them more money to let people be scammed than it does to stop people being scammed. If we solve that, we will have a lot fewer scams on social media and on the search advertising.

Rocio Concha: I completely agree with everything that Martin says. At the moment, the provisions in the Bill for “priority illegal content” require the platforms to publish reports that say, “This is how much illegal content we are seeing on the platform, and these are the measures that we are going to take.” They are also required to have a way for users to report it and to complain when they think that the platforms are not doing the right thing. At the moment, that does not apply to fraudulent advertising, so you have an opportunity to fix that in the Bill very easily, to at least get the transparency out there. The platform has to say, “We are finding this”—that puts pressure on the platform, because it is there and is also with the regulator—“and these are the measures that we are taking.” That gives us transparency to say, “Are these measures enough?” There should also be an easy way for the user to complain when they think that platforms are not doing the right thing. It is a complex question, but there are many things in the Bill that you can improve in order to improve the situation.

Tim Fassam: I wonder if it would be useful to give the Committee a case study. Members may be familiar with London Capital & Finance. Now, London Capital & Finance is one of the most significant recent scams. It sold mini-bonds fraudulently, at a very high advertised return, which then collapsed, with individuals losing all their money.

Those individuals were compensated through two vehicles. One was a Government Bill, so they were compensated by the taxpayer. The others, because they were found to have been given financial advice despite LCF not having advice permissions or operating through a regulated product, went to the Financial Services Compensation Scheme, which our members, among others, pay for; legitimate financial services companies pay for it. The most recent estimate is over £650 million, and the expectation is that the cost to the economy will reach £1 billion at some point over the next few years.

LCF was heavily driven by online advertising, and we would argue that the online platforms were in fact probably the only people who could have stopped it happening. They have profited from those adverts and they have not contributed anything to either of those two schemes. We would argue—possibly not for this Bill—that serious consideration should be given to the tech platforms being part of the financial services compensation scheme architecture and contributing to the costs of scams that individuals have fallen foul of, as an additional incentive for them to get on top of this problem.

Martin Lewis: That is a very important point, but I will just pick up on what Rocio was saying. One of the things that I would like to see is much more rigid requirements on how the reporting of scams is put in place, because I cannot see proper pre-vetting happening with these technology companies, but we can at least rely on social policing and the reporting of scams. There are many people who recognise a scam, just as there are many people who do not recognise a scam.

However, I also think this is a wonderful opportunity to make sure that the method, the language and the symbols used for reporting scams are universal in the UK, so that whatever site you are on, you click the same symbol and the process is unified and works in the same way. You could then report a scam the same way on every site, which makes it simpler, means we can train people in how to do it, and makes the processes work.

Then, of course, we have to make sure that platforms act on the back of reports. But the variety of ways a scam is reported, the complexity, and the number of clicks you need to make mean that it is generally a lot easier to click on an advert than it is to report that advert as a scam. With so many scams out there, I think there should be parity of ease between those two things.

Caroline Ansell

Q May I ask, directly related to that, about the complaints procedure? What would you like to see in terms of changes there, to make it more unified, more universal and simpler? It has been suggested that it is not robust enough, not dynamic enough and not fast enough.

Rocio Concha: That is about complaints from users. At the moment, the Bill does not provide for this for fraudulent advertising, so we need to make sure that it is a requirement for the platforms to have an easy tool for people to complain and to report when they see something that is fraudulent. It is an easy fix; you can do it. Then the user will have that tool, and it would also give transparency to the regulator and to organisations such as ours, to see what is happening and what measures the platforms are taking.

Tim Fassam: I would agree with that. I would also highlight a particular problem that our members have flagged, and we have flagged directly with Meta and Instagram. Within the definition in the Bill of individuals who can raise concern about social media platforms, our members find they fall between two stools, because quite often what is happening is that people are claiming an association with a legitimate firm. So they will have a firm’s logo, or a firm’s web address, in their profile for their social media and then they will not directly claim to be a financial adviser but imply an association with a legitimate financial advice firm. This happens surprisingly frequently.

Our members find it incredibly difficult to get those accounts taken down, because it is not a fraudulent account in the usual sense: the individual is not pretending to be someone else. They are not directly claiming to be an employee; they could just say they are a fan of the company. And the firm is not a direct victim of this individual. When they report, it goes into a volume algorithm, and only if a very large number of complaints are made does that particular account get taken down. I think that could be expanded to include complaints from those affected by the account, rather than only from those the account is believed to be impersonating.

The Chair

Mr Lewis, you were nodding.

Martin Lewis: I was nodding—I was smiling and thinking, “If it makes you feel any better, Tim, I have pictures of me that tell people to invest money that are clearly fake, because I don’t do any adverts, and it still is an absolute pain in the backside for me to get them taken down, having sued Facebook.” So, if your members want to feel any sense of comradeship, they are not alone in this; it is very difficult.

I think the interesting thing is about that volumetric algorithm. Of course, we go back to the fact that these big companies like to err on the side of making money and err away from the side of protecting consumers, because those two, when it comes to scams, are diametrically opposed. The sooner we tidy it up, the better. You could have a process where once there has been a certain number of reports—I absolutely get Tim’s point that in certain cases there is not a big enough volume—the advert is taken down and then the company has to proactively decide to put it back up and effectively say, “We believe this is a valid advert.” Then the system would certainly work better, especially if you bring down the required number of reports. At the moment, I think, there tends to be an erring on the side of, “Keep it up as long as it’s making us money, unless it absolutely goes over the top.”

Many tech experts have shown me adverts with my face in on various social media platforms. They say it would take them less than five minutes to write a program to screen them out, but those adverts continue to appear. We just have to be conscious here that—there is often a move towards self-regulation. Let me be plain, as I am giving evidence. I do not trust any of these companies to have the user and the consumer interest at heart when it comes to their advertising; what they have at heart is their own profits, so if we want to stop them, we have to make this Bill robust enough to stop them, because that is the only way it will stop. Do not rely on them trying to do good, because they are trying to make profit and they will err on the side of that over the side of protecting individuals from scam adverts.

--- Later in debate ---
Kim Leadbeater

Q Thank you. You alluded earlier to the fact that the Bill contains duties to protect content of democratic importance and journalistic content. What is your view on those measures and their likely effectiveness?

Frances Haugen: I want to reiterate that AI struggles to do even really basic tasks. For example, Facebook’s own document said that it only took down 0.8% of violence-inciting content. Let us look at a much broader category, such as content of democratic importance—if you include that in the Bill, I guarantee you that the platforms will come back to you and say that they have no idea how to implement the Bill. There is no chance that AI will do a good job of identifying content of democratic importance at any point in the next 30 years.

The second question is about carve-outs for media. At a minimum, we need to greatly tighten the standards for what counts as a publication. Right now, I could get together with a friend and start a blog and, as citizen journalists, get the exact same protections as an established, thoughtful, well-staffed publication with an editorial board and other forms of accountability. Time and again, we have seen countries such as Russia use small media outlets as part of their misinformation and disinformation strategies. At a minimum, we need to really tighten that standard.

We have even seen situations where they will use very established publications, such as CNN. They will take an article that says, “Ukrainians destroyed a bunch of Russian tanks,” and intentionally have their bot networks spread that out. They will just paste the link and say, “Russia destroyed a bunch of tanks.” People briefly glance at the snippet, they see the picture of the tank, they see “CNN”, and they think, “Ah, Russia is winning.” We need to remember that even real media outlets can be abused by our enemies to manipulate the public.

Caroline Ansell

Q Good afternoon, Frances. I want to ask you about anonymity and striking a balance. We have heard variously that anonymity affords some users safe engagement and actually reduces harm, while for others anonymity has been seen to fuel abuse. How do you see the balance, and how do you see the Bill striving to achieve that?

Frances Haugen: It is important for people to understand what anonymity really is and what it would really mean to have confirmed identities. Platforms already have a huge amount of data on their users. We bleed information about ourselves on to these platforms. It is not about whether the platforms could identify people to the authorities; it is that they choose not to do that.

Secondly, if we did, say, mandate IDs, platforms would have two choices. The first would be to require IDs, so that every single user on their platform would have to have an ID that is verifiable via a computer database—you would have to show your ID and the platform would confirm it against the database. Platforms would suddenly lose users in many countries around the world that do not have well-integrated computerised databases. The platforms will come back to you and say that they cannot lose a third or half of their users. As long as they are allowed to have users from countries that do not have those levels of sophisticated systems, users in the UK will just use VPNs—a kind of software that allows you to kind of teleport to a different place in the world—and pretend to be users from those other places. Things such as ID verification are not very effective.

Lastly, we need to remember that there is a lot of nuance in things like encryption and anonymity. As a whistleblower, I believe there is a vital need for having access to private communications, but I believe we need to view these things in context. There is a huge difference between, say, Signal, which is open source and anyone in the world can read the code for it—the US Department of Defence only endorses Signal for its employees, because it knows exactly what is being used—and something like Messenger. Messenger is very different, because we have no idea how it actually works. Facebook says, “We use this protocol,” but we cannot see the code; we have no idea. It is the same for Telegram; it is a private company with dubious connections.

If people think that they are safe and anonymous, but they are not actually anonymous, they can put themselves at a lot of risk. The secondary thing is that when we have anonymity in context with more sensitive data—for example, Instagram and Facebook act like directories for finding children—that is a very different context for having anonymity and privacy from something like Signal, where you have to know someone’s phone number in order to contact them.

These things are not cut-and-dried, black-or-white issues. I think it is difficult to have mandatory identity. I think it is really important to have privacy. We have to view them in context.

Caroline Ansell

Thank you. That is very helpful.

Chris Philp

Q Thank you for joining us and giving evidence, Frances; it is nice to see you again. We had evidence from Meta, your former employer, on Tuesday, in which its representative suggested that it engages in open and constructive co-operation with researchers. Do you think that testimony was true?

Frances Haugen: I think that shows a commendable level of chutzpah. Researchers have been trying to get really basic datasets out of Facebook for years. When I talk about a basic dataset, it is things as simple as, “Just show us the top 10,000 links that are distributed in any given week.” When you ask for information like that in a country like the United States, no one’s privacy is violated: every one of those links will have been viewed by hundreds of thousands, if not millions of people. Facebook will not give out even basic data like that, even though hundreds if not thousands of academics have begged for this data.

The idea that they have worked in close co-operation with researchers is a farce. The only way that they are going to give us even the most basic data that we need to keep ourselves safe is if it is mandated in the Bill. We need to not wait two years after the Bill passes—and remember, it does not even say that it will happen; Ofcom might say, “Oh, maybe not.” We need to take a page from the Digital Services Act and say, “On the day that the Bill passes, we get access to data,” or, at worst, “Within three months, we are going to figure out how to do it.” It needs to be not, “Should we do it?” but “How will we do it?”

Oral Answers to Questions

Caroline Ansell Excerpts
Thursday 10th February 2022

Commons Chamber
The Solicitor General

It is extremely important that we in this House do not inadvertently misrepresent a judgment that has been made in the High Court. In the case that the hon. Gentleman refers to, the Court indicated that the arrangements did not confer any advantage at the decision-making stage of the process; that the company’s offers were very likely to have meant it being awarded contracts even without the arrangements; and that there was sufficient financial due diligence in respect of both sets of contracts. Without seeking to go behind the decision of the Court in that case, it is important that it is placed in its proper context. This Government will abide by the rule of law.

Caroline Ansell (Eastbourne) (Con)

9. What recent assessment she has made of the effectiveness of the Serious Fraud Office in recovering the proceeds of crime.

The Solicitor General (Alex Chalk)

The SFO has had a very positive year in delivering on its commitment to recover the proceeds of crime. [Interruption.] If the hon. Member for Huddersfield (Mr Sheerman) will listen, so far in 2021-22 the SFO has obtained more than £44.5 million in new financial orders from the courts, and at the same time it has successfully recovered more than £45 million by enforcing these and existing orders. Those are the largest recorded sums obtained and recovered in a single year by the SFO.

Caroline Ansell

I thank my hon. and learned Friend for his response and hope that there is some hope therein for my constituents who, just two years ago almost to the day, wrote to me about their personal case of how the London Capital & Finance scandal had impacted them. In October of 2021, the only update offered by the SFO was that investigations were ongoing. What assessment can he make of that progress, and what hope can I offer my constituents?

The Solicitor General

I am grateful to my hon. Friend for very properly pressing this case on behalf of her constituents. The SFO continues to investigate the dealings of London Capital & Finance plc and associated companies. The size and complexity of those cases, including the sheer number of victims and witnesses, means that it can take a significant period for a full investigation to be carried out. I meet the SFO director regularly to discuss casework, and I can assure my hon. Friend that driving forward the fastest possible case progression is a priority for me and for the Attorney General. I want to end with this point: over the last five years, thanks to the work of the SFO, a full £1.3 billion has been returned to taxpayers over and above the costs of running the SFO.

Oral Answers to Questions

Caroline Ansell Excerpts
Thursday 16th September 2021

Commons Chamber
Ms Dorries

I am afraid that I am going to have to write to the hon. Lady, being new to the job as I am. I will do that immediately.

Caroline Ansell (Eastbourne) (Con)

T3. May I also offer my congratulations to the new Secretary of State? In Eastbourne, the ambition for a mixed augmented reality studio is beginning to take shape. What would this mean? Skilled jobs, an injection into the hospitality sector and keeping us at the forefront of film-making. What support can the sector and MediaBite, the project lead, anticipate from the Government, and will the Minister join me in wishing them well in their endeavour and in their bid to Innovate UK?

Matt Warman

I absolutely join my hon. Friend in endorsing that bid. It is a key ambition of this Government to ensure that augmented reality and all those future technologies are made a reality not just in London and the big cities but across the whole country, so Eastbourne is a real opportunity. I would be happy, for instance, to facilitate a meeting with the BFI or something of that nature in order for her to help to pursue this endeavour.

Oral Answers to Questions

Caroline Ansell Excerpts
Thursday 1st July 2021

Commons Chamber
The Secretary of State was asked—
Caroline Ansell (Eastbourne) (Con)

What steps his Department is taking to support the recovery of the tourism industry from the covid-19 pandemic.

Henry Smith (Crawley) (Con)

What steps his Department is taking to support the recovery of the tourism industry from the covid-19 pandemic.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Nigel Huddleston)

We recognise the impact of covid-19 on the tourism industry, which is why we published the tourism recovery plan to help the sector to return to pre-pandemic levels as quickly as possible and build back better for the future. The Government have already provided over £25 billion of support to the tourism, leisure and hospitality sectors in the form of grants, loans and tax breaks. As our plan sets out, we will continue to support the sector as it recovers.

Caroline Ansell

I thank my hon. Friend for his answer and for his visit last week to my beautiful constituency of Eastbourne, where he will have seen no shortage of ambition or potential—only a shortage of new recruits to the hospitality workforce. What plans do he and the Department have to promote careers in hospitality and tourism, which is a vital sector in the UK and in Eastbourne? Would maintaining the 5% VAT rate help employers to offer ever more competitive wages?

Nigel Huddleston

It was a joy to join my hon. Friend in her incredibly sunny and warm constituency last week and see at first hand the hard work she has been doing on behalf of her constituents, and particularly those in the tourism sector. I know she shares my view that developing skills and careers within tourism and hospitality is vital for the sector’s recovery. As stated in the tourism recovery plan, we will work closely with the sector to ensure that businesses can employ more UK nationals in year-round better paid, high-quality tourism jobs. Regarding extending the temporary VAT cut, as we discussed last week, including with her constituents, the Government keep all taxes under review. I have noted her suggestion and I am sure that Treasury Ministers have, too.

--- Later in debate ---
Simon Baynes (Clwyd South) (Con)

What steps his Department is taking to support the recovery of the criminal justice system as covid-19 restrictions are eased.

Caroline Ansell (Eastbourne) (Con)

What steps his Department is taking to support the recovery of the criminal justice system as covid-19 restrictions are eased.

The Attorney General (Michael Ellis)

I frequently meet criminal justice partners to discuss this important issue. The covid-19 outbreak has been felt keenly by the criminal justice system. Recovery is a priority for this Government. I have been proud of the resilience that criminal justice agencies have shown. There is still more to do, but both the CPS and the Serious Fraud Office have been commended for their efforts at this difficult time. I thank them for continuing to support the delivery of justice.

--- Later in debate ---
The Attorney General

I thank my hon. Friend for his generous question. I am proud that all criminal justice agencies have worked closely together since the covid-19 outbreak to ensure that essential justice services continue to be delivered. The CPS and the court service in north Wales have worked closely together throughout the pandemic to ensure that courts can be run safely and to maximise the flow of cases, while preserving public health. For example, domestic abuse cases in particular have been prioritised in the magistrates courts, so there are no delays or backlogs for those sensitive cases, where victims deserve our protection and support—and that goes for Clwyd South, and it goes everywhere.

Caroline Ansell

I thank my right hon. and learned Friend for his answer. In Sussex, we have a backlog of over 800 Crown court cases—one case is now approaching four years without coming to court—and a rising drop-out rate. The Nightingale court in Chichester is making a real difference, but we still need greater capacity and pace. Can he assure me that every avenue is being pursued to address this backlog, so that we can ensure justice for victims in Eastbourne and in Sussex?

The Attorney General

Yes, indeed. CPS South East in her region is working with all criminal justice partners to support the recovery activity within Sussex, including to ensure court capacity can be maximised and file quality improved—of course, the better the file quality, the speedier proceedings can follow. The latest levels of cases that I have seen flowing through the courts indicate that in recent weeks at least, outstanding case load in the Crown court has begun to reduce. However, there is still more to be done, and I should say at this point that there is no limit on the number of days that Crown courts can sit for the next fiscal year. That will enable Crown court judges to hold as many hearings as they safely can and as is physically possible, as we continue to recover from the pandemic.

Online Harms

Caroline Ansell Excerpts
Thursday 19th November 2020

Commons Chamber
Caroline Ansell (Eastbourne) (Con)

Through this pandemic, we have seen what a saving grace the online world has proved to be for us. It is a window: it has connected us to family and friends and provided important information and services. In fact, I have worked hard to bring together different providers and trainers to close the digital divide that so disadvantages those who are not online. At the same time as being a saving grace, however, it is also a serious threat to our health and wellbeing, our security and our democracy. I hope that, through this experience, we have now come to a place where we recognise that there is no longer a distinction between the offline and the online worlds.

That question was very much put at the trial of the man who threatened to kill me in 2017. I can assure hon. Members and all watching that it was real and it hurt. The same pain, the same suffering and the same frustration were felt by one of my constituents in 2016, when the same question was posed: is there a difference between our online and offline experiences? She was a victim of revenge porn, a really dark and sinister crime. Her frustration and her powerlessness at being unable to take down images that directed people from across the country to find her and rape her—and the fact that the law did not reach her—were something extraordinary to me. I therefore hope that that distinction is very much gone. We need a levelling up between our online and offline worlds.

I want to focus on children. I applaud the work done to date and I welcome the online harms Bill to come, but my point in this debate is unfinished business. We made a commitment to introduce statutory age verification on porn websites. We supported that in 2016 and again in 2017, and it is still supported now. The most recent survey suggested that 83% of parents see it as mission critical to protecting their children. We know that early exposure to porn is harmful. I understand that there are technical issues, but surely they can be overcome. Other countries—France, most recently—have shown the way where we were previously world leading.

More must be expected of our social media giants to maintain safe online environments, but I urge the Minister: we have the legislation, let us use it.