Committee stage - 4th sitting
Thursday 26th May 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 26 May 2022
Mrs Miller

Q Working closely with Ofcom is really good, but do you think there needs to be a duty to co-operate with Ofcom, or indeed with other regulators—to be specified in the Bill—in case relations become more tense in future?

Stephen Almond: The Bill has, in my view, been designed to work closely alongside data protection law. It supports effective co-operation between us and Ofcom by requiring and setting out a series of duties for Ofcom to consult with the ICO on the development of any codes of practice or formal guidance with an impact on privacy. With that framework in mind, I do not think there is a case to instil further co-operation duties in that way. I hope I can give you confidence that we and Ofcom will be working tirelessly together to promote the safety and privacy of citizens online. It is firmly in our interests and in the interest of society as a whole to do so.

Kim Leadbeater (Batley and Spen) (Lab)

Q Thank you for joining us, Mr Almond. You stated the aim of making the UK the

“safest place in the world to be online”.

In your view, what needs to be added or taken away from the Bill to achieve that?

Stephen Almond: I am not best placed to comment on the questions of online safety and online harms. You will speak to a variety of different experts who can comment on that point. From my perspective as a digital regulator, one of the most important things will be ensuring that the Bill is responsive to future challenges. The digital world is rapidly evolving, and we cannot necessarily envisage all the developments in technology that will come, or the emergence of new harms. The data protection regime is a principles-based piece of legislation. That gives us a great degree of flexibility and discretion to adapt to novel forms of technology and to provide appropriate guidance as challenges emerge. I really recommend retaining that risk-based, principles-based approach to regulation that is envisaged currently in the Online Safety Bill.

Kim Leadbeater

Q There has been much talk about trying to future-proof the Bill. Is there anything you could recommend that should be in the Bill to try to help with that?

Stephen Almond: Again, I would say that the most important thing I can recommend around this is to retain that flexibility within the Bill. I know that a temptation will emerge to offer prescription, whether for the purpose of giving companies clarity today or for addressing present harms, but it is going to be really important to make sure that there is due flexibility to enable the legislation to be responsive to future harms.

Kim Leadbeater

Q Under clause 40, the Secretary of State can modify codes of practice to reflect public policy. How do you respond to criticism that this provision risks undermining the independence of the regulator?

Stephen Almond: Ultimately, it is for Ofcom to raise any concerns about the impact of the regime, as set out by its ability to apply its duties appropriately, independently and with due accountability to Parliament and the public. As a regulator, I would say that it is important to have a proper and proportionate degree of independence, so that businesses and the public can have trust in how regulation is carried out. Ultimately though, it is for Government and Parliament to determine what the right level of independence is.

Kim Leadbeater

Q You have no concerns about that?

Stephen Almond: No.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Q Mr Almond, welcome to the Committee. Thank you for joining us this afternoon. Can I start with co-operation? You mentioned a moment ago in answer to Maria Miller that co-operation between regulators, particularly in this context the ICO and Ofcom, was going to be very important. Would you describe the co-operative work that is happening already and that you will be undertaking in the future, and comment on the role that the Digital Regulation Cooperation Forum has in facilitating that?

Stephen Almond: Thank you very much. I will start by explaining the Digital Regulation Cooperation Forum. It is a voluntary, not statutory, forum that brings together ourselves, Ofcom, the Competition and Markets Authority and the Financial Conduct Authority—some of the regulators with the greatest interest in digital regulation—to make sure that we have a coherent approach to the regulation of digital services in the interests of the public and indeed the economy.

We are brought together through our common interest. We do not require a series of duties or statutory frameworks to make us co-operate, because the case for co-operation is very, very clear. We will deliver better outcomes by working together and by joining up where our powers align. I think that is what you are seeing in practice in some of the work we have done jointly—for example, around the implementation of the children’s code alongside Ofcom’s implementation of the video-sharing platform regime. A joined-up approach to questions about, for example, how you assure the age of children online is really important. That gives me real confidence in reassuring the Committee that the ICO, Ofcom and other digital regulators will be able to take a very joined-up approach to regulating in the context of the new online safety regime.

--- Later in debate ---
Kirsty Blackman

Q In terms of online gaming, and predators moving children from more mainstream to less regulated platforms, do you think there are improvements in the Bill that relate to that, or do you think more can be done?

Lynn Perry: Grooming does happen within gaming, and we know that online video games offer some user-to-user interaction. Users sometimes have the ability to create content within platforms, which is in scope for the Bill. The important thing will be enforcement and compliance in relation to those provisions. We work with lots of children and young people who have been sexually exploited and abused, and who have had contact through gaming sites. It is crucial that this area is in focus from the perspective of building in, by design, safety measures that stop perpetrators being able to communicate directly with children.

Private messaging is another area for focus. We also consider it important for Ofcom to have regulatory powers to compel firms to use technology that could identify child abuse and grooming.

Kim Leadbeater

Q If I could address one question to each witness, that would be fantastic. I do a lot of work with women in sport, including football. Obviously, we have the Women’s Euros coming up, and I have my Panini sticker album at the ready. Do you think the Bill could do more to address the pervasive issue of online threats of violence and abuse against women and girls, including those directed at women in sport, be they players, officials or journalists?

Sanjay Bhandari: I can see that there is something specific in the communications offences and that first limb around threatening communications, which will cover a lot of the things we see directed at female football pundits, like rape threats. It looks as though it would come under that. With our colleagues in other civil society organisations, particularly Carnegie UK Trust, we are looking at whether more should be done specifically about tackling misogyny and violence against women and girls. It is something that we are looking at, and we will also work with our colleagues in other organisations.

The Chair

Q Ms Perry, do you want to add anything to that?

Lynn Perry: When we were looking at children and young people’s access to harmful pornographic content, one thing we were particularly concerned about related to seeing extreme harmful and violent content, often perpetrated towards women. In respect of younger children, violence against women and girls and gender-based violence considerations, it is something that we are concerned about in that context.

--- Later in debate ---
Kim Leadbeater

Q Do you have any thoughts on the Bill committing to a statutory user advocacy body representing the interests of children? If you do, how do you think that that could be funded?

Lynn Perry: I am sorry—that was a question about advocacy, I think.

Kim Leadbeater

Yes, the idea of having a statutory user advocacy body that would represent the interests of children. This is something that has been talked about. Is that something you have any thoughts about?

Lynn Perry: We certainly have a lot of representation from children and young people directly. Last year, we worked with more than 380,000 children and young people. We think that advocacy and representation on behalf of children and young people can be used to powerful effect. Making sure that the voices of children and young people, their views, wishes and experiences, are heard and influence legislation that could safeguard and protect them effectively is something that we are supportive of.

Kim Leadbeater

Q Should the Bill commit to that?

Lynn Perry: As a recommendation, we think that could only strengthen the protections of children.

Chris Philp

Q Picking up that last point about representation for particular groups of users including children, Ms Perry, do you agree that the ability to designate organisations that can make super-complaints might be an extremely valuable avenue, in particular for organisations that represent user groups such as children? Organisations such as yours could get designated and then speak on behalf of children in a formal context. You could raise super-complaints with the regulator on behalf of the children you speak for. Is that something to welcome? Would it address the point made by my colleague, Kim Leadbeater, a moment ago?

Lynn Perry: We would welcome provision to be able to bring particularly significant evidence of concern. That is certainly something that organisations, large charities in the sector and those responsible for representing the rights of children and young people would welcome. On some of these issues, we work in coalition to make representations on behalf of children and young people, as well as of parents and carers, who also raise some concerns. The ability to do that and to strengthen the response is something that would be welcomed.

--- Later in debate ---
Caroline Ansell

Q Eva, there is just one reference to anonymity in the Bill currently. Do you think there is an opportunity to express a fuller, more settled opinion and potentially expand on that juxtaposition?

Eva Hartshorn-Sanders: I heard the advice that the representative of the Information Commissioner’s Office gave earlier—he feels that the balance is right at the moment. It is important to incorporate freedom of speech and privacy within this framework in a democratic country. I do not think we need to add anything more than that.

Kim Leadbeater

Q Thank you to the witnesses for joining us this afternoon. May I ask for your views on the clauses on journalistic content exemption and democratic content exemption? Do you think that these measures are likely to be effective?

Poppy Wood: I know you have spoken a lot about this over the past few days, but the content of democratic importance clause is a layer of the Bill that makes the Bill very complicated and hard to implement. My concern about these layers of free speech—whether it is the journalistic exemption, the news media exemption or the content of democratic importance clause—is that, as you heard from the tech companies, they just do not really know what to do with it. What we need is a Bill that can be implemented, so I would definitely err on the side of paring back the Bill so that it is easy to understand and clear. We should revisit anything that causes confusion or is obscure.

The clause on content of democratic importance is highly problematic—not just because it makes the Bill hard to implement and we are asking the platforms to decide what democratic speech is, but because I think it will become a gateway for the sorts of co-ordinated disinformation that we spoke about earlier. Covid disinformation for the past two years would easily have been a matter of public policy, and I think the platforms, because of this clause, would have said, “Well, if someone’s telling you to drink hydroxychloroquine as a cure for covid, we can’t touch that now, because it’s content of democratic importance.”

I have another example. In 2018, Facebook said that it had identified and taken down a Facebook page called “Free Scotland 2014”. In 2018—four years later—Facebook identified it. It was a Russian/Iranian-backed page that was promoting falsehoods in support of Scottish independence using fake news websites, with articles about the Queen and Prince Philip wanting to give themselves a pay rise by stealing from the poor. It was total nonsense, but that is easily content of democratic importance. Even though it was backed by fake actors—as we have said, I do not think there is anything in the Bill to preclude that at the moment, or at least to get the companies to focus on it—in 2014, that content would have been content of democratic importance, and the platforms took four years to take it down.

I think this clause would mean that that stuff became legitimate. It would be a major loophole for hate and disinformation. The best thing to do is to take that clause out completely. Clause 15(3) talks about content of democratic importance applying to speech across a diverse range of political opinion. Take that line in that subsection and put it in the freedom of expression clause—clause 19. What you then have is a really beefed-up freedom of expression clause that talks about political diversity, but you do not have layers on top of it that mean bad actors can promote hate and disinformation. I would say that is a solution, and that will make the Bill much easier to implement.

Kim Leadbeater

Q Thank you, Poppy. Eva?

Eva Hartshorn-Sanders: I think the principle behind the duty is correct and that they should consider the democratic importance of content when they are making moderation decisions, but what we know from our work is that misinformation and disinformation on social media poses a real threat to elections and democracies around the world. As an international organisation, we have studied the real harms caused by online election disinformation in countries like the US. We saw websites like The Gateway Pundit profit from Google ads to the tune of over $1 million while spreading election disinformation. That has led to real-world death threats sent to election officials and contributed to the events of 6 January. It is not something we want to see replicated in the UK.

The problem with the democratic importance duty is that it is framed negatively about preventing platforms from removing content, rather than positively about addressing content that undermines elections. That is concerning because it is the latter that has proved to be damaging in the real world. I think where we are getting to is that there should be a positive duty on platforms to act on content that is designed and intended to undermine our democracy and our elections.

To add to that, the Joint Committee on the draft Bill looked specifically at having misinformation and disinformation on elections and public health on the face of the Bill rather than leaving it to secondary legislation. That is a position that we would support. The type of harm we have seen over the last couple of years through covid is a known harm and it is one that we should be addressing. It has led to the deaths of millions of people around the world.

Kim Leadbeater

Q That is really helpful; thank you. You raised the point about the abuse that was directed at election officials in America. Do you think it should almost be a stand-alone offence to send harmful or threatening communications to elected people—MPs, councillors, mayors or police and crime commissioners—or possibly even election officials, the people who are involved in the democratic process, because of the risk that that abuse and threats could have on democracy?

Eva Hartshorn-Sanders: Obviously abuse is unacceptable, and there have been real issues with that globally and, I know, in the UK from the work we have done with MPs here, including through the misogyny research. I guess this is the balance—if people have concerns about legitimate political decisions that are being made—but that is why you have an independent regulator who can assess that content.

Kim Leadbeater

Q Poppy, do you have any thoughts on that?

Poppy Wood: We are seeing people who put themselves forward in public life receiving all sorts of horrible abuse, which was cited as a big reason for women and people of colour removing themselves from public life in recent elections. My understanding is that the threatening communications offences brought in under the illegal duties will probably cover quite a lot of that. The idea that Eva just gave of an election risk assessment or something might, coupled with the threatening communications offences, mean that you are accounting for how your platform promotes that sort of hate.

One of the things that you would want to try to avoid is making better protections for politicians than for everyone else, but I think that threatening communications already covers some of that stuff. Coupled with an elections risk assessment, that would hopefully mean that there are mitigating effects on the risks identified in those risk assessments to tackle the sorts of things that you were just talking about.

Eva Hartshorn-Sanders: Just to add to that, from our work on “Don’t Feed the Trolls”, we know that a lot of these hate campaigns are quite co-ordinated. There is a whole lot of supporting evidence behind that. They will often target people who raise themselves up in whatever position, whether elected or a different type. The misogyny report we have just done had a mix of women who were celebrities or just had a profile and a large Instagram following and who were, again, subject to that abuse.

Kim Leadbeater

Q Should there be more in the Bill with a specific reference to violence against women and girls, abuse and threats, and misogyny?

Eva Hartshorn-Sanders: There are definitely parts of the Bill that could be strengthened in that area. Part of that relates to incels and how they are treated, or not, as a terrorist organisation; or how small sites might be treated under the Bill. I can elaborate on that if you like.

The Chair

Thank you. Minister.

--- Later in debate ---
The Chair

Kim Leadbeater?

Kim Leadbeater

Q Thank you, Chair, and thank you to the witnesses. I just want to clarify something. We were talking about the journalistic content definition as it is. You are saying that you do not think it is reasonable to expect service providers to identify journalistic content using the definition contained in the Bill. Do you think the Bill should be clearer about what it means by journalistic content and journalism?

Matt Rogerson: My point is that for news publishers there is a lack of definition in the journalistic content exemption, and that platforms without the exemption would have to identify whether every piece of content on their platform was journalism, so it would be very difficult for the platforms to implement. That is why for trusted news brands such as the BBC, The Times, and The Guardian, the news media exemption is really important.

What we do not know, and what Gavin Millar suggested in his paper to Index on Censorship, is how that journalistic content exemption will be interpreted by the platforms. His fear in the paper is that the current definition means that the content has to be UK-linked. It could mean, for example, that a blog or a journalist that talks about issues in the Gulf or Ukraine would not be seen as journalistic content and therefore would not be able to take advantage of the systems that the platforms put in place. I think his view is that it should be in line with the article 10 definition of journalistic content, which would seem to make sense.

Owen Meredith: If I could add to that, speaking from my members’ perspective, they would all fall under the recognised news publisher definition. I think that is why it is an important definition. It is not an easy thing to get right, and I think the Department has done a good job in drafting the Bill. I think it captures everyone we would expect it to capture. I think actually it does set a relatively high bar for anyone else who is seeking to use that. I do not think it is possible for someone to simply claim that they are a recognised news publisher if they are operating in a way that we would not expect of such a person or entity. I think it is very important that that definition is clear. I think it is clear and workable.

Kim Leadbeater

Q I suppose there are two separate clauses there. There is the news publisher clause and the journalistic content clause. Just so I am clear, you are happy with the news publisher clause?

Owen Meredith: Yes.

Matt Rogerson: Yes.

Kim Leadbeater

Q What about the journalistic content clause? This is an expression that was new to me—this idea of a citizen journalist. I do not even know what that means. Are we confident that this clause, which talks about journalistic content, is the worrying one?

Owen Meredith: Matt spoke to this a little bit, but from my perspective, my focus has been on making sure that the recognised news publisher clause is right, because everything that my members publish is journalistic content. Therefore, the bulk of journalistic content that is out there will be covered by that. I think where there are elements of what else could be considered journalistic content, the journalistic content clause will pick those up.

Kim Leadbeater

Q As journalists, does that worry you?

Matt Rogerson: I wish I was a journalist.

Kim Leadbeater

Sorry, as representatives of journalists.

Matt Rogerson: It worries me in the sense that we want a plural media ecosystem in this country, and we want individuals who are journalists to have their content published on platforms, so that it can be read by the 50% of the UK population that get their news from Facebook. I think it is potentially problematic that they won’t be able to publish on that platform if they talk about issues that are in the “legal but harmful” bucket of harms, as defined after the Bill is passed. I think there is concern for those groups.

There are suggestions for how you could change the clause to enable them to have more protection. As I say, Gavin Millar has outlined that in his paper. Even then, once you have got that in place, if you have a series of legal but harmful harms that are relatively unclear, the challenge for the platforms will be interpreting that and interpreting it against the journalistic content clause.

Kim Leadbeater

Q My only concern is that someone who just decides to call themselves a journalist will be able to say what they want.

Owen Meredith: I do not think that would be allowable under the Bill, because of the distinction between a recognised news publisher publishing what we would all recognise as journalistic content, versus the journalistic content exemption. I think that is why they are treated differently.

Chris Philp

Q Can I start by clarifying a comment that Owen Meredith made at the very beginning? You were commenting on where you would like the Bill to go further in protecting media organisations, and you said that you wanted there to be a wholesale exemption for recognised news publishers. I think there already is a wholesale exemption for recognised news publishers. The area where the Government have said they are looking at going further is in relation to what some people call a temporary “must carry” provision, or a mandatory right of appeal for recognised news publishers. Can I just clarify that that is what you meant?

Owen Meredith: Yes. I think the issue is how that exemption will work in practice. I think that what the Government have said they are looking at and will bring forward does address how it will operate in practice.

--- Later in debate ---
The Chair

Mr Lewis, you were nodding.

Martin Lewis: I was nodding—I was smiling and thinking, “If it makes you feel any better, Tim, I have pictures of me that tell people to invest money that are clearly fake, because I don’t do any adverts, and it still is an absolute pain in the backside for me to get them taken down, having sued Facebook.” So, if your members want to feel any sense of comradeship, they are not alone in this; it is very difficult.

I think the interesting thing is about that volumetric algorithm. Of course, we go back to the fact that these big companies like to err on the side of making money and err away from the side of protecting consumers, because those two, when it comes to scams, are diametrically opposed. The sooner we tidy it up, the better. You could have a process where once there has been a certain number of reports—I absolutely get Tim’s point that in certain cases there is not a big enough volume—the advert is taken down and then the company has to proactively decide to put it back up and effectively say, “We believe this is a valid advert.” Then the system would certainly work better, especially if you bring down the required number of reports. At the moment, I think, there tends to be an erring on the side of, “Keep it up as long as it’s making us money, unless it absolutely goes over the top.”

Many tech experts have shown me adverts with my face in on various social media platforms. They say it would take them less than five minutes to write a program to screen them out, but those adverts continue to appear. We just have to be conscious here that—there is often a move towards self-regulation. Let me be plain, as I am giving evidence. I do not trust any of these companies to have the user and the consumer interest at heart when it comes to their advertising; what they have at heart is their own profits, so if we want to stop them, we have to make this Bill robust enough to stop them, because that is the only way it will stop. Do not rely on them trying to do good, because they are trying to make profit and they will err on the side of that over the side of protecting individuals from scam adverts.

Kim Leadbeater

Q I thank the witnesses for coming. In terms of regulation, I was going to ask whether you believe that Ofcom is the most suitable regulator to operate in this area. You have almost alluded to the fact that you might not. On that basis, should we specify in the Bill a duty for Ofcom to co-operate with other regulators—for example, the Competition and Markets Authority, the Financial Conduct Authority, Action Fraud or whoever else?

Tim Fassam: I believe that would be helpful. I think Ofcom is the right organisation to manage the relationship with the platforms, because it is going to be much broader than the topics we are talking about in our session, but we do think the FCA, Action Fraud and potentially the CMA should be able to direct, and be very clear with Ofcom, that action needs to be taken. Ofcom should have the ability to ask for things to be reviewed to see whether they break the rules.

The other area where we think action probably needs to be taken is where firms are under investigation, because the Bill assumes it is clear cut whether something is fraud, a scam, a breach of the regulations or not. In some circumstances, that can take six months or a year to establish through investigation. We believe that if, for example, the FCA feels that something is high risk, it should be able to ask Ofcom to suspend an advert, or a firm from advertising, pending an investigation to assess whether it is a breach of the regulation.

Rocio Concha: I agree that Ofcom is the right regulator, the main regulator, but it needs to work with the other regulators—with the FCA, ASA and CMA—to enforce the Bill effectively. There is another area. Basically, we need to make sure that Ofcom and all the regulators involved have the right resources. When the initial version of the Bill was published, Ofcom got additional resources to enable it to enforce the Bill. But the Bill has increased in scope, because now it includes fraud and fraudulent advertising. We need to make sure that Ofcom has the right resources to enforce the full Bill effectively. That is something that the Government really need to consider.

Martin Lewis: I was going to make exactly that point, but it has just been made brilliantly so I will not waste your time.

Chris Philp

Q I thank the witnesses for joining us this afternoon, and particularly Martin Lewis for his campaigning in this area.

I will start by agreeing with the point that Martin Lewis made a minute or two ago—that we cannot trust these companies to work on their own. Mr Lewis, I am not sure whether you have had a chance to go through clause 34, which we inserted into the Bill following your evidence to the Joint Committee last year. It imposes a duty on these companies to take steps and implement systems to

“prevent individuals from encountering content consisting of fraudulent advertisements”.

There is a clear duty to stop them from doing this, rather as you were asking a minute ago when you described the presentation. Does that strong requirement in clause 34, to stop individuals from encountering fraudulent advertisement content, meet the objective that you were asking for last year?

Martin Lewis: Let me start by saying that I am very grateful that you have put it in there and thankful that the Government have listened to our campaign. What I am about to say is not intended as criticism.

It is very difficult to know how this will work in practice. The issue is all about thresholds. How many scam adverts can we stomach? I still have, daily—even from the platform that I sued, never mind the others—tens of reports directly to me of scam adverts with my face on. Even though there is a promise that we will try to mitigate that, the companies are not doing it. We have to have a legitimate understanding that we are not going to have zero scam adverts on these platforms; unless they were to pre-vet, which I do not think they will, the way they operate means that will not happen.

I am not a lawyer but my concern is that the Bill should make it clear, and that any interpretation of the Bill from Ofcom should be clear, about exactly what threshold of scam adverts is acceptable—we know that they are going to happen—and what threshold is not acceptable. I do not have the expertise to answer your question; I have to rely on your expertise to do that. But I ask the Committee to think properly about what the threshold level should be.

What is and is not acceptable? What counts as “doing everything they can”? They are going to get big lawyers involved if you say there must be zero scam adverts—that is not going to happen. How many scam adverts are acceptable and how many are not? I am so sorry to throw that back as a question when I am a witness, but I do not have the expertise to answer. But that is my concern: I am not 100% convinced of the threshold level that you are setting.

--- Later in debate ---
Mrs Miller

Specifically for children.

Frances Haugen: I will give you an example. Facebook has estimated ages for every single person on the platform, because the reality is that lots of adults also lie about their ages when they join, and advertisers want to target very specific demographics—for example, if you are selling a kit for a 40th birthday, you do not want to mis-target that by 10 years. Facebook has estimated ages for everyone on the platform. It could be required to publish every year, so that we could say, “Hey, there are four kids on the platform who you currently believe, using your estimated ages, are 14 years old—based not on how old they say they are, but on your estimate that this person is 14 years old. When did they join the platform? What fraction of your 14-year-olds have been on the platform since they were 10?” That is a vital statistic.

If the platforms were required to publish that every single quarter, we could say, “Wow! You were doing really badly four years ago, and you need to get a lot better.” Those kinds of lagging metrics are a way of allowing the public to grade Facebook’s homework, instead of just trusting Facebook to do a good job.

Facebook already does analyses like this today. They already know that on Facebook Blue, for example, for some age cohorts, 20% of 11-year-olds were on the platform—and back then, not that many kids were online. Today, I would guess a much larger fraction of 11-year-olds are on Instagram. We need to have transparency into how badly they are doing their jobs.

Kim Leadbeater

Q Frances, do you think that the Bill needs to set statutory minimum standards for things such as risk assessments and codes of practice? What will a company such as Facebook do without a minimum standard to go by?

Frances Haugen: It is vital to get minimum standards for things such as risk assessments and codes of conduct into the statute. Facebook has demonstrated time and again—the reality is that other social media platforms have too—that it does the bare minimum to avoid really egregious reputational damage. It does not ensure the level of quality needed for public safety. If you do not put that into the Bill, I worry that it will be watered down by the mountains of lobbyists that Facebook will throw at this problem.

Kim Leadbeater

Q Thank you. You alluded earlier to the fact that the Bill contains duties to protect content of democratic importance and journalistic content. What is your view on those measures and their likely effectiveness?

Frances Haugen: I want to reiterate that AI struggles to do even really basic tasks. For example, Facebook’s own document said that it only took down 0.8% of violence-inciting content. Let us look at a much broader category, such as content of democratic importance—if you include that in the Bill, I guarantee you that the platforms will come back to you and say that they have no idea how to implement the Bill. There is no chance that AI will do a good job of identifying content of democratic importance at any point in the next 30 years.

The second question is about carve-outs for media. At a minimum, we need to greatly tighten the standards for what counts as a publication. Right now, I could get together with a friend and start a blog and, as citizen journalists, get the exact same protections as an established, thoughtful, well-staffed publication with an editorial board and other forms of accountability. Time and again, we have seen countries such as Russia use small media outlets as part of their misinformation and disinformation strategies. At a minimum, we need to really tighten that standard.

We have even seen situations where they will use very established publications, such as CNN. They will take an article that says, “Ukrainians destroyed a bunch of Russian tanks,” and intentionally have their bot networks spread that out. They will just paste the link and say, “Russia destroyed a bunch of tanks.” People briefly glance at the snippet, they see the picture of the tank, they see “CNN”, and they think, “Ah, Russia is winning.” We need to remember that even real media outlets can be abused by our enemies to manipulate the public.

Caroline Ansell

Q Good afternoon, Frances. I want to ask you about anonymity and striking a balance. We have heard variously that anonymity affords some users safe engagement and actually reduces harm, while for others anonymity has been seen to fuel abuse. How do you see the balance, and how do you see the Bill striving to achieve that?

Frances Haugen: It is important for people to understand what anonymity really is and what it would really mean to have confirmed identities. Platforms already have a huge amount of data on their users. We bleed information about ourselves on to these platforms. It is not about whether the platforms could identify people to the authorities; it is that they choose not to do that.

Secondly, if we did, say, mandate IDs, platforms would have two choices. The first would be to require IDs, so that every single user on their platform would have to have an ID that is verifiable via a computer database—you would have to show your ID and the platform would confirm it off the computer. Platforms would suddenly lose users in many countries around the world that do not have well-integrated computerised databases. The platforms will come back to you and say that they cannot lose a third or half of their users. As long as they are allowed to have users from countries that do not have those levels of sophisticated systems, users in the UK will just use VPNs—a kind of software that allows you to kind of teleport to a different place in the world—and pretend to be users from those other places. Things such as ID verification are not very effective.

Lastly, we need to remember that there is a lot of nuance in things like encryption and anonymity. As a whistleblower, I believe there is a vital need for having access to private communications, but I believe we need to view these things in context. There is a huge difference between, say, Signal, which is open source and anyone in the world can read the code for it—the US Department of Defence only endorses Signal for its employees, because it knows exactly what is being used—and something like Messenger. Messenger is very different, because we have no idea how it actually works. Facebook says, “We use this protocol,” but we cannot see the code; we have no idea. It is the same for Telegram; it is a private company with dubious connections.

If people think that they are safe and anonymous, but they are not actually anonymous, they can put themselves at a lot of risk. The secondary thing is that when we have anonymity in context with more sensitive data—for example, Instagram and Facebook act like directories for finding children—that is a very different context for having anonymity and privacy from something like Signal, where you have to know someone’s phone number in order to contact them.

These things are not cut-and-dried, black-or-white issues. I think it is difficult to have mandatory identity. I think it is really important to have privacy. We have to view them in context.