That the Bill be now read a second time.
My Lords, I am very glad to be here to move the Second Reading of the Online Safety Bill. I know that this is a moment which has been long awaited in your Lordships’ House and noble Lords from across the House share the Government’s determination to make the online realm safer.
That is what this Bill seeks to do. As it stands, over three-quarters of adults in this country express concern about going online; similarly, the number of parents who feel the benefits outweigh the risks of their children being online has fallen in recent years, from two-thirds in 2015 to barely over half in 2019. This is a terrible indictment of a means through which people of all ages are living increasing proportions of their lives, and it must change.
All of us have heard the horrific stories of children who have been exposed to dangerous and deeply harmful content online, and the tragic consequences of such experiences both for them and their families. I am very grateful to the noble Baroness, Lady Kidron, who arranged for a number of noble Lords, including me, to see some of the material which was pushed relentlessly at Molly Russell whose family have campaigned bravely and tirelessly to ensure that what happened to their daughter cannot happen to other young people. It is with that in mind, at the very outset of our scrutiny of this Bill, that I would like to express my gratitude to all those families who continue to fight for change and a safer, healthier online realm. Their work has been central to the development of this Bill. I am confident that, through it, the Government’s manifesto commitment to make the UK the safest place in the world to be online will be delivered.
This legislation establishes a regulatory regime which has safety at its heart. It is intended to change the mindset of technology companies so that they are forced to consider safety and risk mitigation when they begin to design their products, rather than as an afterthought.
All companies in scope will be required to tackle criminal content and activity online. If it is illegal offline, it is illegal online. All in-scope platforms and search services will need to consider in risk assessments the likelihood of illegal content or activity taking place on their site and put in place proportionate systems and processes to mitigate those risks. Companies will also have to take proactive measures against priority offences. This means platforms will be required to take proportionate steps to prevent people from encountering such content.
Not only that, but platforms will also need to mitigate the risk of the platform being used to facilitate or commit such an offence. Priority offences include, inter alia: terrorist material, child sexual abuse and exploitation, so-called revenge pornography and material encouraging or assisting suicide. In practice, this means that all in-scope platforms will have to remove this material quickly and will not be allowed to promote it in their algorithms.
Furthermore, for non-priority illegal content, platforms must have effective systems in place for its swift removal once this content has been flagged to them. Gone will be the days of lengthy and arduous complaints processes and platforms feigning ignorance of such content. They can and will be held to account.
As I have previously mentioned, the safety of children is of paramount importance in this Bill. While all users will be protected from illegal material, some types of legal content and activity are not suitable for children and can have a deeply damaging impact on their mental health and their developing sense of the world around them.
All in-scope services which are likely to be accessed by children will therefore be required to assess the risks to children on their service and put in place safety measures to protect child users from harmful and age-inappropriate content. This includes content such as that promoting suicide, self-harm or eating disorders which does not meet a criminal threshold; pornography; and damaging behaviour such as bullying.
The Bill will require providers specifically to consider a number of risk factors as part of their risk assessments. These factors include how functionalities such as algorithms could affect children’s exposure to content harmful to children on their service, as well as children’s use of higher risk features on the service such as livestreaming or private messaging. Providers will need to take robust steps to mitigate and effectively manage any risks identified.
Companies will need to use measures such as age verification to prevent children from accessing content which poses the highest risk of harm to them, such as online pornography. Ofcom will be able to set out its expectations about the use of age assurance solutions, including age verification tools, through guidance. This guidance will also be able to refer to relevant standards. The Bill also now makes it clear that providers may need to use age assurance to identify the age of their users to meet the necessary child safety duties and effectively enforce age restrictions on their service.
The Government will set out in secondary legislation the priority categories of content harmful to children so that all companies are clear on what they need to protect children from. Our intention is to have the regime in place as soon as possible after Royal Assent, while ensuring the necessary preparations are completed effectively and service providers understand clearly what is expected. We are working closely with Ofcom and I will keep noble Lords apprised.
My ministerial colleagues in another place worked hard to strengthen these provisions and made commitments to introduce further provisions in your Lordships’ House. With regard to increased protections for children specifically, the Government will bring forward amendments at Committee stage to name the Children’s Commissioner for England as a statutory consultee for Ofcom when it is preparing a code of practice, ensuring that the experience of children and young people is accounted for during implementation.
We will also bring forward amendments to specify that category 1 companies—the largest and most risky platforms—will be required to publish a summary of their risk assessments for both illegal content and material that is harmful to children. This will increase transparency about illegal and harmful content on in-scope services and ensure that Ofcom can do its job regulating effectively.
We recognise the great suffering experienced by many families linked to children’s exposure to harmful content and the importance of this Bill in ending that. We must learn from the horrific events from the past to secure a safe future for children online.
We also understand that, unfortunately, people of any age may experience online abuse. For many adults, the internet is a positive source of entertainment and information and a way to connect with others; for some, however, it can be an arena for awful abuse. The Bill will therefore offer adult users a triple shield of protection when online, striking the right balance between protecting the right of adult users to access legal content freely, and empowering adults with the information and tools to manage their own online experience.
First, as I have outlined, all social media firms and search services will need to tackle illegal content and activity on their sites. Secondly, the Bill will require category 1 services to set clear terms of service regarding the user-generated content they prohibit and/or restrict access to, and to enforce those terms of service effectively. All the major social media platforms such as Meta, Twitter and TikTok say that they ban abuse and harassment online. They all say they ban the promotion of violence and violent threats, yet this content is still easily visible on those sites. People sign up to these platforms expecting one environment, and are presented with something completely different. This must stop.
As well as ensuring the platforms have proper systems to remove banned content, the Bill will also put an end to services arbitrarily removing legal content. The largest platforms, category 1 services, must ensure that they remove or restrict access to content or ban or suspend users only where that is expressly allowed in their terms of service, or where they otherwise have a legal obligation to do so.
This Bill will make sure that adults have the information they need to make informed decisions about the sites they visit, and that platforms are held to their promises to users. Ofcom will have the power to hold platforms to their terms of service, creating a safer and more transparent environment for all.
Thirdly, category 1 services will have a duty to provide adults with tools they can use to reduce the likelihood that they encounter certain categories of content, if they so choose, or to alert them to the nature of that content. This includes content which encourages, promotes, or provides instructions for suicide, self-harm or eating disorders. People will also have the ability to filter out content from unverified users if they so wish. This Bill will mean that adult users will be empowered to make more informed choices about what services they use, and to have greater control over whom and what they engage with online.
It is impossible to speak about the aspects of the Bill which protect adults without, of course, mentioning freedom of expression. The Bill needs to strike a careful balance between protecting users online, while maintaining adults’ ability to have robust—even uncomfortable or unpleasant—conversations within the law if they so choose. Freedom of expression within the law is fundamental to our democracy, and it would not be right for the Government to interfere with what legal speech is permitted on private platforms. Instead, we have developed an approach based on choice and transparency for adult users, bounded by major platforms’ clear commercial incentives to provide a positive experience for their users.
Of course, we cannot have robust debate without being accurately informed of the current global and national landscape. That is why the Bill includes particular protections for recognised news publishers, content of democratic importance, and journalistic content. We have been clear that sanctioned news outlets such as RT, formerly Russia Today, must not benefit from these protections. We will therefore bring forward an amendment in your Lordships’ House explicitly to exclude entities subject to sanctions from the definition of a recognised news publisher.
Alongside the safety duties for children and the empowerment tools for adults, platforms must also have effective reporting and redress mechanisms in place. They will need to provide accessible and effective mechanisms for users to report content which is illegal or harmful, or where it breaches terms and conditions. Users will need to be given access to effective mechanisms to complain if content is removed without good reason.
The Bill will place a duty on platforms to ensure that those reporting mechanisms are backed up by timely and appropriate redress mechanisms. Currently, internet users often do not bother to report harmful content they encounter online, because they do not feel that their reports will be followed up. That too must change. If content has been unfairly removed, it should be reinstated. If content should not have been on the site in question, it should be taken down. If a complaint is not upheld, the reasons should be made clear to the person who made the report.
There have been calls—including from the noble Lord, Lord Stevenson of Balmacara, with whom I look forward to working constructively, as we have done heretofore—to use the Bill to create an online safety ombudsman. We will listen to all suggestions put forward to improve the Bill and the regime it ushers in with an open mind, but as he knows from our discussions, of this suggestion we are presently unconvinced. Ombudsman services in other sectors are expensive, often underused and primarily relate to complaints which result in financial compensation. We find it difficult to envisage how an ombudsman service could function in this area, where user complaints are likely to be complex and, in many cases, do not have the impetus of financial compensation behind them. Instead, the Bill ensures that, where providers’ user-reporting and redress mechanisms are not sufficient, Ofcom will have the power to take enforcement action and require the provider to improve its user-redress provisions to meet the standard required of them. I look forward to probing elements of the Bill such as this in Committee.
This regulatory framework could not be effective if Ofcom, as the independent regulator, did not have a robust suite of powers to take enforcement actions against companies which do not comply with their new duties, and if it failed to take the appropriate steps to protect people from harm. I believe the chairman of Ofcom, the noble Lord, Lord Grade of Yarmouth, is in his place. I am glad that he has been and will be following our debates on this important matter.
Through the Bill, Ofcom will have wide-ranging information-gathering powers to request any information from companies which is relevant to its safety functions. Where necessary, it will be able to ask a suitably skilled person to undertake a report on a company’s activity—for example, on its use of algorithms. If Ofcom decides to take enforcement action, it can require companies to take specific steps to come back into compliance.
Ofcom will also have the power to impose substantial fines of up to £18 million, or 10% of annual qualifying worldwide revenue, whichever is higher. For the biggest technology companies, this could easily amount to billions of pounds. These are significant measures, and we have heard directly from companies that are already changing their safety procedures to ensure they comply with these regulations.
If fines are not sufficient, or not deemed appropriate because of the severity of the breach, Ofcom will be able to apply for a court order allowing it to undertake business disruption measures. This could mean blocking access to a website or preventing it from making money via payment or advertising services. Of course, Ofcom will be able to take enforcement action against any company that provides services to people in the UK, wherever that company is located. This is important, given the global nature of the internet.
As the Bill stands, individual senior managers can be held criminally liable and face a fine for failing to ensure their platform complies with Ofcom’s information notice. Further, individual senior managers can face jail, a fine or both for failing to prevent the platform committing the offences of providing false information, encrypting information or destroying information in response to an information notice.
The Government have also listened to and acknowledged the need for senior managers to be made personally liable for a wider range of failures of compliance. We have therefore committed to tabling an amendment in your Lordships’ House which will be carefully designed to capture instances where senior managers have consented to or connived in ignoring enforceable requirements, risking serious harm to children. We are carefully designing this amendment to ensure that it can hold senior managers to account for their actions regarding the safety of children, without jeopardising the UK’s attractiveness as a place for technology companies to invest in and grow. We intend to base our offence on similar legislation recently passed in the Republic of Ireland, as well as looking carefully at relevant precedent in other sectors in the United Kingdom.
I have discussed the safety of children, adults, and everyone’s right to free speech. It is not possible to talk about this Bill without also discussing its protections for women and girls, who we know are disproportionately affected by online abuse. As I mentioned, all services in scope will need to seek out and remove priority illegal content proactively. There are a number of offences which disproportionately affect women and girls, such as revenge pornography and cyberstalking, which the Bill requires companies to tackle as a priority.
To strengthen protections for women in particular, we will be listing controlling or coercive behaviour as a priority offence. Companies will have to take proactive measures to tackle this type of illegal content. We will also bring forward an amendment to name the Victims’ Commissioner and the domestic abuse commissioner as statutory consultees for the codes of practice. This means there will be a requirement for Ofcom to consult both commissioners ahead of drafting and amending the codes of practice, ensuring that victims, particularly victims and survivors of domestic abuse, are better protected. The Secretary of State and our colleagues have been clear that women’s and girls’ voices must be heard clearly in developing this legislation.
I also want to take this opportunity to acknowledge the concerns voiced over the powers for the Secretary of State regarding direction in relation to codes of practice that currently appear in the Bill. That is a matter on which my honourable friend Paul Scully and I were pressed by your Lordships’ Communications and Digital Committee when we appeared before it last week. As we explained then, we remain committed to ensuring that Ofcom maintains its regulatory independence, which is vital to the success of this framework. As we are introducing ground-breaking regulation, our aim is to balance the need for the regulator’s independence with appropriate oversight by Parliament and the elected Government.
We intend to bring forward two changes to the existing power: first, replacing the “public policy” wording with a defined list of reasons that a direction can be made; and secondly, making it clear that this element of the power can only be used in exceptional circumstances. I would like to reassure noble Lords—as I sought to reassure the Select Committee—that the framework ensures that Parliament will always have the final say on codes of practice, and that strong safeguards are in place to ensure that the use of this power is transparent and proportionate.
Before we begin our scrutiny in earnest, it is also necessary to recognise that this Bill is not just establishing a regulatory framework. It also updates the criminal law concerning communication offences. I want to thank the Law Commission for its important work in helping to strengthen criminal law for victims. The inclusion of the new offences for false and threatening communications offers further necessary protections for those who need it most. In addition, the Bill includes new offences to criminalise cyberflashing and epilepsy trolling. We firmly believe that these new offences will make a substantive difference to the victims of such behaviour. The Government have also committed to adding an additional offence to address the encouragement or assistance of self-harm communications and offences addressing intimate image abuse online, including deepfake pornography. Once these offences are introduced, all companies will need to treat this content as illegal under the framework and take action to prevent users from encountering it. These new offences will apply in respect of all victims of such activity, children as well as adults.
This Bill has been years in the making. I am proud to be standing here today as the debate begins in your Lordships’ House. I realise that noble Lords have been waiting long and patiently for this moment, but I know that they also appreciate that considerable work has already been done to ensure that this Bill is proportionate and fair, and that it provides the change that is needed.
A key part of that work was conducted by the Joint Committee, which conducted pre-legislative scrutiny of the Bill, drawing on expertise from across both Houses of Parliament, from all parties and none. I am very glad that all the Members of your Lordships’ House who served on that committee are speaking in today’s debate: the noble Baroness, Lady Kidron; the noble Lords, Lord Stevenson of Balmacara and Lord Knight of Weymouth, who have very helpfully been called to service on the Opposition Front Bench; the noble Lord, Lord Clement-Jones, who speaks for the Liberal Democrats; as well as my noble friends Lord Black of Brentwood and Lord Gilbert of Panteg.
While I look forward to the contributions of all Members of your Lordships’ House, and will continue the open-minded, collaborative approach established by my right honourable friend the Secretary of State and her predecessors—listening to all ideas which are advanced to make this Bill as effective as it can be—I urge noble Lords who are not yet so well-versed in its many clauses and provisions, or who might be disinclined to accept at first utterance the points I make from this Dispatch Box, to consult those noble Lords before bringing forward their amendments in later stages of the Bill. I say that not to discourage noble Lords from doing so, but in the spirit of ensuring that what they do bring forward, and our deliberations on them, will be pithy, focused, and conducive to making this Bill law as swiftly as possible. In that spirit, I shall draw my already too lengthy remarks to a close. I beg to move.
My Lords, I declare my interest, as set out in the register, as a member of the advisory council of the Free Speech Union.
This is an important Bill. It has taken time to get to us, and rightly so. Many important requirements have to be balanced in it—the removal of illegal material, and the protection of children, as we have heard so movingly already today. But, as legislators, we must also have an eye on all elements of public policy. We cannot eliminate every evil entirely, except at unacceptable cost to other objectives and, notably, to free speech.
The Bill, as it was developing last summer, was damaging in many ways to that objective. At times I was quite critical of it, so I welcome the efforts that have been made by the new broom and new team at DCMS to put it in a better place. It is not perfect, but is considerably better and less damaging to the free speech objective. In particular, I welcome the removal of the so-called legal but harmful provisions, their replacement with a duty to empower users and the decision to list out the areas that this provision applies to, rather than leaving it to secondary legislation. I also welcome the strengthening of provisions to protect the right to free speech and democratic debate more broadly, although I will come on to a couple of concerns, and the dropping of the new harmful communications offence in the original Bill. It is clear, from what we have heard so far today, that there will be proposals to move backwards—as I would see it—to the original version of the Bill. I hope that the Government will be robust on that, having taken the position that they have.
Although the Bill is less damaging, it must still be fit for purpose. With 25,000 companies in its scope, it also affects virtually every individual in the country, so it is important that it is clear and usable and does not encourage companies to be too risk averse. With that in mind, there are areas for improvement. Given the time constraints, I will focus on free speech.
I believe that in a free society, adults—not children but adults—should be able to cope with free debate, if they are given the tools to do so. Noble Lords have spoken already about the abuse that they get online, and we all do. I am sure I am not unique in that; some of it drifts into the real world as well, from time to time. However, I do not look to the Government to defend me from it. I already have most of the tools to turn that off when I want to, which I think is the right approach. It is the one that the Government are pursuing. Free speech is the best way of dealing with controversial issues, as we have seen in the last few weeks, and it is right for the Government to err on the side of caution and not allow a chilling effect in practice.
With this in mind, there are a couple of improvements that I hope the Government might consider. For example, they could require an opt-out from seeing the relevant “legal but harmful” content, rather than an opt-in to see it, and ensure those tools are easy to use. There is otherwise a risk that risk-averse providers will block controversial content and people will not even know about it. It could be useful to require providers to say how they intend to protect freedom of speech, just as they are required to say explicitly how they will manage the Clause 12 provisions. Without that, there is some risk that freedom of speech may become a secondary objective.
To repeat, there has been considerable improvement overall. I welcome my noble friend the Minister’s commitment to listen carefully to all proposals as we take the Bill through in this House. I am happy to support him in enabling the passage of this legislation in good order soon.
My Lords, I am grateful to the very many noble Lords who have spoken this afternoon and this evening. They have spoken with passion—we heard that in the voices of so many—about their own experiences, the experiences of their families and the experiences of far too many of our fellow subjects, who have harrowing examples of the need for this Bill. But noble Lords have also spoken with cool-headed precision and forensic care about the aspects of the Bill that demand our careful scrutiny. Both hearts and heads are needed to make this Bill worth the wait.
I am very grateful for the strong consensus that has come through in noble Lords’ speeches on the need to make this Bill law and to do so quickly, and therefore to do our work of scrutiny diligently and speedily. I am grateful for the very generous and public-spirited offer the noble Lord, Lord Stevenson, has just issued. I, too, would like to make this not a party-political matter; it is not and has not been in the speeches we have heard today. The work of your Lordships’ House is to consider these matters in detail and without party politics intruding, and it would be very good if we could proceed on the basis of collaboration, co-operation and, on occasion, compromise.
In that spirit, I should say at the outset that I share the challenge faced by the noble Lords, Lord Clement-Jones and Lord Stevenson. Given that so many speakers have chosen to contribute, I will not be able to cover or acknowledge everyone who has spoken. I shall undoubtedly have to write on many of the issues to provide the technical detail that the matters they have raised deserve. It is my intention to write to noble Lords and invite them to join a series of meetings to look in depth at some of the themes and areas between now and Committee, so that as a group we can have well-informed discussions in Committee. I shall write with details suggesting some of those themes, and if noble Lords feel that I have missed any, or particular areas they would like to continue to talk about, please let me know and I will be happy to facilitate those.
I want to touch on a few of the issues raised today. I shall not repeat some of the points I made in my opening speech, given the hour. Many noble Lords raised the very troubling issue of children accessing pornography online, and I want to talk about that initially. The Government share the concerns raised about the lack of protections for children from this harmful and deeply unsuitable content. That is why the Bill introduces world-leading protections for children from online pornography. The Bill will cover all online sites offering pornography, including commercial pornography sites, social media, video-sharing platforms and fora, as well as search engines, which play a significant role in enabling children to access harmful and age-inappropriate content online. These companies will have to prevent children accessing pornography or face huge fines. To ensure that children are protected from this content, companies will need to put in place measures such as age verification, or demonstrate that the approach they are taking delivers the same level of protection for children.
While the Bill does not mandate that companies use specific technologies to comply with these new duties, in order to ensure that the Bill is properly future-proofed, we expect Ofcom to take a robust approach to sites which pose the highest risk of harm to children, including sites hosting online pornography. That may include directing the use of age verification technologies. Age verification is also referred to in the Bill. This is to make clear that these are measures that the Government expect to be used for complying with the duties under Part 3 and Part 5 to protect children from online pornography. Our intention is to have the regime operational as soon as possible after Royal Assent, while ensuring that the necessary preparations are completed effectively and that service providers understand what is expected of them. We are working very closely with Ofcom to ensure this.
The noble Lord, Lord Morrow, and others asked about putting age verification in the Bill more clearly, as was the case with the Digital Economy Act. The Online Safety Bill includes references to age assurance and age verification in the way I have just set out. That is to make clear that these are measures which the Government expect to be used for complying with the duties where proportionate to do so. While age assurance and age verification are referred to in the Bill, the Government do not mandate the use of specific approaches or technologies. That is similar to the approach taken in the Digital Economy Act, which did not mandate the use of a particular technology either.
I think my noble friend Lord Bethell prefers the definition of pornography in Part 3 of the Digital Economy Act. There is already a robust definition of “pornographic content” in this Bill which is more straightforward for providers and Ofcom to apply. That is important. The definition we have used is similar to the definition of pornographic content used in existing legislation such as the Coroners and Justice Act 2009. It is also in line with the approach being taken by Ofcom to regulate UK-established video-sharing platforms, meaning that the industry will already have familiarity with this definition and that Ofcom will already have experience in regulating content which meets this definition. That means it can take action more swiftly. However, I have heard the very large number of noble Lords who are inclined to support the work that my noble friend is doing in the amendments he has proposed. I am grateful for the time he has already dedicated to conversations with the Secretary of State and me on this and look forward to discussing it in more detail with him between now and Committee.
A number of noble Lords, including the noble Baronesses, Lady Finlay of Llandaff and Lady Kennedy of The Shaws, talked about algorithms. All platforms will need to undertake risk assessments for illegal content. Services likely to be accessed by children will need to undertake a children's risk assessment to ensure they understand the risks associated with their services. That includes taking into account in particular the risk of algorithms used by their service. In addition, the Bill includes powers to ensure that Ofcom is able effectively to assess whether companies are fulfilling their regulatory requirements, including in relation to the operation of their algorithms. Ofcom will have the power to require information from companies about the operation of their algorithms and the power to investigate non-compliance, as well as the power to interview employees. It will have the power to require regulated service providers to undergo a skilled person's report and to audit company systems and processes, including in relation to their algorithms.
The noble Baroness, Lady Kidron, rightly received many tributes for her years of work in relation to so many aspects of this Bill. She pressed me on bereaved parents’ access to data and, as she knows, it is a complex issue. I am very grateful to her for the time she has given to the meetings that the Secretary of State and I have had with her and with colleagues from the Ministry of Justice on this issue, which we continue to look at very carefully. We acknowledge the distress that some parents have indeed experienced in situations such as this and we will continue to work with her and the Ministry of Justice very carefully to assess this matter, mindful of its complexities which, of course, were something the Joint Committee grappled with as well.
The noble Baroness, Lady Featherstone, my noble friend Lady Wyld and others focused on the new cyberflashing offence and suggested that a consent-based approach would be preferable. The Law Commission looked at that in drawing up its proposals for action in this area. The Law Commission’s report raised concerns about the nature of consent in instant messaging conversations, particularly where there are misjudged attempts at humour or intimacy that could particularly affect young people. There is a risk, which we will want to explore in Committee, of overcriminalising young people. That is why the Government have brought forward proposals based on the Law Commission’s work. If noble Lords are finding it difficult to see the Law Commission’s reports, I am very happy to draw them to their attention so that they can benefit from the consultation and thought it conducted on this difficult issue.
The noble Baroness, Lady Gohir, talked about the impact on body image of edited images in advertising. Through its work on the online advertising programme, DCMS is considering how the Government should approach advertisements that contribute to body image concerns. A consultation on this programme closed in June 2022. We are currently analysing the responses to the consultation and developing policy. Where there is harmful user-generated content related to body image that risks having an adverse physical or psychological impact on children, the Online Safety Bill will require platforms to take action against that. Under the Bill’s existing risk assessment duties, regulated services are required to consider how media literacy can be used to mitigate harm for child users. That could include using content provenance technology, which can empower people to identify when content has been digitally altered in ways such as the noble Baroness mentioned.
A number of noble Lords focused on the changes made in relation to the so-called “legal but harmful” measures to ensure that adults have the tools they need to curate and control their experience online. In particular, noble Lords suggested that removing the requirement for companies to conduct risk assessments in relation to a list of priority content harmful to adults would reduce protections available for users. I do not agree with that assessment. The new duties will empower adult users to make informed choices about the services they use and to protect themselves on the largest platforms. The new duties will require the largest platforms to enforce all their terms of service regarding the moderation of user-generated content, not just the categories of content covered in a list in secondary legislation. The largest platforms already prohibit the most abusive and harmful content. Under the new duties, platforms will be required to keep their promises to users and take action to remove it.
There was rightly particular focus on vulnerable adult users. The noble Baronesses, Lady Hollins and Lady Campbell of Surbiton, and others spoke powerfully about that. The Bill will give vulnerable adult users, including people with disabilities, greater control over their online experience too. When using a category 1 service, they will be able to reduce their exposure to online abuse and hatred by having tools to limit the likelihood of their encountering such content or to alert them to the nature of it. They will also have greater control over content that promotes, encourages or provides instructions for suicide, self-harm and eating disorders. User reporting and redress provisions must be easy to access by all users, including people with a disability and adults with caring responsibilities who are providing assistance. Ofcom is of course subject to the public sector equality duty as well, so when performing its duties, including writing its codes of practice, it will need to take into account the ways in which people with protected characteristics, including people with disabilities, can be affected. I would be very happy to meet the noble Baronesses and others on this important matter.
The noble Lords, Lord Hastings of Scarisbrick and Lord Londesborough, and others talked about media literacy. The Government fully recognise the importance of that in achieving online safety. As well as ensuring that companies take action to keep users safe through this Bill, we are taking steps to educate and empower them to make safe and informed choices online. First, the Bill strengthens Ofcom’s existing media literacy functions. Media literacy is included in Ofcom’s new transparency reporting and information-gathering powers. In response to recommendations from the Joint Committee, the legislation also now specifies media literacy in the risk-assessment duties. In July 2021, DCMS published the online media literacy strategy, which sets out our ambition to improve national media literacy. We have committed to publishing annual action plans in each financial year until 2024-25, setting out our plans to deliver that. Furthermore, in December of that year, Ofcom published Ofcom’s Approach to Online Media Literacy, which includes an ambitious range of work focusing on media literacy.
Your Lordships’ House is, understandably, not generally enthusiastic about secondary legislation and secondary legislative powers, so I was grateful for the recognition by many tonight of the importance of providing for them in certain specific instances through this Bill. As the noble Lord, Lord Brooke of Alverthorpe, put it, there may be loopholes that Parliament wishes to close, and quickly. My noble friend Lord Inglewood spoke of the need for “living legislation”, and it is important to stress, as many have, that this Bill seeks to be technology-neutral—not specifying particular technological approaches that may quickly become obsolete—in order to cater for new threats and challenges as yet not envisaged. Some of those threats and challenges were alluded to in the powerful speech of my noble friend Lord Sarfraz. I know noble Lords will scrutinise those secondary powers carefully. I can tell my noble friend that the Bill does apply to companies that enable users to share content online or interact with each other, as well as to search services. That includes a broad range of services, including the metaverse. Where haptics enable user interaction, companies must take action. The Bill is also clear that content generated by bots is in scope where it interacts with user-generated content such as on Twitter, but not if the bot is controlled by or on behalf of the service, such as providing customer services for a particular site.
Given the range of secondary powers and the changing technological landscape, a number of noble Lords understandably focused on the need for post-legislative scrutiny. The Bill has undoubtedly benefited from pre-legislative scrutiny. As I said to my noble friend Lady Stowell of Beeston in her committee last week, we remain open-minded on the best way of doing that. We must ensure that once this regime is in force, it has the impact we all want it to have. Ongoing parliamentary scrutiny will be vital in ensuring that is the case. We do not intend to legislate for a new committee, not least because it is for Parliament itself to decide what committees it sets up. But I welcome further views on how we ensure that we have effective parliamentary scrutiny, and I look forward to discussing that in Committee. We have also made it very clear that the Secretary of State will undertake a review of the effectiveness of the regime between two and five years after it comes into force, producing a report that will then be laid in Parliament, thus providing a statutory opportunity for Parliament to scrutinise the effectiveness of the legislation.
My noble friend and other members of her committee followed up with a letter to me about the Secretary of State’s powers. I shall reply to that letter in detail and make that available to all noble Lords to see ahead of Committee. This is ground-breaking legislation, and we have to balance the need for regulatory independence with the appropriate oversight for Parliament and the Government. In particular, concerns were raised about the Secretary of State’s power of direction in Clause 39. Ofcom’s independence and expertise will be of utmost importance here, but the very broad nature of online harms means that there may be subjects that go beyond its expertise and remit as a regulator. That was echoed by Ofcom itself when giving evidence to the Joint Committee: it noted that there will clearly be some issues in respect of which the Government have access to expertise and information that the regulator does not, such as national security.
The framework in the Bill ensures that Parliament will always have the final say on codes of practice, and the use of the affirmative procedure will further ensure that there is an increased level of scrutiny in the exceptional cases where that element of the power is used. As I said, I know that we will look at that in detail in Committee.
My noble friend Lord Black of Brentwood, quoting Stanley Baldwin, talked about the protections for journalistic content. He and others are right that the free press is a cornerstone of British democracy; that is why the Bill has been designed to protect press and media freedom and why it includes robust provisions to ensure that people can continue to access diverse news sources online. Category 1 companies will have a new duty to safeguard all journalistic content shared on their platform, which includes citizen journalism. Platforms will need to put systems and processes in place to protect journalistic content, and they must enforce their terms of service consistently across all moderation and in relation to journalistic content. They will also need to put in place expedited appeals processes for producers of journalistic content.
The noble Baroness, Lady Anderson of Stoke-on-Trent, spoke powerfully about the appalling abuse and threats of violence she sustained in the course of her democratic duties, and the noble Baroness, Lady Foster, spoke powerfully of the way in which that is putting people off, particularly women, from going into public life. The noble Baroness, Lady Anderson, asked about a specific issue: the automatic deletion of material and the implications for prosecution. We have been mindful of the scenario where malicious users post threatening content which they then delete themselves, and of the burden on services that retaining that information in bulk would cause. We have also been mindful of the imperative to ensure that illegal content cannot be shared and amplified online by being left there. The retention of data for law enforcement purposes is strictly regulated, particularly through the Investigatory Powers Act, which the noble Lord, Lord Anderson of Ipswich, is reviewing at the request of the Home Secretary. I suggest that the noble Baroness and I meet to speak about that in detail, mindful of that ongoing review and the need to bring people to justice.
The noble Baroness, Lady Chakrabarti, asked about sex for rent. Existing offences can be used to prosecute that practice, including Sections 52 and 53 of the Sexual Offences Act 2003, both of which are listed as priority offences in Schedule 7 to the Bill. As a result, all in-scope services must take proactive measures to prevent people being exposed to such content.
The noble Lord, Lord Davies of Brixton, and others talked about scams. The largest and most popular platforms and search engines—category 1 and category 2A services in the Bill—will have a duty to prevent paid-for fraudulent adverts appearing on their services, making it harder for fraudsters to advertise scams online. We know that that can be a particularly devastating crime. The online advertising programme builds on this duty in the Bill and will look at the role of the whole advertising system in relation to fraud, as well as the full gamut of other harms which are caused.
My noble friend Lady Fraser talked about the devolution aspects, which we will certainly look at. Internet services are a reserved matter for the UK Government. The list of priority offences in Schedule 7 can be updated only by the Secretary of State, subject to approval by this Parliament.
The right reverend Prelate the Bishop of Manchester asked about regulatory co-operation, and we recognise the importance of that. Ofcom has existing and strong relationships with other regulators, such as the ICO and the CMA, which has been supported and strengthened by the establishment of the Digital Regulation Cooperation Forum in 2020. We have used the Bill to strengthen Ofcom’s ability to work closely with, and to disclose information to, other regulatory bodies. Clause 104 ensures that Ofcom can do that, and the Bill also requires Ofcom to consult the Information Commissioner.
I do not want to go on at undue length—I am mindful of the fact that we will have detailed debates on all these issues and many more in Committee—but I wish to conclude by reiterating my thanks to all noble Lords, including the many who were not able to speak today but to whom I have already spoken outside the Chamber. They all continue to engage constructively with this legislation to ensure that it meets our shared objectives of protecting children and giving people a safe experience online. I look forward to working with noble Lords in that continued spirit.
My noble friend Lady Morgan of Cotes admitted to being one of the cavalcade of Secretaries of State who have worked on this Bill; I pay tribute to her work both in and out of office. I am pleased that my right honourable friend the Secretary of State was here to observe part of our debate today and, like all noble Lords, I am humbled that Ian Russell has been here to follow our debate in its entirety. The experience of his family and too many others must remain uppermost in our minds as we carry out our duty on the Bill before us; I know that it will be. We have an important task before us, and I look forward to getting to it.