Lords Chamber

My Lords, I thank my noble friend Lord Holmes of Richmond for moving this amendment. I am sure we can all agree that the ICO should encourage and accommodate innovation. As I noted during the first day on Report, in a world where trade and business are ever more reliant on cross-border data transfers, data adequacy becomes ever more important.
In Committee, the noble Baroness, Lady Jones of Whitchurch, was able to give the House the reassurance that this Bill was designed with EU adequacy in mind. We were pleased to hear that the Government’s course of action is not expected to put this at risk. I also suggest that this Bill represents even less of a departure from GDPR than did its predecessor, the DPDI Bill.
We welcome the Government’s assurances, but we look to them to address the issues raised by my noble friend Lord Holmes. I think we can all agree that he has engaged constructively and thoughtfully on this Bill throughout.
I thank the noble Lord, Lord Holmes, for his Amendment 38 relating to the ICO’s innovation duty. I agree with his comments about the quality of our regulators.
I reiterate the statements made throughout the Bill debates that the Government are committed to the ongoing independence of the ICO as a regulator and have designed the proposals in the Bill with retaining EU adequacy in mind. The commissioner’s status as an independent supervisory authority for data protection is assured. The Information Commissioner has discretion over the application of his new duties. It will be for him to set out and justify his activities in relation to those duties to Parliament.
To answer the specific point, as well as that raised by the noble Lord, Lord Clement-Jones, considerations of innovation will not come at the expense of the commissioner’s primary objective to secure an appropriate level of protection for personal data. I hope that reassures the noble Lord.
I thank all noble Lords who have taken part in this short debate and thank the Minister for his response. I believe my wording would assist the ICO in its mission, but I have listened to what the Minister has said and, for the time being, I beg leave to withdraw the amendment.
My Lords, I thank the noble Baroness, Lady Kidron, for moving her amendment. Before I begin, let me declare my interest as a recently appointed director of Lumi, an edtech provider—but for graduates, not for schools.
AI has the potential to revolutionise educational tools, helping teachers spend less time on marking and more time on face-to-face teaching with children, creating more innovative teaching tools and exercises and facilitating more detailed feedback for students. AI presents a real opportunity to improve education outcomes for children, opening more opportunities throughout their lives. There are deeply compelling promises in edtech.
However—there is always a however when we talk about edtech—creating and using AI education tools will require the collection and processing of children’s personal data. This potentially includes special category data—for instance, medical information pertaining to special educational needs such as dyslexia. Therefore, care must be taken in regulating how this data is collected, stored, processed and used. Without this, AI poses a major safeguarding risk. We share the concerns of the noble Baroness, Lady Kidron, and wholeheartedly support the spirit of her amendment.
We agree that it is prudent to require the ICO to make a code of practice on children’s data and education, and I particularly welcome a requirement on the ICO to consult with and involve parents. Parents know their children best, needless to say, and have their best interests at heart; their input will be critical in building trust in AI-assisted educational tools and facilitating their rollout and benefits for children throughout the UK.
However, as I said earlier on Report—and I shall not repeat the arguments now—we have concerns about the incorporation of international law into our law, and specifically, in this instance, the UN Convention on the Rights of the Child. We cannot therefore support the amendment as drafted. That said, we hope very much that the Government will listen carefully to the arguments raised here and take steps to introduce appropriate safeguards for children and young people in our data legislation regime. I suspect that most parents will greatly welcome more reassurance about the use of their children’s data.
I thank the noble Baroness, Lady Kidron, for raising this important topic today, and thank noble Lords for the impassioned speeches that we have heard. As my noble friend Lady Jones mentioned in Committee, the ICO has been auditing the practices of several edtech service providers and is due to publish its findings later this year. I am pleased to be able to give the noble Baroness, Lady Kidron, a firm commitment today that the Government will use powers under the Data Protection Act 2018 to require the ICO to publish a new code of practice addressing edtech issues.
The noble Baronesses, Lady Kidron and Lady Harding, both raised important points about the specifics, and I will try to address some of those. I am grateful to the noble Baroness for her suggestions about what the code should include. We agree that the starting point for the new code should be that children merit special protection in relation to their personal data because they may be less aware of the risks and their rights in relation to its processing. We agree that the code should include guidance for schools on how to comply with their controller duties in respect of edtech services, and guidance for edtech services on fulfilling their duties under the data protection framework—whether as processors, controllers or joint controllers. We also agree that the code should provide practical guidance for organisations on how to comply with their so-called:
“Data protection by design and by default”
duties. This would help to ensure that appropriate technical and organisational measures are implemented in the development and operation of processing activities undertaken by edtech services.
The noble Baroness suggested that the new code should include requirements for the ICO to develop the code in consultation with children, parents, educators, children’s rights advocates, devolved Governments and industry. The commissioner must already consult trade associations, data subjects and persons who appear to the commissioner to represent the interest of data subjects before preparing a code, but these are very helpful suggestions. The development of any new code will also follow the new procedures introduced by Clause 92 of this Bill. The commissioner would be required to convene an expert panel to inform the development of the code and publish the draft code. Organisations and individuals affected by the code would be represented on the panel, and the commissioner would be required to consider its recommendations before publishing the code.
Beyond this, we do not want to pre-determine the outcome of the ICO’s audits by setting out the scope of the code on the face of the Bill now. The audits might uncover new areas where guidance is needed. Ensuring a clear scope for a code, grounded in evidence, will be important. We believe that allowing the ICO to complete its audits, so that the findings can inform the breadth and focus of the code, is appropriate.
The ICO will also need to carefully consider how its codes interrelate. For example, the noble Baroness suggested that the edtech code should cover edtech services that are used independently by children at home and the use of profiling to make predictions about a child’s attainment. Such processing activities may also fall within the scope of the age-appropriate design code and the proposed AI code, respectively. We need to give the ICO the flexibility to prepare guidance for organisations in a way that avoids duplication. Fully understanding the problems uncovered by the ICO audits will be essential to getting the scope and content of each code right and reducing the risk of unintended consequences.
To complement any recommendations that come from the ICO and its audits, the Department for Education will continue to work with educators and parents to help them to make informed choices about the products and services that they choose to support teaching and learning. The noble Baroness’s suggestion that there should be a certification scheme for approved edtech service providers is an interesting one that we will discuss with colleagues in the Department for Education. However, there might be other solutions that could help schools to make safe procurement decisions, and it would not be appropriate to use the ICO code to mandate a specific approach.
The point about schools and the use of work by children is clearly important; our measures are intended to increase the protections for children, not to reduce them. The Government will continue to work closely with noble Lords, the Department for Education, the ICO and the devolved regions as we develop the necessary regulations following the conclusion of the ICO audit. I hope that the noble Baroness is pleased with this commitment and as such feels content to withdraw her amendment.
May I ask for a commitment from the Dispatch Box that, when the order is complete and some of those conversations are taking place, we can have a meeting with the ICO, the DfE and noble Lords who have fought for this since 2018?
I am very happy to give that commitment. That would be an important and useful meeting.
I thank the Minister and the Government. As I have just said, we have been fighting for this since 2018, so that is quite something. I forgot to say in my opening remarks that edtech does not, of course, have an absolute definition. However, in my mind—it is important for me to say this to the House—it includes management, safety and tech that is used for educational purposes. All those are in schools, and we have evidence of problems with all of them. I was absolutely delighted to hear the Government’s commitments, and I look forward to working with the ICO and the department. With that, I beg leave to withdraw.
I thank the noble Baroness, Lady Kidron, for moving this incredibly important group and all those speakers who have made the arguments so clearly and powerfully. I pay tribute to the noble Baroness’s work on copyright and AI, which is so important for our arts and culture sector. As noble Lords have rightly said, our cultural industries make an enormous contribution to our country, not just in cultural terms but in economic ones, and we must ensure that our laws do not put that future at risk.
In the build-up to this debate I engaged with great pleasure with the noble Baroness, Lady Kidron, and on these Benches we are sympathetic to her arguments. Her Amendment 61 would require the Government to make regulations in this area. We accept the Government’s assurance that this is something they will seek to address, and I note the Minister’s confirmation that their consultation will form the basis of the Government’s approach to this issue. Given the importance of getting this right, and given that the Government’s consultation is in mid-flight, our view is that we have to allow it to do its work. Whatever view we take of the design and the timing of the consultation, it offers for now a way forward that will provide evidence on some of the serious concerns expressed here. That said, we will take a great interest in the progress and outcomes of the consultation and will come back to this in future should the Government’s approach prove unsatisfactory.
Amendment 75 in my name also seeks to address the challenge that the growth in AI poses to our cultural industries. One of the key challenges in copyright and AI is enforceability. Copyright can be enforced only when we know it has been infringed. The size and the international distribution of AI training models render it extremely challenging to answer two fundamental questions today: first, was a given piece of content used in a training model; and secondly, if so, in what jurisdiction did that use take place? If we cannot answer these questions, enforcement can become extremely hard, so a necessary, if not sufficient, part of the solution will be a digital watermark—a means of putting some red dye in the water where copyrighted material is used to train AIs. It could also potentially provide an automated means for content creators to opt out, with a vastly more manageable administrative burden.
I thank the Minister for his constructive engagement on digital watermarking and look to him to give the House an assurance that the Government will bring forward a plan to develop a technological standard for a machine-readable digital watermark. I hope that, if and when he does so, he is able to indicate both a timeline and an intention to engage internationally. Subject to receiving such reassurances when he rises, I shall not move my amendment.
I congratulate the noble Baroness, Lady Kidron, on her excellent speech. I know that she feels very strongly about this topic and the creative industries, as do I, but I also recognise what she said about junior Ministers. I have heard the many noble Lords who have spoken, and I hope they will forgive me if I do not mention everyone by name.
It is vital that we get this right. We need to give creators better, easier and practical control over their rights, allow appropriate access to training material by AI firms and, most importantly, ensure there is real transparency in the system, something that is currently lacking. We need to do this so that we can guarantee the continued success of our creative industries and fully benefit from what AI will bring.
I want to make it clear, as others have, that these two sectors are not mutually exclusive; it is not a case of picking sides. Many in the creative industries are themselves users or developers of AI technology. We want to ensure that the benefits of this powerful new technology are shared, which was a point made by the noble Baroness, Lady Stowell, and her committee.
It is obvious that these are complex issues. We know that the current situation is unsatisfactory in practice for the creative industries and the AI sector. That is why we have launched a detailed consultation on what package of measures can be developed to benefit both the creative industries and the AI sector. This is a genuine consultation. Many people from a range of sectors are engaging with us to share their views and evidence. It is important, and indeed essential, that we fully consider all responses provided in the consultation before we act. Not to do so would be a disservice to all those who are providing important input and would narrow our chance to get the right solution.
I agree wholeheartedly with the noble Baroness and many other noble Lords, including the noble Lord, Lord Freyberg, on the importance of transparency about the creative content used to train AI. Transparency, about both inputs and outputs, is a key objective in the Government’s consultation on copyright and AI. This very ability to provide transparency is at the centre of what is required. The consultation also contains two other vital objectives alongside transparency: practical and clear control and reward for rights holders over the use of their work. This is quite the opposite of the notion of giving away their hard work or theft. It is about increasing their control and ensuring access to data for AI training.
The Government certainly agree with the spirit of the amendments on transparency and web crawlers and the aims they are trying to achieve—that creators should have more clarity over which web crawlers can access their works and be able to block them if they wish, and that they should be able to know what has been used and by whom and have mechanisms to be appropriately reimbursed. However, it would be premature to commit to very specific solutions at this stage of the consideration of the consultation.
We want to consider these issues more broadly than the amendments before us, which do not take into account the fact that web crawling is not the only way AI models are trained. We also want to ensure that any future measures are not disproportionate for small businesses and individuals. There is a risk that legislating in this way will not be flexible enough to keep pace with rapid developments in the AI sector or new web standards. A key purpose of our consultation is to ensure that we have the full benefit of views on how to approach these issues, so that any legislation will be future-proof and able to deliver concrete and sustainable benefits for the creators. The preferred option in the consultation is one proposal; this is a consultation to try to find the right answer and all the proposals will be considered on their merits.
The Government are also committed to ensuring that rights holders have real control over how their works are used. At the moment, many feel powerless over the use of their works by AI models. Our consultation considers technological and other means that can help to ensure that creators’ wishes are respected in practice. We want to work with industry to develop simple and reliable ways to do this that meet agreed standards, in reference to the point made by the noble Viscount, Lord Camrose.
Technical standards are an important part of this. Standards will be required to prevent web crawlers from accessing certain datasets. Standards will be needed for control at the metadata level and for watermarking. I agree with the noble Viscount, Lord Camrose, that standards on the use of watermarks or metadata could have a number of benefits for those who wish to control or license the use of their content with AI. Standards on the use of web crawlers may also improve the ability of rights holders to prevent the use of their works against their wishes. We will actively support the development of new standards and the application of existing ones. We see this as a key part of what is needed. We do not intend to implement changes in this area until we are confident that they will work in practice and are easy to use.
I also want to stress that our data mining proposals relate only to content that has been lawfully made available, so they will not apply to pirated copies. Existing copyright law will continue to apply to the outputs of AI models, as it does today. People will not be able to use AI as a cover for copyright piracy. With improved transparency and control over inputs, we expect that the likelihood of models generating infringing output will be greatly reduced.
I thank the noble Lord, Lord Clement-Jones, for Amendment 46. It would require a review of the impact of transferring all data protection-related cases to the relevant tribunals. Currently there is a mixture of jurisdictions for tribunals and courts for data protection cases, depending on the nature of the proceedings. This is on the basis that certain claims are deemed appropriate for tribunal, while others are appropriate for courts, where stricter rules of evidence and procedure apply—for example, in dealing with claims by data subjects against controllers for compensation due to breaches of data protection legislation. As such, the current system already provides clear and appropriate administrative and judicial redress routes for data subjects seeking to exercise their rights.
Tribunals are in many cases the appropriate venue for data protection proceedings, including appeals by controllers against enforcement action or applications by data subjects for an order that the ICO should progress a complaint. Claims by individuals against businesses or other organisations for damages arising from breach of data protection law fall under the jurisdiction of courts rather than tribunals. This is appropriate, given the likely disparity between the resources of the respective parties, because courts apply stricter rules of evidence and procedures than tribunals. While court proceedings can, of course, be more costly, successful parties can usually recover their costs, which would not always be the case in tribunals.
I hope that the noble Lord agrees that there is a rationale for these different routes and that a review to consider transfer of jurisdictions to tribunals is therefore not necessary at this time.
My Lords, I thank the Minister for that dusty reply. I wonder whether he has been briefed about particular legal cases, such as Killock or Delo, where the judiciary themselves were confused about the nature of the different jurisdictions of tribunal and court. The Minister and, indeed, the noble Viscount, Lord Camrose, seemed to make speeches on the basis that all is wonderful and the jurisdiction of the courts and tribunals is so clearly defined that we do not need a review. That is not the case and, if the Minister were better briefed about the obiter, if not the judgments, in Delo and Killock, he might appreciate that there is considerable confusion about jurisdiction, as several judges have commented.
I am very disappointed by the Minister’s reply. I think that there will be several judges jumping up and down, considering that he has not really looked at the evidence. The Minister always says that he is very evidence-based. I very much hope that he will take another look at this—or, if he does not, that the MoJ will—as there is considerably greater merit in the amendment than he accords. However, I shall not press this to a vote and I beg leave to withdraw the amendment.
I thank my noble friend Lord Holmes for tabling the amendments in this group. I, too, believe these amendments would improve the Bill. The nature of computing and data processing has fundamentally changed since the Computer Misuse Act 1990. Third parties hold and process immense quantities of data, and the means of accessing and interacting with that data have become unrecognisably more sophisticated. Updating the definition of unauthorised computer access through Amendment 48 is a sensible reform, as the new definition takes into account that data controllers and processors now hold substantial quantities of personal data. These entities are responsible for the security of the data they hold, so their provisions on access become legally relevant, and the amendment reflects this.
When updating an offence, it is equally necessary to consider the legal defences, as my noble friend has rightly done in Amendment 47 by protecting individuals accessing information to detect or prevent a crime or whose actions are in the public interest. We on these Benches feel these amendments are wholly sensible. I urge the Minister to listen to the persuasive argument that my noble friend Lord Holmes has made and consider how we can deliver these improvements to our data legislation.
I am grateful to the noble Lord, Lord Holmes, for raising this topic through Amendments 47 and 48. I am very aware of this issue and understand the strength of feeling about reforming the Computer Misuse Act, as we have heard from the noble Lord, Lord Arbuthnot, and the noble Earl, Lord Erroll.
As the noble Lord, Lord Clement-Jones, rightly pointed out, when I was the Government Chief Scientific Adviser I conducted a review making recommendations on pro-innovation regulation of technologies and I made recommendations on the issues these amendments raise. These recommendations were accepted by the previous Government.
The Government are actively taking forward these recommendations as part of the ongoing review of the Act. These issues are, of course, complex and require careful consideration. The introduction of these specific amendments could unintentionally pose more risk to the UK’s cybersecurity, not least by inadvertently creating a loophole for cybercriminals to exploit to defend themselves against a prosecution.
Our engagement with stakeholders has revealed differing views, even among industry. While some industry partners highlight the noble Lord’s view that the Computer Misuse Act may prevent legitimate public interest activity, others have concerns about the unintended consequences. Law enforcement has considerable concerns that allowing unauthorised access to systems under the pretext of identifying vulnerabilities could be exploited by cybercriminals. Without robust safeguards and oversight, this amendment could significantly hinder investigations and place a burden on law enforcement partners to establish whether a person’s actions were in the public interest.
Further work is required to consider the safeguards that would need to accompany any introduction of statutory defences. The Government will continue to work with the cybersecurity industry, the National Cyber Security Centre and law enforcement agencies on this issue. The Home Office will provide an update in due course, once the proposals have been finalised—or, in the words of the noble Lord, Lord Clement-Jones, they will pop out of the bowels of the Home Office in due course. With these reassurances in mind, I hope the noble Lord will feel able to withdraw his amendments.
My Lords, I thank everybody who has taken part in this short debate. I was really hoping that we would not hear the phrase “the bowels of the Home Office” twice, but we did—now we have heard it three times. Perhaps it could be the title of somebody’s autobiography. I do not know whose, but I claim the IP rights even though the noble Lord, Lord Clement-Jones, said it first.
I am grateful for the Minister’s response. It would probably have been better to have some sense of timeline; much of what he said was very much what we heard in Committee. We are all amenable to having a course of action, but it needs more objectives attached to it as to when we are likely to see some consequences, action and changes. As every day goes by, as the Minister is well aware, risks that could be checked go unchecked, people who could be made safe are less safe, and economic growth, the Government’s priority, which could be enabled, is held back.
For now, I will withdraw my amendment, but I am minded to see what is possible between now and Third Reading, because the time is now; otherwise, “in due course” will be even longer than the official statement “later in the summer”. I beg leave to withdraw.
My Lords, I thank my noble friend Lord Lucas for introducing this group. Amendments 48A and 50A, in his name, would ensure that regulated professionals, including financial services firms, are able to comply with current and future regulatory requirements. The example my noble friend has given—the FCA’s expectation that firms communicate effectively with consumers—is a good one. Clearly, we must avoid a circumstance where regulators expect businesses to take action that is not possible due to limiting legislation governing data use and access. My noble friend has made a forceful case and I hope the Government will be able to give the House appropriate assurance that businesses will not be put in this position as a result of this legislation.
Amendment 48B, in the name of the noble Lord, Lord Clement-Jones, seeks to ban cookie paywalls. I opposed a similar amendment when we debated it in Committee as it actually seeks to curtail choice. Currently, users have the options to pay money and stay private, share personal data and read for free, or walk away. Faced with these options, for instance, I have sadly chosen to forgo my regular evening reading of the Daily Mail’s excellent sports pages, but I see no reason why that newspaper, or anyone else, should be compelled to provide anything for free. In fact, it has been very persuasively argued by Jaron Lanier, Shoshana Zuboff and many others that it is the fact that so much of the internet is apparently, but not actually, free that has caused a great deal of damage, rather than having an open charging model. This approach finally reveals the exact cash value of individuals’ data that websites are harvesting and offers users choice. We do not agree with attempts to remove that choice.
My Lords, I will start with Amendments 48A and 50A in the name of the noble Lord, Lord Lucas. The Government are aware that some financial services firms have raised concerns that the direct marketing rules in the privacy and electronic communications regulations prevent them supporting consumers in some instances. I appreciate the importance of the support that financial services firms provide to their customers to help them make informed decisions on matters such as their financial investments. The Government and the FCA are working closely together to improve the support available to consumers.
In December, the FCA launched an initial consultation on a new type of support for consumers with their investments and pensions called “targeted support”. Through this consultation, the FCA will seek feedback on any interactions between the proposals and the direct marketing rules. As my noble friend Lady Jones explained in the debate in Grand Committee, firms can already provide service or regulatory communication messages to their customers without permission, provided these messages are neutral in tone, factual and do not include promotional content. Promotional content can be sent if a consumer consents to receiving direct marketing. Messages which are not directed to a particular individual, such as online adverts shown to everyone who views a website, are also not prevented by the rules. I hope this explanation and the fact that there is ongoing work provide some reassurance to the noble Lord, Lord Lucas, that the Government are actively looking into this issue, and that, as such, he is content to withdraw his amendment.
Amendment 48B from the noble Lord, Lord Clement-Jones, is aimed at banning cookie paywalls. These generally work by giving web users the option to pay for a cookie-free browsing experience. Many websites are funded by advertising, and some publishers think that people should pay for a viewing experience without personalised advertising. As he rightly pointed out, the ICO released updated guidance on how organisations can deploy “consent or pay” models while still ensuring that consent is “freely given”. The guidance is detailed and outlines important factors that organisations should consider in order to operate legally. We encourage businesses to read this guidance and respond accordingly.
I note the important points that the noble Lord makes, and the counterpoints made by the noble Viscount, Lord Camrose. The Government will continue to engage with businesses, the ICO and users on these models, and on the guidance, but we do not think there is currently a case for taking action to ban the practice. I therefore hope the noble Lord will not press his amendment.
My Lords, I am grateful to the Minister for that explanation. I will, for the moment, be content to know that the Government are continuing to discuss this. There is a real problem here that will need to be dealt with, but if the Government are engaged they will inevitably find themselves having to deal with it. There are some occasions in regulatory messages where you need to make options clear: “You need to do this or something else will happen and you’ll really disadvantage yourself”. The regulator will expect that, particularly where things such as pensions are concerned, but it is clearly a marketing message. It will be difficult to resolve, but I am happy to trust the Government to have a go at it and not to try to insist on the particular formulation of these amendments. I beg leave to withdraw my amendment.
I thank the noble Baroness, Lady Kidron, for introducing this group, and the noble Lord, Lord Clement-Jones, and the noble Earl, Lord Erroll, for their comments and contributions—particularly the salutary words of the noble Earl, Lord Erroll, on the role of the Executive here, which were very enlightening.
I agree with the noble Baroness, Lady Kidron, that Parliament should have the opportunity to scrutinise this secondary legislation. Online safety research is essential: as our lives become more and more digital, we must assess how it impacts us as people, and especially children, who are particularly vulnerable to online harms. This cannot be achieved unless researchers are able to access the unadulterated raw data. Therefore, I am sure that noble Lords—and our colleagues in the other place—would wish to scrutinise the legislation creating this access to ensure it is fit for purpose. This is why I support the spirit of Amendment 51.
Following on from this point, facilitating online harms research by making access requests enforceable under a pre-existing online safety regime, as per Amendment 52, certainly seems to me like a sensible measure. It would enable this vital research, as would Amendment 54, which removes the need to create a bespoke enforcement system for online safety research access.
Amendment 53 would also enable independent research into how online risks and harms impact different groups. This information would be extremely valuable to a broad range of stakeholders, including social media platforms, data controllers, schools, parents and parliamentarians. It would help us all identify groups who are at heightened risk of online harm, what type of harm they are at risk of, which measures have reduced this risk, which have exacerbated it and what we can all do to reduce this danger.
There are many people undertaking online safety research across the globe and we should look to help these researchers access data for the purposes of safety research, even if their location is outside the UK. Of course, adequate safeguards would need to be in place, which may be dictated to some extent by the location of the researcher. However, online safety research is a benefit for all of us and Amendment 55 would keep barriers to this research to a minimum.
I am sure we would all like to think that all data holders and processors would wish to assist with prevention of online harms. However, where commercial and moral imperatives compete, we sadly cannot always count on the latter winning out. Therefore, Amendment 56 is a sensible addition that would prevent contractual exclusion of research access on online safety grounds, ensuring that online safety risks cannot be hidden or obscured.
I thank the noble Baroness, Lady Kidron, for the amendments on researchers’ access to data for online safety research, an incredibly important topic. It is clear from Committee that the Government’s proposals in this clause are broadly welcomed. They will ensure that researchers can access the vital data they need to undertake an analysis of online safety risks to UK users, informing future online safety interventions and keeping people safe online.
Amendment 51 would compel the Secretary of State to make regulations for a researcher access framework, and to do so within 12 months. While I am sympathetic to the spirit of the noble Baroness’s amendment, a fixed 12-month timescale and requirement to make regulations may risk compressing the time and options available to develop the most effective and appropriate solution, as my noble friend Lady Jones outlined in Committee. Getting this right is clearly important. While we are committed to introducing a framework as quickly as possible, we do not want to compromise its quality. We need adequate time to ensure that the framework is fit for purpose, appropriately safeguarded and future-proofed for a fast-evolving technological environment.
As required by the Online Safety Act, Ofcom is currently preparing a report into the ways in which researchers can access data and the barriers that they face, as well as exploring how additional access might be achieved. This report will be published in July of this year. We are also committed to conducting a thorough consultation on the issue prior to any enforceable requirements coming into force. The Government intend to consult on the framework as soon as practicable after the publication of Ofcom’s report this summer.
Sufficient time is required for a thorough consultation with the wide range of interested stakeholders in this area, including the research community, civil society and industry. I know that the noble Baroness raised a concern in Committee that the Government would rely on Ofcom’s report to set the framework for the regime, but I can assure her that a robust evidence-gathering process is already under way. The framework will be informed by collaboration with key stakeholders and formal consultation, as well as being guided by evidence from Ofcom’s report on the matter. Once all interested parties have had their say and the consultation is completed, the Government expect to make regulations to install the framework. It is right that the Government commit to a full consultation process and do not seek to prejudge the outcomes of that process by including a mandatory requirement for regulations now.
Amendment 53 would seek to expand the list of examples of the types of provision that the regulations might make. Clause 123 gives non-exhaustive examples of what may be included in future regulations; it certainly does not limit those regulations to the examples given. Given the central importance of protecting children and vulnerable users online, a key aim of any future regulations would be to support researchers to conduct research into the different ways that various groups of people experience online safety, without the need for this amendment. Indeed, a significant driving force for establishing this framework in the first place is to improve the quality of research that can be undertaken to understand the risks to users online, particularly those faced by children. I acknowledge the point that the noble Baroness made about people of all ages. We would be keen to discuss this further with her as we consult on specific requirements as part of developing regulations.
I will touch on the point about legal privilege. We believe that routinely copying a lawyer into all emails and documents is not likely to attract legal privilege. Legal privilege protects communications between legal advisers and their clients that are created for the purpose of giving or receiving legal advice, or for the sole or dominant purpose of litigation. It would not be satisfactory just to copy everyone in on everything.
We are confident that we can draft regulations that will make it entirely clear that the legal right to data for research purposes cannot be avoided by tech companies seeking to rely on contractual provisions that purport to prevent the sharing of data for research purposes. Therefore, there is no need for a specific requirement in the Bill to override terms of service.
I thank the Minister for his very full answer. My legal adviser on my right—the noble and learned Lord, Lord Thomas of Cwmgiedd—let me know that I was in a good place here. I particularly welcome the Minister’s invitation to discuss Ofcom’s review and the consultation. Perhaps he would not mind if I brought some of my researcher friends with me to that meeting. With that, I beg leave to withdraw the amendment.