Commons Chamber
I beg to move, That the Bill be now read a Second time.
Data is already the fuel driving the digital age: it powers the everyday apps that we use; public services are being improved by its better use; and businesses rely on it to trade, produce goods and deliver services for their customers. But how we choose to use data in future will become even more important: it will determine whether we can grow an innovative economy with well-paid, high-skill jobs; it will shape our ability to compete globally in developing the technologies of the future; and it will increasingly say something about the nature of our democratic society. The great challenge for democracies, as I see it, will be how to use data to empower rather than control citizens, enhancing their privacy and sense of agency without letting authoritarian states—which, in contrast, use data as a tool to monitor and harvest information from citizens—dominate technological advancement and gain a competitive advantage over our companies.
The UK cannot step aside from the debate by simply rubber-stamping whatever iteration of the GDPR comes out of Brussels. We have in our hands a critical opportunity to take a new path and, in doing so, to lead the global conversation about how we can best use data as a force for good—a conversation in which using data more effectively and maintaining high data protection standards are seen not as contradictory but as mutually reinforcing objectives, because trust in this more effective system will build the confidence to share information. We start today not by kicking off a revolution, turning over the apple cart and causing a compliance headache for UK firms, but by beginning an evolution away from an inflexible one-size-fits-all regime and towards one that is risk-based and focused on innovation, flexibility and the needs of our citizens, scientists, public services and companies.
Businesses need data to make better decisions and to reach the right consumers. Researchers need data to discover new treatments. Hospitals need it to deliver more personalised patient care. Our police and security services need data to keep our people safe. Right now, our rules are too vague, too complex and often simply too confusing to understand. The GDPR is a good standard, but it is not the gold standard. People are struggling to utilise data to innovate, because they are tied up in burdensome activities that are not fundamentally useful in enhancing privacy.
A recently published report on compliance found that 81% of European publishers were unknowingly in breach of the GDPR, despite doing what they thought the law required of them. A YouGov poll from this year found that one in five marketing professionals in the UK report knowing absolutely nothing about the GDPR, despite being bound by it. It is not just businesses: the people whose privacy our laws are supposed to protect do not understand it either. Instead, they click away the thicket of cookie pop-ups just so they can see their screen.
The Bill will maintain the high standards of data protection that British people rightly expect, but it will also help the people who are most affected by data regulation, because we have co-designed it with those people to ensure that our regulation reflects the way in which real people live their lives and run their businesses.
Does the Minister agree that the retention and enhancement of public trust in data is a major issue, that sharing data is a major issue for the public, and that the Government must do more—perhaps she can tell us whether they intend to do more—to educate the public about how and where our data is used, and what powers individuals have to find out this information?
I thank the hon. Lady for her helpful intervention. She is right: as I said earlier, trust in the system is fundamental to whether citizens have the confidence to share their data and whether we can therefore make use of that data. She made a good point about educating people, and I hope that this debate will mark the start of an important public conversation about how people use data. One of the challenges we face is a complex framework which means that people do not even know how to talk about data, and I think that some of the simplifications we wish to introduce will help us to understand one of the fundamental principles to which we want our new regime to adhere.
My hon. Friend gave a long list of people who found the rules we had inherited from outside the UK challenging. She might add to that list Members of Parliament themselves. I am sure I am not alone in having been exasperated by being complained about to the Information Commissioner, in this case by a constituent who had written to me complaining about a local parish council. When I shared his letter with the parish council so that it could show how bogus his long-running complaint had been, he proceeded to file a complaint with the Information Commissioner’s Office because I had shared his phone number—which he had not marked as private—with the parish council, with which he had been in correspondence for several years. The Information Commissioner’s Office took that seriously. This sort of nonsense shows how over-restrictive regulations can be abused by people who are out to stir up trouble unjustifiably.
Let me gently say that if my right hon. Friend’s constituent was going to pick on one Member of Parliament with whom to raise this point, the Member of Parliament who does not, I understand, use emails would be one of the worst candidates. However, I entirely understand Members’ frustration about the current rules. We are looking into what we can do in relation to democratic engagement, because, as my right hon. Friend says, this is one of the areas in which there is not enough clarity about what can and cannot be done.
We want to reduce burdens on businesses, and above all for the small businesses that account for more than 99% of UK firms. I am pleased that the Under-Secretary of State for Business and Trade, my hon. Friend the Member for Thirsk and Malton (Kevin Hollinrake), is present to back up those proposals. Businesses that do not have the time, the money or the staff to spend precious hours doing unnecessary form-filling are currently being forced to follow some of the same rules as a billion-dollar technology company. We are therefore cutting the amount of pointless paperwork, ensuring that organisations only have to comply with rules on record-keeping and risk assessment when their processing activities are high-risk. We are getting rid of excessively demanding requirements to appoint data protection officers, giving small businesses much more flexibility when it comes to how they manage data protection risks without procuring external resources.
Those changes will not just make the process simpler, clearer and easier for businesses; they will make it cheaper too. We are expecting micro and small businesses to save nearly £90 million in compliance costs every year: that is £90 million more for higher investment, faster growth and better jobs. According to figures published in 2021, data-driven trade already generates 85% of our services exports. Our new international transfers regime clarifies how we can build data bridges to support the close, free and safe exchange of data with other trusted allies.
I am delighted to hear the Secretary of State talk about reducing regulatory burdens without compromising the standards that we are none the less delivering—that is the central distinction, and greatly to be welcomed for its benefits for the entrepreneurialism and fleetness of foot of British industry. Does she agree, however, that while the part of the Bill that deals with open data, or smart data, goes further than that and creates fresh opportunities for, in particular, the small challenger businesses of the kind she has described to take on the big incumbents that own the data lakes in many sectors, those possibilities will be greatly reduced if we take our time and move too slowly? Could it not potentially take 18 months to two years for us to start opening up those other sectors of our economy?
I am delighted, in turn, to hear my hon. Friend call me the Secretary of State—I am grateful for the promotion, even if it is not a reality. I know how passionate he feels about open data, which is a subject we have discussed before. As I said earlier, I am pleased that the Under-Secretary of State for Business and Trade is present, because this morning he announced that a new council will be driving forward this work. As my hon. Friend knows, this is not necessarily about legislation being in place—I think the Bill gives him what he wants—but about that sense of momentum, and about onboarding new sectors into this regime and not being slow in doing so. As he says, a great deal of economic benefit can be gained from this, and we do not want it to be delayed any further.
Let me first draw attention to my entry in the Register of Members’ Financial Interests. Let me also apologise for missing the Minister’s opening remarks—I was taken by surprise by the shortness of the preceding statement and had to rush to the Chamber.
May I take the Minister back to the subject of compliance costs? I understand that the projected simplification will result in a reduction in those costs, but does she acknowledge that a new regime, or changes to the current regime, will kick off an enormous retraining exercise for businesses, many of which have already been through that process recently and reached a settled state of understanding of how they should be managing data? Even a modest amount of tinkering instils a sense among British businesses, particularly small businesses, that they must put everyone back through the system, at enormous cost. Unless the Minister is very careful and very clear about the changes being made, she will create a whole new industry for the next two or three years, as every data controller in a small business—often doing this part time alongside their main job—has to be retrained.
We have been very cognisant of that risk in developing our proposals. As I said in my opening remarks, we do not wish to upset the apple cart and create a compliance headache for businesses, which would be entirely contrary to the aims of the Bill. A small business that is currently compliant with the GDPR will continue to be compliant under the new regime. However, we want to give businesses flexibility in regard to how they deliver that compliance, so that, for instance, they do not have to employ a data protection officer.
I am grateful to the Minister for being so generous with her time. May I ask whether the Government intend to maintain data adequacy with the EU? I only ask because I have been contacted by some business owners who are concerned about the possible loss of EU data adequacy and the cost that might be levied on them as a result.
I thank the hon. Gentleman for pressing me on that important point. I know that many businesses are seeking to maintain adequacy. If we want a business-friendly regime, we do not want to create regulatory disruption for businesses, particularly those that trade with Europe and want to ensure that there is a free flow of data. I can reassure him that we have been in constant contact with the European Commission about our proposals. We want to make sure that there are no surprises. We are currently adequate, and we believe that we will maintain adequacy following the enactment of the Bill.
I was concerned to hear from the British Medical Association that if the EU were to conclude that data protection legislation in the UK was inadequate, that would present a significant problem for organisations conducting medical research in the UK. Given that so many amazing medical researchers across the UK currently work in collaboration with EU counterparts, can the Minister assure the House that the Bill will not represent an inadequacy in comparison with EU legislation as it stands?
I hope that my previous reply reassured the hon. Lady that we intend to maintain adequacy, and we do not consider that the Bill will present a risk in that regard. What we are trying to do, particularly in respect of medical research, is make it easier for scientists to innovate and conduct that research without constantly having to return for consent when it is apparent that consent has already been granted for particular medical data processing activities. We think that will help us to maintain our world-leading position as a scientific research powerhouse.
Alongside new data bridges, the Secretary of State will be able to recognise new transfer mechanisms for businesses to protect international transfers. Businesses will still be able to transfer data across borders with the compliance mechanisms that they already use, avoiding needless checks and costs. We are also delighted to be co-hosting, in partnership with the United States, the next workshop of the global cross-border privacy rules forum in London this week. The CBPR system is one of the few existing operational mechanisms that, by design, aims to facilitate data flows on a global scale.
World-class research requires world-class data, but right now many scientists are reluctant to get the data they need to get on with their research, for the simple reason that they do not know how research is defined. They can also be stopped in their tracks if they try to broaden their research or follow a new and potentially interesting avenue. When that happens, they can be required to go back and seek permission all over again, even though they have already gained that permission earlier to use personal data. We do not think that makes sense. The pandemic showed that we cannot risk delaying discoveries that could save lives. Nothing should be holding us back from curing cancer, tackling disease or producing new drugs and treatments. This Bill will simplify the legal requirements around research so that scientists can work to their strengths with legal clarity on what they can and cannot do.
The Bill will also ensure that people benefit from the results of research by unlocking the potential of transformative technologies. Taking artificial intelligence as an example, we have recently published our White Paper: “AI regulation: a pro-innovation approach”. In the meantime, the Bill will ensure that organisations know when they can use responsible automated decision making and that people know when they can request human intervention where those decisions impact their lives, whether that means getting a fair price for the insurance they receive after an accident or a fair chance of getting the job they have always wanted.
I spoke earlier about the currency of trust and how, by maintaining it through high data protection standards, we are likely to see more data sharing, not less. Fundamental to that trust will be confidence in the robustness of the regulator. We already have a world-leading independent regulator in the Information Commissioner’s Office, but the ICO needs to adapt to reflect the greater role that data now plays in our lives alongside its strategic importance to our economic competitiveness. The ICO was set up in the 1980s for a completely different world, and the pace, volume and power of the data we use today have changed dramatically since then.
It is only right that we give the regulator the tools it needs to keep pace and to keep our personal data safe while ensuring that, as an organisation, it remains accountable, flexible and fit for the modern world. The Bill will modernise the structure and objectives of the ICO. Under this legislation, protecting our personal data will remain the ICO’s primary focus, but it will also be asked to focus on how it can empower businesses and organisations to drive growth and innovation across the UK, and support public trust and confidence in the use of personal data.
The Bill is also important for consumers, helping them to share less data while getting more product. It will support smart data schemes that empower consumers and small businesses to make better use of their own data, building on the extraordinary success of open banking tools offered by innovative businesses, which help consumers and businesses to manage their finances and spending, track their carbon footprint and access credit.
The Minister always delivers a very solid message and we all appreciate that. In relation to the high data protection standards that she is outlining, there is also a balance to be achieved when it comes to ensuring that there are no unnecessary barriers for individuals and businesses. Can she assure the House that that will be exactly what happens?
I am always happy to take an intervention from the hon. Member. I want to assure him that we are building high data protection standards that are built on the fundamental principles of the GDPR, and we are trying to get the right balance between high data protection standards that will protect the consumer and giving businesses the flexibility they need. I will continue this conversation with him as the Bill passes through the House.
I thank the Minister for being so generous with her time. With regard to the independent commissioner, the regulator, who will set the terms of reference? Will it be genuinely independent? It seems to me that a lot of power will fall on the shoulders of the Secretary of State, whoever that might be in the not-too-distant future.
The Secretary of State will have greater powers when it comes to some of the statutory codes that the ICO adheres to, but those powers will be brought to this House for its consent. The whole idea is to make the ICO much more democratically accountable. I know that concern about the independence of the regulator has been raised as we have been working up these proposals, but I wish to assure the House that we do not believe those concerns to be justified or legitimate. The Bill actually has the strong support of the current Information Commissioner, John Edwards.
The Bill will also put in place the foundations for data intermediaries, which are organisations that can help us to benefit from our data. In effect, we will be able to share less sensitive data about ourselves with businesses while securing greater benefits. As I say, one of the examples of this is open banking. Another way in which the Bill will help people to take back control of their data is by making it easier and more secure for people to prove things about themselves once, electronically, without having to dig out stacks of physical documents such as passports, bills, statements and birth certificates and then having to provide lots of copies of those documents to different organisations. Digital verification services already exist, but we want consumers to be able to identify trustworthy providers by creating a set of standards around them.
The Bill is designed not just to boost businesses, support scientists and deliver consumer benefits; it also contains measures to keep people healthy and safe. It will improve the way in which the NHS and adult social care organise data to deliver crucial health services. It will let the police get on with their jobs by allowing them to spend more time on the beat rather than on pointless paperwork. We believe that this will save up to 1.5 million hours of police time each year—
I know that my hon. Friend has been passionate on this point, and we are looking actively into her proposals.
We are also updating the outdated system of registering births and deaths based on paper processes from the 19th century.
Data has become absolutely critical for keeping us healthy, for keeping us safe and for growing an economy with innovative businesses, providing jobs for generations to come. Britain is at its best when its businesses and scientists are at theirs. Right now, our rules risk holding them back, but this Bill will change that because it was co-designed with those businesses and scientists and with the help of consumer groups. Simpler, easier, clearer regulation gives the people using data to improve our lives the certainty they need to get on with their jobs. It maintains high standards for protecting people’s privacy while seeking to maintain our adequacy with the EU. Overall, this legislation will make data more useful for more people and more usable by businesses, and it will enable greater innovation by scientists. I commend the Bill to the House.
I am delighted to speak in support of this long-awaited Bill. It is a necessary piece of legislation to learn the lessons from GDPR and look at how we can improve the system, both to make it easier for businesses to work with and to give users and citizens the certainty they need about how their data will be processed and used.
In bringing forward new measures, the Bill in no way suggests that we are looking to move away from our data adequacy agreements with the European Union. Around the world, in North America, Europe, Australia and elsewhere in the Far East, we see Governments looking at developing trusted systems for sharing and using data and for allowing businesses to process data across international borders, knowing that those systems may not be exactly the same but that they work to the same standards and with similar levels of integrity. That is clearly the direction in which the whole world wants to move, and we should play a leading role in it.
I want to talk briefly about an important area of the Bill: getting the balance between data rights and data safety and what the Bill refers to as the “legitimate interest” of a particular business. I should also note that this Bill, while important in its own right, sits alongside other legislation—some of it to be introduced in this Session and some of it already well on its way through the Parliamentary processes—dealing with other aspects of the digital world. The regulation of data is an aspect of digital regulation; it is in some ways the fuel that powers the digital experience and is relevant to other areas of digital life as well.
To take one example, we have already established and implemented the age-appropriate design code for children, which principally addresses the way data is gathered from children online and used to design services and products that they use. As this Bill goes through its parliamentary stages, it is important that we understand how the age-appropriate design code is applied as part of the new data regime, and that the safeguards set out in that code are guaranteed through the Bill as well.
There has been a lot of debate, as has already been mentioned, about companies such as TikTok. There is a concern that engineers who work for TikTok in China, some of whom may be members of the Chinese Communist party, have access to UK user data that, while not stored in China, is accessed from China, and are using that data to develop products. There is legitimate concern about oversight of that process and what that data might be used for, particularly in a country such as China.
However, there is also a broader question about data acquisition, because one reason the TikTok app is being withdrawn from Government devices around the world is that it is incredibly data-acquisitive. It does not just analyse how people use TikTok and from that create data profiles of users to determine what content to recommend to them, although that is a fundamental part of the experience of using it; it also gathers, as other big apps do, data from what people do on other apps on the same device. People may not realise that they have given consent, and it is certainly not informed consent, for companies such as TikTok to access data about what they do on other apps, not just when they are using TikTok.
It is a question of having trusted systems for how data can be gathered, and giving users the right to opt out of such data systems more easily. Some users might say, “I’m quite happy for TikTok or Meta to have that data gathered about what I do across a range of services.” Others may say, “No, I only want them to see data about what I do when I am using their particular service, not other people’s.”
The Online Safety Bill is one of the principal ways in which we are seeking to regulate AI now. There is debate among people in the tech sector; a letter was published recently, co-signed by a number of tech executives, including Elon Musk, calling for a six-month pause in the development of AI systems, particularly large language models. That suggests a problem in the near future of very sophisticated data systems that can make decisions faster than a human can analyse them.
People such as Eric Schmidt have raised concerns about AI in defence systems, where an aggressive system could make decisions faster than a human could respond to them, to which we would need an AI system to respond and where there is potentially no human oversight. That is a frightening scenario in which we might want to consider moratoriums and agreements, as we have in other areas of warfare such as the use of chemical weapons, that we will not allow such systems to be developed because they are so difficult to control.
If we look at the application of that sort of technology closer to home and some of the cases most referenced in the Online Safety Bill, for example the tragic death of the teenager Molly Russell, we see that what was driving the behaviour of concern was data gathered about a user to make recommendations to that person that were endangering their life. The Online Safety Bill seeks to regulate that practice by creating codes and responsibilities for businesses, but that behaviour is only possible because of the collection of data and decisions made by the company on how the data is processed.
This is where the Bill also links to the Government’s White Paper on AI, and this is particularly important: there must be an onus on companies to demonstrate that their systems are safe. The onus must not just be on the user to demonstrate that they have somehow suffered as a consequence of that system’s design. Companies should have to demonstrate that they are designing systems with people’s safety and their rights in mind—be that their rights as a worker and a citizen, or their rights to have certain safeguards and protections over how their data is used.
Companies creating datasets should be able to demonstrate to the regulator what data they have gathered, how models are being trained on that data and what it is being used for. It should be easy for the regulator to see and, if the regulator has concerns up-front, it should be able to raise them with the company. We must try to create that shift, particularly for AI systems, in how systems are tested before they are deployed, with both safety and the principles set out in the legislation in mind.
My hon. Friend makes a strong point about safety being designed in, but a secondary area of concern for many people is discrimination—that is, the more data companies acquire, the greater their ability to discriminate. For example, in an insurance context, we allow companies to discriminate on the basis of experience or behaviour; if someone has had a lot of crashes or speeding fines, we allow discrimination. However, for companies that process large amounts of data and may be making automated decisions, no clear line of acceptability has been drawn. In the future, datasets may come together in ways that allow extreme levels of discrimination. For example, if they linked data science, psychometrics and genetic data, there is the possibility of significant levels of discrimination in society. Does he think that, as well as safety, we should be emphasising that line in the sand?
I thank all Members for their contributions, including the hon. Members for Manchester Central (Lucy Powell), for Glasgow North West (Carol Monaghan), for Bristol North West (Darren Jones), for Cambridge (Daniel Zeichner), for Oxford West and Abingdon (Layla Moran), for Strangford (Jim Shannon) and for Barnsley East (Stephanie Peacock) and my right hon. Friend the Member for Maldon (Sir John Whittingdale) and my hon. Friends the Members for Folkestone and Hythe (Damian Collins), for Loughborough (Jane Hunt) and for Aberconwy (Robin Millar). The debate has been held in the right spirit, understanding the importance of data, and I will try to go through a number of the issues raised.
Adequacy has come up on a number of occasions. We have been clear from the beginning that adequacy is very important, and we work with the European Commission on this; we speak to it on a regular basis. It is important to note, however, that the EU does not require exactly the same rules to be in place for a country to be adequate; we can see that from Japan and from New Zealand. We are therefore trying to get the balance right, making sure that we remain adequate not just with the EU but with other countries with which we want to have data bridges and collaboration. We are also making sure that we can strip back some of the bureaucracy not just for small businesses, but for public services including GPs, schools and similar institutions, as well as protecting the consumer, which must always be central.
Automated decision-making was also raised by a number of Members. The absence of meaningful human intervention in solely automated decisions, along with opacity in how those decisions can be reached, will be mitigated by providing data subjects with the opportunity to make representations about, and ultimately challenge, decisions of this nature that are unexpected or seem unwarranted. For example, if a person is denied a loan or access to a product or services because a solely automated decision-making process has identified a high risk of fraud or irregularities in their finances, that individual should be able to contest that decision and seek human review. If that decision is found to be unwarranted on review, the controller must re-evaluate the case and issue an appropriate decision.
Our reforms are addressing the uncertainty over the applications of safeguards. They will clarify when safeguards apply to ensure that they are available in appropriate circumstances. We will develop that with businesses and other organisations in guidance.
The hon. Member for Glasgow North West talked about joint-working designation notices and it is important to note that the police and intelligence services are working off different data regimes and that can make joint-working more difficult. Many of the changes made in this Bill have come from learning from the Fishmongers’ Hall terrorist incident and the Manchester Arena bombing.
Members raised the question of algorithmic bias. We agree that it is important that organisations are aware of potential biases in data sets and algorithms and bias monitoring and correction can involve the use of personal data. As we set out in our response to the consultation on the Bill, we plan to introduce a statutory instrument that will provide for the monitoring and correction of bias in AI systems by allowing the processing of sensitive personal data for this purpose with appropriate safeguards. However, as we know from the AI White Paper we published recently, this is a changing area so it is important that we remain able to flex in Government in the context of AI and that type of decision-making.
The hon. Member for Bristol North West talked about biometrics. That is classed as sensitive data under the UK GDPR, so is already provided with additional protection. It can only be processed if a relevant condition is met under article 9 or schedule 1 of the Data Protection Act. That requirement provides sufficient safeguards for biometric data. There are significant overlaps in the current oversight framework, which is confusing for the police and the public, and it inhibits innovation. That is why the Bill simplifies the oversight for biometrics and overt surveillance technologies.
The hon. Gentleman talked about age-appropriate guidance. We are committed to protecting children and young people online. The Bill maintains the high standards of data protection that our citizens expect and organisations will still have to abide by our age-appropriate design code. Any breach of our data protection laws will result in enforcement action by the Information Commissioner’s Office.
The hon. Gentleman also talked about data portability. The Bill increases data portability by setting up smart data regulations. He talked about social media, but smart data goes far wider than that. Smart data is the secure sharing of customer data with authorised third parties at the customer’s request. Those third parties can then use that data to provide innovative services for the consumer or business user, utilising AI and data-driven insights to empower customer choice. Services may include clearer account management, easier switching between offers or providers, and advice on how to save money. Open banking is an obvious live example, but the smart data changes in the Bill will turbocharge the use of such data well beyond banking.
My hon. Friend the Member for Loughborough talked about policing. The Bill will save 1.5 million police hours, but it is really important that we do more. We are looking at ways of easing redaction burdens for the police while ensuring that we maintain victim and witness confidence. It is really important to victims and witnesses, and in the interests of public trust, that the police do not share information that is not relevant to a case with other organisations, including the Crown Prosecution Service and the defence. Removing that information, as my hon. Friend says, places a resource burden on officers. We will continue to work with the police and the Home Office on that basis.
On UK-wide data standards, raised by my hon. Friend the Member for Aberconwy, improving access to comparable data and evidence from across the UK is a crucial part of the Government’s work to strengthen the Union. The UK Government and the Office for National Statistics have an ongoing and wide-ranging work programme to increase the coherence of data across the nations, as my hon. Friend is aware. We remain engaged in discussions and will continue to work with him, the Wales Office and the ONS to ensure that that work can continue.
On international data transfers, it is important that we tackle the uncertainties and instabilities in the current regime, but the hon. Member for Strangford is absolutely right that, in doing so, we must maintain public trust in the transfer system.
Finally, on the ICO, we believe that the Bill does not undercut its independence. For the reasons of trust that I have talked about, it is really important that we retain that independence. This is not about Government control over an independent regulator, and it is not about a Government trying to exert influence or pressure for what are deemed to be more favourable outcomes. We are committed to the ICO’s ongoing independence, and that is why we have worked closely with the ICO. The Information Commissioner himself is in favour of the changes we are making and has spoken approvingly about them.
This is a really important Bill, because it will enable greater innovation while maintaining the protections that keep people’s data safe.
Question put and agreed to.
Bill accordingly read a Second time.
Data Protection and Digital Information (No. 2) Bill (Programme)
Motion made, and Question put forthwith (Standing Order No. 83A(7)),
That the following provisions shall apply to the Data Protection and Digital Information (No. 2) Bill:
Committal
(1) The Bill shall be committed to a Public Bill Committee.
Proceedings in Public Bill Committee
(2) Proceedings in the Public Bill Committee shall (so far as not previously concluded) be brought to a conclusion on Tuesday 13 June 2023.
(3) The Public Bill Committee shall have leave to sit twice on the first day on which it meets.
Consideration and Third Reading
(4) Proceedings on Consideration shall (so far as not previously concluded) be brought to a conclusion one hour before the moment of interruption on the day on which those proceedings are commenced.
(5) Proceedings on Third Reading shall (so far as not previously concluded) be brought to a conclusion at the moment of interruption on that day.
(6) Standing Order No. 83B (Programming committees) shall not apply to proceedings on Consideration and Third Reading.—(Joy Morrissey.)
Question agreed to.
Data Protection and Digital Information (No. 2) Bill (Money)
King’s recommendation signified.
Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),
That, for the purposes of any Act resulting from the Data Protection and Digital Information (No. 2) Bill, it is expedient to authorise the payment out of money provided by Parliament of—
(a) any expenditure incurred under or by virtue of the Act by the Secretary of State, the Treasury or a government department, and
(b) any increase attributable to the Act in the sums payable under any other Act out of money so provided.—(Joy Morrissey.)
Question agreed to.
Data Protection and Digital Information (No. 2) Bill (Ways and Means)
Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),
That, for the purposes of any Act resulting from the Data Protection and Digital Information (No. 2) Bill, it is expedient to authorise:
(1) the charging of fees or levies under or by virtue of the Act; and
(2) the payment of sums into the Consolidated Fund.—(Joy Morrissey.)
Question agreed to.
Data Protection and Digital Information (No. 2) Bill (Carry-over)
Motion made, and Question put forthwith (Standing Order No. 80A(1)(a)),
That if, at the conclusion of this Session of Parliament, proceedings on the Data Protection and Digital Information (No. 2) Bill have not been completed, they shall be resumed in the next Session.—(Joy Morrissey.)
Question agreed to.