Written Question
Pupils: Databases
Tuesday 23rd April 2024

Asked by: Lord Bishop of Oxford (Bishops - Bishops)

Question to the Department for Education:

To ask His Majesty's Government whether they give third parties access to national pupil data or learner records, and whether they charge a fee for any such access.

Answered by Baroness Barran - Parliamentary Under-Secretary (Department for Education)

The department will only share pupil- or learner-level data with others where it is lawful, secure and ethical to do so. Where these conditions are met and data is shared, the department does not charge any fee.

All requests for data from the department are subject to a robust approvals process, known as the DfE Data Sharing Approval Panel (DSAP), in which senior data experts assess each application against public benefit, proportionality, legal underpinning and strict information security standards. The panel also includes external members who scrutinise ongoing decision making in order to increase public trust.

As part of the department’s commitment to transparency, it publishes details of all organisations it has shared personal data with, alongside a short description of each project. This publication is updated quarterly and is available from GOV.UK at the following link: https://www.gov.uk/government/publications/dfe-external-data-shares.


Written Question
Education: Artificial Intelligence
Tuesday 23rd April 2024

Asked by: Lord Bishop of Oxford (Bishops - Bishops)

Question to the Department for Education:

To ask His Majesty's Government what assessment they have made of the future use of artificial intelligence in education using national pupil data or learner records.

Answered by Baroness Barran - Parliamentary Under-Secretary (Department for Education)

The department has conducted research and has a work programme around artificial intelligence in education settings. To date, the department has not used national pupil data or learner records in setting the strategy for its work in this area.


Written Question
Centre for Data Ethics and Innovation
Tuesday 17th October 2023

Asked by: Lord Bishop of Oxford (Bishops - Bishops)

Question to the Department for Science, Innovation & Technology:

To ask His Majesty's Government what is the current status of the Advisory Board of the Centre for Data Ethics and Innovation.

Answered by Viscount Camrose - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)

The Centre for Data Ethics and Innovation (CDEI) Advisory Board was appointed on a fixed term basis, with terms ending in September 2023. As CDEI’s work evolves to keep pace with developments in data and AI, the CDEI will engage with a broader pool of expertise from across the Department for Science, Innovation and Technology (DSIT). CDEI will continue its work to enable trustworthy innovation using data and AI as part of DSIT, including developing tools, guidance and standards to help public and private sector organisations to use AI and data in a way that builds public trust.


Written Question
Centre for Data Ethics and Innovation
Tuesday 17th October 2023

Asked by: Lord Bishop of Oxford (Bishops - Bishops)

Question to the Department for Science, Innovation & Technology:

To ask His Majesty's Government what future plans there are for the Centre for Data Ethics and Innovation.

Answered by Viscount Camrose - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)

The Centre for Data Ethics and Innovation (CDEI) Advisory Board was appointed on a fixed term basis, with terms ending in September 2023. As CDEI’s work evolves to keep pace with developments in data and AI, the CDEI will engage with a broader pool of expertise from across the Department for Science, Innovation and Technology (DSIT). CDEI will continue its work to enable trustworthy innovation using data and AI as part of DSIT, including developing tools, guidance and standards to help public and private sector organisations to use AI and data in a way that builds public trust.


Written Question
Energy: Meters
Wednesday 26th October 2022

Asked by: Lord Bishop of Oxford (Bishops - Bishops)

Question to the Department for Business, Energy and Industrial Strategy:

To ask His Majesty's Government what steps they are taking to protect customers with prepayment energy meters this winter.

Answered by Lord Callanan - Parliamentary Under Secretary of State (Department for Energy Security and Net Zero)

Many customers choose prepayment meters to help them budget and avoid going into debt. Ofgem rules require energy suppliers to offer emergency and additional support credit or alternative short-term support to help prepayment meter customers stay on supply.

The Energy Price Guarantee will ensure that a typical household will pay on average £2,500 a year on their energy bill for the next two years from 1 October 2022. This includes prepayment customers, and will save a typical household £1,000 a year based on current energy prices.


Written Question
Energy: Meters
Wednesday 26th October 2022

Asked by: Lord Bishop of Oxford (Bishops - Bishops)

Question to the Department for Business, Energy and Industrial Strategy:

To ask His Majesty's Government what steps they are taking to reduce the number of new prepayment energy meters installed this upcoming winter.

Answered by Lord Callanan - Parliamentary Under Secretary of State (Department for Energy Security and Net Zero)

Many customers prefer prepayment meters to help them budget.

Ofgem’s Licence Conditions require suppliers to consider all options for appropriate debt management. This can include installing a prepayment meter, but suppliers must consider whether this is safe and practicable, including whether a prepayment meter is appropriate for the specific customer. Ofgem rules permit the forced fitting of a prepayment meter to repay debt only as a last resort.


Written Question
Energy: Meters
Wednesday 26th October 2022

Asked by: Lord Bishop of Oxford (Bishops - Bishops)

Question to the Department for Business, Energy and Industrial Strategy:

To ask His Majesty's Government what estimate they have made of the number of (1) forced prepay energy meter installs, and (2) forced prepay energy meter switches, that will take place this winter.

Answered by Lord Callanan - Parliamentary Under Secretary of State (Department for Energy Security and Net Zero)

The Government does not estimate the number of prepayment meters installed or switched. The energy regulator, Ofgem, has reported that the number of prepayment meters installed for debt under warrant in 2021 was 49,552.

Ofgem rules permit the forced fitting of a prepayment meter to repay debt only as a last resort.


Written Question
Autonomous Weapons: Treaties
Thursday 4th August 2022

Asked by: Lord Bishop of Oxford (Bishops - Bishops)

Question to the Ministry of Defence:

To ask Her Majesty's Government, further to their policy paper Ambitious, Safe, Responsible: Our approach to the delivery of AI enabled capability in Defence, published on 15 June, which states that weapons that identify, select and attack targets without context-appropriate human involvement "are not acceptable", whether they will be supporting the negotiation of a legally binding international instrument that both (1) prohibits autonomous weapons that identify, select and attack targets without context-appropriate human involvement, and (2) regulates other autonomous weapons systems to ensure meaningful human control over the use of force.

Answered by Baroness Goldie

The UK does not support calls for further legally binding rules that would prohibit autonomous weapons which identify, select and attack targets without context-appropriate human involvement and regulate other autonomous systems. International Humanitarian Law already provides a robust, principle-based framework for regulating the development and use of all weapons systems, including weapons that contain autonomous functions.

Without international consensus on the definitions or characteristics of weapons with levels of autonomy, a legal instrument would have to ban undefined systems, which would present difficulties in the application of any such ban and which could severely impact legitimate research and development of AI or autonomous technologies.


Written Question
Autonomous Weapons
Thursday 4th August 2022

Asked by: Lord Bishop of Oxford (Bishops - Bishops)

Question to the Ministry of Defence:

To ask Her Majesty's Government, further to their policy paper Ambitious, Safe, Responsible: Our approach to the delivery of AI enabled capability in Defence, published on 15 June, which says that "We do not rule out incorporating AI within weapon systems" and that real-time human supervision of such systems "may act as an unnecessary and inappropriate constraint on operational performance", when this would be seen as a constraint; and whether they can provide assurance that the UK's weapon systems will remain under human supervision at the point when any decision to take a human life is made.

Answered by Baroness Goldie

The 'Ambitious, Safe, Responsible' policy sets out that the Ministry of Defence opposes the creation and use of AI-enabled weapon systems which operate without meaningful and context-appropriate human involvement throughout their lifecycle. This involvement could take the form of real-time human supervision, or of control exercised through the setting of a system's operational parameters.

We believe that human-machine teaming delivers the best outcomes in terms of overall effectiveness. However, in certain cases it may be appropriate to exert rigorous human control over AI-enabled systems through a range of safeguards, processes and technical controls, without always requiring some form of real-time human supervision. For example, in the context of defending a maritime platform against hypersonic weapons, defensive systems may need to be able to detect incoming threats and open fire faster than a human could react.

In all cases, human responsibility for the use of AI must be clearly established, and that responsibility underpinned by a clear and consistent articulation of the means by which human control is exercised across the system lifecycle, including the nature and limitations of that control.


Written Question
Autonomous Weapons: Ethics
Thursday 4th August 2022

Asked by: Lord Bishop of Oxford (Bishops - Bishops)

Question to the Ministry of Defence:

To ask Her Majesty's Government, further to their policy paper Ambitious, Safe, Responsible: Our approach to the delivery of AI enabled capability in Defence, published on 15 June, what assessment they have made of the specific ethical problems raised by autonomous weapons that are used to target humans and which have been raised by the International Committee of the Red Cross.

Answered by Baroness Goldie

We are very aware of the ethical concerns raised by numerous stakeholders, including the ICRC, around the potential misuse of AI in Defence, including its impact on humans and the potential use of autonomous systems in ways which might violate international law. We published 'Ambitious, Safe, Responsible' specifically in order to provide clarity and support ongoing conversations around the UK approach.

With respect to autonomous weapons systems, the UK's focus is on setting clear international norms for the safe and responsible development and use of AI, to ensure compliance with International Humanitarian Law through meaningful and context-appropriate levels of human control. We propose the development of a compendium of good practice, mapped against a weapon system's lifecycle, which would provide a clear framework for the operationalisation of the eleven guiding principles agreed by the UN Group of Governmental Experts on Certain Conventional Weapons in 2017-19.

We are keen to continue extensive discussions on this issue with the international community and NGOs, including through discussions at the UN.