Asked by: David Davis (Conservative - Goole and Pocklington)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, with reference to the report entitled Securing meaningful transparency of public sector use of AI: Comparative approaches across five jurisdictions, published by the Public Law Project in October 2024, what assessment he has made of the potential merits of introducing a requirement on public bodies, when a decision has been taken about an individual that was (a) made and (b) supported by (i) AI, (ii) an algorithmic and (iii) automated tool, to proactively provide an explanation of (A) how and (B) why the decision was reached.
Answered by Feryal Clark - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
Central government departments and arm’s-length bodies (ALBs) have been working to draft Algorithmic Transparency Recording Standard (ATRS) records since doing so became mandatory earlier this year. Publication plans were disrupted by the general election, but multiple records are expected to be published soon.
Since the introduction of a mandatory requirement for use of the ATRS in cross-government policy, we have seen a significant acceleration in progress towards adopting it, which will soon be reflected in published records. As such, we do not believe that legislation is necessary at this time. We will continue to explore further options for encouraging and enforcing use of the ATRS, and to consider whether the policy should be extended beyond central government.
In the UK’s data protection framework, Article 22 of the UK GDPR sets out the rules relating to solely automated decisions that have legal or similarly significant effects on individuals. Under these circumstances, individuals are entitled to specific safeguards, including being notified of the decisions, being provided with information about the solely automated decision making that has been carried out, and the right to contest those decisions and to obtain human intervention.
These specific safeguards for solely automated decision making complement the wider data protection framework’s existing data subject rights, including the rights to transparency, objection and access. Organisations must also continue to observe the data protection principles to ensure personal data is processed lawfully, fairly and transparently. These rules apply to all organisations, including public bodies.
Asked by: David Davis (Conservative - Goole and Pocklington)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, with reference to the report entitled Securing meaningful transparency of public sector use of AI: Comparative approaches across five jurisdictions, published by the Public Law Project in October 2024, whether he has made an assessment of the potential merits of introducing a requirement on public bodies to notify individuals when a decision has been taken about them that was (a) made and (b) supported by (i) AI, (ii) an algorithmic and (iii) automated tool.
Answered by Feryal Clark - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
Central government departments and arm’s-length bodies (ALBs) have been working to draft Algorithmic Transparency Recording Standard (ATRS) records since doing so became mandatory earlier this year. Publication plans were disrupted by the general election, but multiple records are expected to be published soon.
Since the introduction of a mandatory requirement for use of the ATRS in cross-government policy, we have seen a significant acceleration in progress towards adopting it, which will soon be reflected in published records. As such, we do not believe that legislation is necessary at this time. We will continue to explore further options for encouraging and enforcing use of the ATRS, and to consider whether the policy should be extended beyond central government.
In the UK’s data protection framework, Article 22 of the UK GDPR sets out the rules relating to solely automated decisions that have legal or similarly significant effects on individuals. Under these circumstances, individuals are entitled to specific safeguards, including being notified of the decisions, being provided with information about the solely automated decision making that has been carried out, and the right to contest those decisions and to obtain human intervention.
These specific safeguards for solely automated decision making complement the wider data protection framework’s existing data subject rights, including the rights to transparency, objection and access. Organisations must also continue to observe the data protection principles to ensure personal data is processed lawfully, fairly and transparently. These rules apply to all organisations, including public bodies.
Asked by: David Davis (Conservative - Goole and Pocklington)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, with reference to the report entitled Securing meaningful transparency of public sector use of AI: Comparative approaches across five jurisdictions, published by the Public Law Project in October 2024, whether he has made an assessment of the potential merits of putting public sector compliance with the Algorithmic Transparency Recording Standard on a statutory footing.
Answered by Feryal Clark - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
Central government departments and arm’s-length bodies (ALBs) have been working to draft Algorithmic Transparency Recording Standard (ATRS) records since doing so became mandatory earlier this year. Publication plans were disrupted by the general election, but multiple records are expected to be published soon.
Since the introduction of a mandatory requirement for use of the ATRS in cross-government policy, we have seen a significant acceleration in progress towards adopting it, which will soon be reflected in published records. As such, we do not believe that legislation is necessary at this time. We will continue to explore further options for encouraging and enforcing use of the ATRS, and to consider whether the policy should be extended beyond central government.
In the UK’s data protection framework, Article 22 of the UK GDPR sets out the rules relating to solely automated decisions that have legal or similarly significant effects on individuals. Under these circumstances, individuals are entitled to specific safeguards, including being notified of the decisions, being provided with information about the solely automated decision making that has been carried out, and the right to contest those decisions and to obtain human intervention.
These specific safeguards for solely automated decision making complement the wider data protection framework’s existing data subject rights, including the rights to transparency, objection and access. Organisations must also continue to observe the data protection principles to ensure personal data is processed lawfully, fairly and transparently. These rules apply to all organisations, including public bodies.
Asked by: David Davis (Conservative - Goole and Pocklington)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, with reference to the Answer of 14 May 2024 to Question 24976 on Artificial Intelligence: Government Departments, what recent estimate he has made of when phase one Departments will publish their first Algorithmic Transparency Recording Standard (ATRS) records on the ATRS hub.
Answered by Feryal Clark - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
Central government departments and arm’s-length bodies (ALBs) have been working to draft Algorithmic Transparency Recording Standard (ATRS) records since doing so became mandatory earlier this year. Publication plans were disrupted by the general election, but multiple records are expected to be published soon.
Since the introduction of a mandatory requirement for use of the ATRS in cross-government policy, we have seen a significant acceleration in progress towards adopting it, which will soon be reflected in published records. As such, we do not believe that legislation is necessary at this time. We will continue to explore further options for encouraging and enforcing use of the ATRS, and to consider whether the policy should be extended beyond central government.
In the UK’s data protection framework, Article 22 of the UK GDPR sets out the rules relating to solely automated decisions that have legal or similarly significant effects on individuals. Under these circumstances, individuals are entitled to specific safeguards, including being notified of the decisions, being provided with information about the solely automated decision making that has been carried out, and the right to contest those decisions and to obtain human intervention.
These specific safeguards for solely automated decision making complement the wider data protection framework’s existing data subject rights, including the rights to transparency, objection and access. Organisations must also continue to observe the data protection principles to ensure personal data is processed lawfully, fairly and transparently. These rules apply to all organisations, including public bodies.
Asked by: David Davis (Conservative - Goole and Pocklington)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what assessment she has made of the (a) adequacy of the work of the Counter Disinformation Unit and (b) impact of that work on freedom of speech.
Answered by John Whittingdale - Shadow Minister (Health and Social Care)
The Counter Disinformation Unit (CDU), now called the National Security Online Information Team (NSOIT), is focused exclusively on risks to national security and public safety.
Preserving freedom of expression is an extremely important principle underpinning the team’s work. The Government believes that people must be allowed to discuss and debate issues freely.
The NSOIT does not monitor the social media accounts of individuals and does not take any action that could impact anyone’s ability to discuss and debate issues freely. When the NSOIT identifies content which falls within one of the areas of focus ministers have agreed, is assessed to pose a risk to national security or public safety, and is assessed to breach the terms and conditions of the relevant platform, it may share that content with the platform. No action is mandated by the Government; it is entirely up to the platform to determine whether or not to take any action in line with its terms of service. Under no circumstances is content from Parliamentarians or journalists ever referred to platforms. Ministers continue to keep the work of the NSOIT, and its approach to sharing any content with platforms, under review.
Asked by: David Davis (Conservative - Goole and Pocklington)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, whether online content flagged for removal by the Counter Disinformation Unit can include lawful expression.
Answered by Paul Scully
Preserving freedom of expression is an extremely important principle underpinning the Counter Disinformation Unit’s (CDU) work. The CDU does not monitor political debate, nor does it refer any content from journalists, politicians or political parties to social media platforms.
The CDU works closely with the major social media platforms to understand their terms of service and to encourage them to promote authoritative sources of information.
Where the unit encounters content which poses a demonstrable risk to public health, safety or national security and is assessed to breach the platform’s terms of service, it may be referred to the platform concerned for consideration.
No action is mandated by the Government, and it is up to the platform to decide independently whether or not to take any action in line with its terms of service.
A fact sheet providing further information on the work of the CDU can be found here.
Asked by: David Davis (Conservative - Goole and Pocklington)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what assessment she has made of the potential impact of her Department's counter-disinformation unit on freedom of expression.
Answered by Paul Scully
The Counter Disinformation Unit (CDU) leads HMG’s operational and policy response to understand and counter disinformation and attempts to manipulate the information environment that have the potential to impact domestic audiences. In addition to the Russian invasion of Ukraine and COVID-19, the CDU has considered disinformation relating to key national events such as Operation London Bridge and elections.
Freedom of expression and freedom of the media are essential to any functioning democracy; people must be allowed to discuss and debate issues freely. The CDU’s work is consistent with the government’s principles and values on protecting freedom of expression and promoting a free, open, and secure internet.
Asked by: David Davis (Conservative - Goole and Pocklington)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what policy areas the Counter Disinformation Unit focuses on in addition to Covid-19 and Ukraine.
Answered by Paul Scully
The Counter Disinformation Unit (CDU) leads HMG’s operational and policy response to understand and counter disinformation and attempts to manipulate the information environment that have the potential to impact domestic audiences. In addition to the Russian invasion of Ukraine and COVID-19, the CDU has considered disinformation relating to key national events such as Operation London Bridge and elections.
Freedom of expression and freedom of the media are essential to any functioning democracy; people must be allowed to discuss and debate issues freely. The CDU’s work is consistent with the government’s principles and values on protecting freedom of expression and promoting a free, open, and secure internet.