Asked by: Victoria Collins (Liberal Democrat - Harpenden and Berkhamsted)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what transparency conditions are currently required when government departments procure AI systems from private companies; and what mechanisms are in place to ensure public sector bodies can explain AI-driven decisions to citizens when the underlying models are proprietary.
Answered by Ian Murray - Minister of State (Department for Science, Innovation and Technology)
Since February 2024, all government departments and arm’s-length bodies must comply with the Algorithmic Transparency Recording Standard (ATRS), which mandates publishing details on algorithmic tools, including decision-making processes, human oversight, technical specifications, and risk assessments. Suppliers are required to provide sufficient information for transparency records, with exemptions balancing commercial sensitivities. Over 36 ATRS records have been published to date.
The AI Knowledge Hub further enhances transparency by sharing open-source code, problem statements, and performance metrics.
Additionally, the Open Source AI Fellowship promotes explainability through publicly inspectable models. These measures enable government to explain AI-driven decisions while maintaining accountability.
Asked by: Victoria Collins (Liberal Democrat - Harpenden and Berkhamsted)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what steps her Department is taking to establish standardised testing frameworks for identifying bias in AI datasets; and whether she will consider introducing requirements for the quality of databases used to train artificial intelligence systems.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
AI is already regulated in the UK. A range of existing rules already apply to AI systems, such as data protection, competition, equality legislation and sectoral regulation. The government is committed to supporting regulators to promote the responsible use of AI in their sectors, including identifying and addressing bias.
To help tackle this issue, we ran the Fairness Innovation Challenge (FIC) with Innovate UK, the Equality and Human Rights Commission (EHRC), and the ICO. FIC supported the development of novel solutions to address bias and discrimination in AI systems and supported the EHRC and ICO to shape their own broader regulatory guidance.
The government is committed to ensuring that the UK is prepared for the changes AI will bring.
Asked by: Victoria Collins (Liberal Democrat - Harpenden and Berkhamsted)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what steps her Department is taking to ensure that safety-by-design principles are integrated into AI systems from inception rather than as retrospective additions, especially given the persistence of harmful online content, including deepfake CSAM, visible across the internet.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The government is committed to tackling the atrocious harm of child sexual exploitation and abuse (CSEA). Making, distributing or possessing child sexual abuse material (CSAM) is a serious criminal offence, and the Online Safety Act requires services to proactively identify and remove such content.
The Act requires in-scope services, including AI services, to take a safety by design approach to tackling these harms. Ofcom has set out safety measures, including requiring risky services to use technology to detect known images and scan for links to such content. There are also measures to tackle online grooming.
We are taking further action in the Crime and Policing Bill to criminalise AI models that have been optimised to create CSAM, and to create a new legal defence which will allow designated experts (such as AI developers and third sector organisations) to stringently test whether AI systems can generate CSAM and to develop safeguards to prevent it.
The government remains committed to taking further steps, if required, to ensure that the UK is prepared for the changes that AI will bring.
Asked by: Victoria Collins (Liberal Democrat - Harpenden and Berkhamsted)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what assessment her Department has made of the benefits of (a) a duty of candour requiring AI developers and deployers to publicly disclose when biases are discovered in their algorithms or training data and (b) providing clear mitigation strategies, similar to disclosure requirements in other regulated sectors such as medicines.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
A range of existing rules already apply to AI systems such as data protection, competition, equality legislation and sectoral regulation. The government is also committed to supporting regulators to promote the responsible use of AI in their sectors and mitigate AI-related challenges, such as identifying and addressing algorithmic bias.
To help tackle this issue, we ran the Fairness Innovation Challenge (FIC) with Innovate UK, the Equality and Human Rights Commission (EHRC), and the ICO. FIC supported the development of novel solutions to address bias and discrimination in AI systems and supported the EHRC and ICO to shape their own broader regulatory guidance.
This is complemented by the work of the AI Security Institute (AISI), which works in close collaboration with AI companies to assess model safeguards and suggest mitigations for risks pertaining to national security.
To date, AISI has tested over 30 models from leading AI companies, including OpenAI, Google DeepMind and Anthropic.
The government is committed to ensuring that the UK is prepared for the changes AI will bring and AISI’s research will continue to inform our approach.
Asked by: Victoria Collins (Liberal Democrat - Harpenden and Berkhamsted)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what assessment her Department has made of the potential impact of unspecified price increases in fixed-term telecoms contracts on consumers; and whether her Department has had discussions with Ofcom about reviewing the regulation of such increases.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
It is imperative that people feel empowered when interacting with the telecoms market and that they can be confident they are getting a fair deal.
The Secretary of State wrote to Ofcom’s CEO on 31 October to seek Ofcom’s assessment of existing consumer protections and to explore what could be done further and faster on transparent and fair pricing. The Secretary of State has also met with consumer advocate Martin Lewis of MoneySavingExpert, to discuss issues raised in the letter and ideas to further strengthen protections for ordinary people.
Asked by: Victoria Collins (Liberal Democrat - Harpenden and Berkhamsted)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what recent discussions her Department has had with (a) telecoms companies and (b) consumer groups on unspecified discretionary price rises in consumer telecoms contracts.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
It is imperative that people feel empowered when interacting with the telecoms market and that they can be confident they are getting a fair deal.
The Secretary of State wrote to Ofcom’s CEO on 31 October to seek Ofcom’s assessment of existing consumer protections and to explore what could be done further and faster on transparent and fair pricing. The Secretary of State has also met with consumer advocate Martin Lewis of MoneySavingExpert, to discuss issues raised in the letter and ideas to further strengthen protections for ordinary people.
Asked by: Victoria Collins (Liberal Democrat - Harpenden and Berkhamsted)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what assessment her Department has made of the potential merits of geographical indication protections for regionally significant natural stones.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
I am responding as minister with responsibility for intellectual property. Geographical indications for craft and industrial products, e.g. natural stones, can be protected in the UK via specialised collective and certification trade marks. Collective and certification trade marks can be applied for via the Intellectual Property Office and are accompanied by regulations that set out the conditions of use of the trade mark. This can include that the goods or services covered by the mark have a specific geographical origin. As trade marks are private rights, it is for potential applicants to decide whether to seek such trade mark protection.
Asked by: Victoria Collins (Liberal Democrat - Harpenden and Berkhamsted)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what steps her Department is taking to help increase the level of funding for the pharmaceutical and life sciences sector for clinical trials to (a) optimise existing treatments and (b) support innovation in repurposed drugs for paediatric brain cancer.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The Department for Science, Innovation and Technology invests approximately £200 million into cancer research annually via UK Research and Innovation. In parallel, the Department of Health and Social Care funds cancer research via the National Institute for Health and Care Research, investing £133 million in 2023/24.
The government is supporting commercial clinical research through the Commercial Research Delivery Networks as part of the Voluntary Scheme for Branded Medicines Pricing, Access and Growth Investment Programme. Government investment and infrastructure can be used to optimise existing treatments and support innovation in drug repurposing. The forthcoming National Cancer Plan will also detail plans for improving care across all cancer types, including paediatric brain cancers.
Asked by: Victoria Collins (Liberal Democrat - Harpenden and Berkhamsted)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, whether Ofcom plans to (a) measure and (b) report on the effectiveness of the Online Safety Act 2023 for tackling online child grooming.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
Monitoring and evaluation are key to understanding how effective the online safety regime is. The Government and Ofcom are actively monitoring the regime’s impact through a programme of evaluation work.
This work will track the effect of the online safety regime over time and feed into a statutory Post Implementation Review of the Online Safety Act. The review will assess the performance of the legislation against its primary objectives, including how the online safety regime has protected children online.
Asked by: Victoria Collins (Liberal Democrat - Harpenden and Berkhamsted)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, whether her Department has had recent discussions with representatives of the creative industries on a potential copyright and AI framework.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The Government will continue to engage extensively with stakeholders on copyright and AI. This includes establishing a stakeholder working group to inform the development of copyright and AI policy.
This work commenced over the summer, when three initial meetings were convened with representatives of the creative, media and AI sectors by the Secretaries of State for the Department for Culture, Media and Sport and the Department for Science, Innovation and Technology.
Information relating to the stakeholder working group will be published on Gov.uk, which will include further details and a list of working group members.