Asked by: Anneliese Dodds (Labour (Co-op) - Oxford East)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what steps she is taking to monitor the quality of the deployment of British Sign Language AI across public services; and whether Deaf people have been consulted on that deployment.
Answered by Ian Murray - Minister of State (Department for Science, Innovation and Technology)
85294: We are not aware of any digital public services currently using AI-generated BSL content. The Service Manual and Service Standard guide service teams across the public sector on the design and development of digital services, including those enabled by AI.
A service must be accessible to everyone who needs it, including services only used by public servants. Digital services must meet level AA of the Web Content Accessibility Guidelines (WCAG 2.2) as a minimum and service teams must include disabled people and people who use assistive technologies in the design of those services.
The compliance of central government digital services with the WCAG regulations is monitored by the Government Digital Service.
85295: In addition to the above (85294), services must make sure the non-digital parts of a service are accessible. For example, government departments must make sure that users who are deaf or have a speech impairment are offered a way of contacting the service (by text, email or in person with a British Sign Language translator or lip reader).
This standard would still apply if the service used BSL content that was AI-generated.
Asked by: Anneliese Dodds (Labour (Co-op) - Oxford East)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, whether British Sign Language AI procurement is subject to algorithmic impact assessments.
Answered by Ian Murray - Minister of State (Department for Science, Innovation and Technology)
We are not aware of any cross-Government British Sign Language AI procurement.
The government has committed to ensure that algorithmic tools used in the public sector are used safely and transparently and is taking active steps to ensure this. The Algorithmic Transparency Recording Standard is mandatory for all government departments. It communicates information about how and why algorithmic tools are used, who is responsible for them, how they are embedded in broader decision-making processes, their technical specifications, and relevant risk mitigations and impact assessments.
The Data Ethics Framework guides appropriate and responsible data use in government and the wider public sector. It helps public servants understand ethical considerations, address these within their projects, and encourages responsible innovation.
Additionally, the Service Manual and Service Standard guide service teams across the public sector on the design and development of digital services, including those enabled by AI.
A service must be accessible to everyone who needs it, including services only used by public servants. Digital services must meet level AA of the Web Content Accessibility Guidelines (WCAG 2.2) as a minimum and service teams must include disabled people and people who use assistive technologies in the design of those services. WCAG 2.2 addresses the needs of people who are deaf or hard of hearing primarily through guidelines for multimedia, such as providing captions, transcripts, and sign language interpretations.
The compliance of central government digital services with the WCAG regulations is monitored by the Government Digital Service.
Asked by: Anneliese Dodds (Labour (Co-op) - Oxford East)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what British Sign Language (BSL) standards are being used in (a) government and (b) public services in the commissioning of BSL AI.
Answered by Ian Murray - Minister of State (Department for Science, Innovation and Technology)
Providing BSL translations of pre-recorded audio and video content is a WCAG 2.2 Level AAA criterion. As outlined in the Government Service Standard, all digital government services must, as a minimum, meet Level AA. Level AAA represents best practice.
Current best practice guidance for use of BSL in digital public services advises that BSL videos are independently assured by a Deaf-led BSL supplier.
We are not aware of any digital public services currently using AI-generated BSL content. No specific accessibility standards for this use case of AI are currently applied; any such use would be guided by both the government’s Data Ethics Framework and the Service Standard.
Asked by: Anneliese Dodds (Labour (Co-op) - Oxford East)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, whether she is taking steps with Cabinet colleagues to provide oversight of the monitoring of the (a) quality of, (b) adequacy of engagement with deaf people and (c) other aspects of the deployment of British Sign Language AI systems in public services.
Answered by Ian Murray - Minister of State (Department for Science, Innovation and Technology)
The Government Digital Service sets and assesses the cross-government digital Service Standard. Before going live, services are assessed against this 14-point standard, which includes the service team providing evidence of how the service is accessible to everyone who needs it.
To meet the standard and assessment, digital services must conduct research with disabled people, including Deaf users and, where appropriate to the service provision, those who use sign language or a sign language interpreter to interact with the service.
Services must make sure any BSL video is culturally appropriate by working with the BSL community, testing it, or getting feedback.
Asked by: Anneliese Dodds (Labour (Co-op) - Oxford East)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what responsibility her Department has for ensuring media literacy.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
DSIT is committed to making the internet safer by ensuring platforms limit harmful content under the Online Safety Act and equipping people with the skills to navigate the online world.
As the lead department for media literacy, DSIT is committed to improving media literacy through coordinated cross-government work, funding innovative community-based interventions, launching an awareness campaign to build digital resilience, and integrating media literacy with digital skills to meet evolving online challenges.
DSIT supports Ofcom’s updated media literacy duties and leads the relationship with Ofcom, ensuring strategic alignment and promoting best practice across sectors.
Asked by: Anneliese Dodds (Labour (Co-op) - Oxford East)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what assessment she has made of the role of media literacy in supporting (a) public health, (b) national security and (c) democracy.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
Media literacy enables citizens to critically assess information and make informed choices. It supports public health, national security and democracy by countering misinformation, improving society’s resilience to online threats, and empowering safe, confident participation online.
Media literacy is a cross-government priority, delivered through coordinated action across departments, civil society and industry, supported by targeted funding and community-led initiatives.
The Online Safety Act requires social media platforms to tackle illegal content relating to national security, health and democracy. It also updates Ofcom’s statutory duty to promote media literacy, which includes raising awareness of misinformation and helping users assess the reliability of content.
Asked by: Anneliese Dodds (Labour (Co-op) - Oxford East)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, if he will hold discussions with Google on its compliance with the Frontier AI Safety Commitments made at the AI Seoul Summit 2024, published on 21 May 2024.
Answered by Feryal Clark
We expect all signatories to the Seoul commitments to stand by their agreements. The AI Security Institute, within DSIT, has ongoing discussions with all major developers, including Google DeepMind, about the implementation of frontier AI frameworks that guide the safe development of AI.
The government welcomes Google's recently published framework, which prioritises the emerging risk of deception in AI models, and its plans to publish safety cases.
Asked by: Anneliese Dodds (Labour (Co-op) - Oxford East)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, if he will make an estimate of the number and proportion of (a) children and (b) adults who access the internet through a virtual private network.
Answered by Feryal Clark
The Government does not hold this information. However, Ofcom’s Technology Tracker (2024) indicates that 30% of the UK’s population over 16 years old has connected to the internet using a virtual private network (VPN) for work, education or other purposes. This increases to 38% for 16-17 year olds. This data does not demonstrate how regularly respondents use VPNs to access the internet.
Asked by: Anneliese Dodds (Labour (Co-op) - Oxford East)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, whether she plans to bring forward a code of practice regarding violence against women and girls online.
Answered by Saqib Bhatti - Shadow Minister (Education)
The Online Safety Act (OSA) gives providers of online user-to-user services and search services new safety duties. They will need to take steps to tackle illegal content and protect children. The major social media platforms – known as ‘Category 1 services’ in the Act – will also be required to take steps to enforce their terms of service and offer user empowerment tools. As the regulator for the OSA, Ofcom will set out steps providers can take to meet their different duties in codes of practice and guidance. This will include steps for content which disproportionately affects women and girls.
Ofcom will also produce guidance summarising all the measures it has recommended in its different codes of practice and guidance that will protect women and girls. This guidance will ensure it is easy for platforms to implement holistic and effective protections for women and girls, across their various OSA duties.
Asked by: Anneliese Dodds (Labour (Co-op) - Oxford East)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, whether she will take steps to (a) publish an equality impact assessment of the Brexit Freedoms Bill in accordance with the public sector equality duty or (b) demonstrate due regard to equality in another way.
Answered by Nusrat Ghani
The equality impact assessment for the Retained EU Law (Revocation and Reform) Bill was published on 22 September 2022. This is available on the Bill page on www.parliament.uk via the following weblink: https://publications.parliament.uk/pa/bills/cbill/58-03/0156/REUL_Bill_Impact_Assessment_22-09-2022.pdf.