Asked by: David Pinto-Duschinsky (Labour - Hendon)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what recent estimate he has made with Cabinet colleagues of the number and proportion of antisemitic attacks associated with small forums spreading online hate.
Answered by Feryal Clark - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The Online Safety Act (OSA) establishes Ofcom as the UK's online safety regulator. The OSA places new duties on online platforms to address the risk of their services being used to commit certain priority offences, including the posting of illegal antisemitic content which stirs up hatred.
Ofcom will set out, in codes of practice, steps that different platforms can take to fulfil these duties, and it must consult on the proposed steps. For these consultations, it publishes evidence about in-scope harms. For example, in November 2023 it published research into these matters for its consultation on its proposals for the OSA 'illegal content duties'.
https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/protecting-people-from-illegal-content-online
Asked by: David Pinto-Duschinsky (Labour - Hendon)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what assessment he has made of the potential merits of using powers under Schedule 11 of the Online Safety Act to extend category 1 regulation to online forums to help tackle antisemitic hate speech online.
Answered by Feryal Clark - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The Secretary of State for Science, Innovation and Technology will make Regulations pertaining to Schedule 11 of the Online Safety Act as soon as reasonably practicable.
Under the Act, all user-to-user services – including online forums – will be required to proactively tackle illegal hate speech, such as illegal antisemitic abuse. If such a service is likely to be accessed by children, it will also be required to protect children from encountering specific types of legal but harmful content. This includes legal content which is abusive or incites hate on the basis of race or religion.