Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, if he will take steps to prevent social media companies adjusting network algorithms to promote actions by users (a) favouring political (i) candidates and (ii) outcomes and (b) who are potential national security threats.
The Online Safety Act gives platforms duties to tackle illegal content. In its codes of practice, the regulator, Ofcom, has outlined steps providers can take to fulfil these duties, including recommended measures to stop illegal foreign interference and terrorism content from being promoted via algorithms. These duties should be in effect by spring 2025.
The Act will also require all services to have clear, accessible Terms of Service (ToS) and will require Category 1 services to state what legal content for adults is not accepted. Companies must also provide effective reporting mechanisms so that users can raise concerns about enforcement of the ToS if they feel a company is not fulfilling its duties.