The detection of breaks is done from land, but repairs are made through an agreement with the commercial companies, which pay into a fund that allows a ship to be on 24/7 standby to provide protection. That is paid for by the companies that put the cables in place.
My Lords, we of course recognise and share the Government’s and House’s concern about increased Russian military activity around these undersea cables. I was pleased that the Minister a couple of times referenced the risk assessments going on, but can he tell the House a little more and expand on his earlier answers about those risk assessments? How do they take place and how often do they occur?
The national risk assessment is undertaken regularly and led by the Cabinet Office. In this instance, DSIT is the department responsible for the risk to the cables overall, but it works in collaboration with the MoD, the Cabinet Office and others, particularly in relation to assessing risks other than those that I have outlined.
This is a critical question. The Royal Institute of Navigation has recently, in fact today, launched a paper on how to prepare for this. All operators of critical national infrastructure will be urged to look at it and to have a plan for what would happen in the event of GPS failure. There is a longer-term question about alternatives to space-based navigation, and there is active work going on in the UK on terrestrial approaches, including the use of quantum systems, to try to get a robust secondary approach to positioning, navigation and timing (PNT).
My Lords, now that over 70 nations have their own space agency, how will the Government pursue the widest and most effective possible international co-operation in support of Astra Carta’s aim,
“to care for the infinite wonders of the universe”?
There is a series of international collaborations in place. We are a member of the European Space Agency, and a large proportion of the UK Space Agency's £1.9 billion goes to the European Space Agency and our collaborators there. We also spend through the MoD and through UKRI. We are members of the UN bodies that deal with the question of a sustainable space sector and space environment. The space environment is increasingly important and needs attention, and we will continue to raise this question at the UN bodies.
That is an area that of course already comes under several other parts of regulation. It is also an area where there are massive changes in the way that these models perform. If one looks at GPT-4 versus GPT-3 (I know it is not facial recognition, but it gives an indication of the types of advance), it is about twice as good now as it was a year ago. These things are moving fast, and there is indeed a need to understand exactly where facial recognition technology is valid and where it has problems in recognition.
My Lords, the supply chain for the development of the more advanced AI systems is, in almost every case, highly global in nature. That means that it becomes quite straightforward for AI developers to offshore their activities from any jurisdiction whose regulations they might prefer not to follow. This being the case, do the Government agree that the regulation of AI development, as distinct from its use, is going to have to be global in nature? If the Government agree with that, how is it reflected in their plans for AI regulation going forward?
The noble Viscount makes an important point. This will be global; there is no question about it. Therefore, there needs to be some degree of interoperability between the regulations put in place in different regions. At the moment, as I said, of the two most advanced regions, the US is the biggest AI nation in the world and, we believe, is developing regulation along similar lines to ours, while the EU is of course the most regulated place in the world for AI. We need to work, in consultation over the coming months, to establish where the areas of interoperability will lie.
The convention sets out activities in the life cycle of AI systems; they should not infringe our values of human rights, democratic processes and the effectiveness of democratic institutions, or the rule of law. It applies to the public sector, and to the public sector when it uses the private sector, and there is an obligation to consider how private sector activities can be taken into account when this is implemented in a national framework.
My Lords, international bodies currently working on AI safety and regulation include the UN, UNESCO, the ITU, the G7, the G20 and the GPAI, among several others. Do the Government agree that although each of these groups has a very important role to play in creating safe and well-regulated AI globally, they will be successful only to the extent that they are effectively co-ordinated? If so, what steps are the Government taking to bring that about?
We are in active discussion with all those partners. As we consider an AI Act, we will work closely with partners in the US and elsewhere and apply it only to the limited number of companies at the very forefront of AI, to those models of tomorrow which carry particular risk and, again, where guard-rails have been asked for.